Metrics Layer - Metriql Implementation For Snowflake Data On Tableau

Author: Akshaya Jayakumar

This article discusses a new strategy of maintaining stand-alone metrics in an organization, where all downstream applications pull metrics from a single centralized framework called the Metrics Layer. Let’s get to know it better.

Why did the Metrics Layer gain the limelight?

Every organization uses different BI tools for its analytical purposes, and various dashboards are created from the ever-accumulating data. Traditionally, all the business KPI metrics are defined individually in each BI tool for the end visualization dashboards. For an organization without standard measures and centralized KPIs, this becomes a repetitive chore whenever new dashboards are built or existing dashboards are modified.

This led to the idea of an intermediate abstraction layer where key business measures like Revenue, Transactions, Sales, etc. are made centrally available. Any data-consumption tool can connect directly to this layer and make the best use of the defined measures, wisely following the principle Don’t Repeat Yourself (DRY).

Simple representation of the inclusion of Metriql in an organization
What is a Metrics Layer?

The abstraction layer referred to above is known as the Metrics Layer.

A Metrics Layer is a semantic layer where data teams can centrally define and store their business metrics, or Key Performance Indicators (KPIs), as code. In any typical architecture, this layer sits between the data storage and the data-consumption tools. This new addition to the tech stack changes the traditional belief of the data warehouse being the Single Source of Truth for your organization's metrics.

A few companies have already acted on this idea and developed metrics tools that can be used readily. Metriql, Transform, Supergrain, Trace, Lightdash, Looker, and Minerva are some of the tools on the market that bridge this gap. Here we have considered Metriql in our discussion.


Data warehouses contain voluminous data that is consumed by different people in an organization, for example:

  • Historical business and finance data used by business analysts.

  • Real-time production and management data used by product managers and clients.

  • Analytical data used by data analysts.

  • Custom data for automation used by data scientists.

Metriql helps all of these professionals consume the data they need, with standardized measures, from the same place.

Metriql as a centralized metric definition platform for the BI and data tools

Metriql, a Metrics Layer tool, works as an extension of dbt sources and dbt models[1]. If you already use dbt to transform your data into a clean, required format, Metriql is an excellent option for your Metrics Layer implementation.

Even otherwise, you can use Metriql by defining the data source in dbt and using it to create measures.

Metriql to connect Snowflake data warehouse and Tableau

Metriql supports many databases and provides a good number of integration options. In this article we walk through setting up Metriql for a Snowflake data warehouse and Tableau visualization.

The prerequisites for the Metriql installation (on Windows):

  • Docker

  • Windows 10 or later

  • Python 3.9

  • dbt Core or dbt Cloud (Enterprise version)

  • Database or BI tool

  • Data warehouse like BigQuery or Snowflake

  • Dataset

Follow the steps below to set up Metriql on your local system[2]:

  1. Install dbt

  2. Create dbt project

  3. Configure dbt profiles

  4. Run dbt list

  5. Install and run metriql

1. dbt Installation[3]:

a. Install dbt Core on your local system; it requires Python as a prerequisite. The optimal version is Python 3.9 (compatible with both dbt and Docker requirements).

b. Install the dbt Snowflake adapter for connecting to Snowflake as a data source.

pip install dbt-snowflake

2. DBT project creation:

Once the dbt Core and dbt-snowflake packages are installed, first create a dbt project from your CLI.

dbt init <project_name>

3. Configuring the Snowflake account in dbt:

To connect with a Snowflake account, we need to provide Snowflake connection details like:

  1. Type of Database [Snowflake]

  2. Snowflake account [locator]

  3. Authentication type [password, keypair, SSO]

  4. Authentication key [password, private key]

  5. Role name

  6. Warehouse name

  7. Database name

  8. Schema name

  9. Number of concurrent threads [1 or more]

Once the source account is set up, we can create a source.yml file in the models folder and start defining our metrics for our dataset. Models for those datasets can be created inside the models directory.

This is how our profiles.yml file should look after creating the project.
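As an illustration, a profiles.yml for a Snowflake connection might look like the sketch below. The profile name, account locator, and object names are placeholders; substitute your own values, and note that the profile name must match the one in dbt_project.yml.

```yaml
my_dbt_project:                        # must match the profile name in dbt_project.yml
  target: dev
  outputs:
    dev:
      type: snowflake
      account: xy12345.us-east-1       # Snowflake account locator (placeholder)
      user: analytics_user
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: ANALYST_ROLE
      warehouse: ANALYTICS_WH
      database: ANALYTICS_DB
      schema: PUBLIC
      threads: 4                       # number of concurrent threads
```

Keeping the password in an environment variable, as above, avoids committing credentials to version control.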

4. Run dbt list:

dbt list

This command lists all the resources (models, sources, and seeds) configured in your dbt project.

5. Install Metriql:

To install Metriql, we need to pull Metriql's Docker image, which requires Docker to be installed on the system.

The following hardware prerequisites are required to successfully run Docker on Windows 10 or Windows 11[4]:

  • 64-bit processor with Second Level Address Translation (SLAT)

  • 4GB system RAM

  • BIOS-level hardware virtualization support must be enabled in the BIOS settings.

Docker pull command:

docker pull buremba/metriql:latest

Sample image of docker pull
Set your project folder, dbt profiles path, and port number before running Docker. Use the following commands:

export DBT_PROJECT_DIR=${PWD}

export DBT_PROFILES_DIR=${HOME}/.dbt

export METRIQL_PORT=5656

NOTE: Use ‘set’ instead of ‘export’ for Windows machines.

Then start the Metriql container:

docker run -it -p "${METRIQL_PORT}:5656" -v "${DBT_PROJECT_DIR}:/root/app" -v "${DBT_PROFILES_DIR}:/root/.dbt" -e METRIQL_RUN_HOST=0.0.0.0 -e DBT_PROJECT_DIR=/root/app buremba/metriql serve

Once the Metriql image is pulled and the container is started, the Metriql UI is available on localhost at the configured port (http://localhost:5656).

Metriql UI
Creating Metriql Datasets [5]:

A dataset represents a table in your data warehouse. Metriql automatically creates datasets from your dbt models, sources, and seeds. A dataset has three important properties:

  • A dimension lets you drill down into the dataset; it can be used to filter query results. It can be either a column in your table or a SQL expression that processes a single row and returns a value.

  • measure is a field that uses an SQL aggregate function, such as count, sum, or average. Measures can be used to aggregate dimensions. For example, measures for a Sales model might include total items sold (a count), total sales price (a sum), and the average sales price (an average).

  • A relation defines a join between datasets. Metriql automatically generates the SQL join logic to correctly bring in fields from another dataset when the user analyzes a specific dataset.

Create datasets from dbt sources:

In case you want to create models that point to tables in your database, you can make use of dbt's source properties as follows:
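A hedged sketch of what such a source definition might look like, with illustrative schema, table, and column names; the measure and dimension definitions go under the meta.metriql property of the table:

```yaml
version: 2

sources:
  - name: sales_schema              # illustrative schema name
    database: ANALYTICS_DB
    tables:
      - name: orders
        meta:
          metriql:
            measures:
              total_orders:
                aggregation: count
              total_revenue:
                column: order_amount   # illustrative column
                aggregation: sum
            dimensions:
              status:
                column: order_status
```

Every BI or data tool connecting through Metriql then sees the same total_orders and total_revenue definitions.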

If you're analyzing time-series data, you can also define mappings so that Metriql understands your data better and lets you access specific features such as funnel and retention analysis:
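For instance, a mappings block on the same illustrative orders table might look like this (the column names are assumptions):

```yaml
      - name: orders
        meta:
          metriql:
            mappings:
              event_timestamp: created_at   # time column for funnel/retention features
              user_id: customer_id          # actor column for per-user analyses
```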

Direct Tableau integration:

Once the datasets are ready with the metric definitions, we can integrate them with Metriql-supported tools. As of now, Tableau integration is in the beta stage.

In the dataset tab, we can see all our Metriql sources, which can be downloaded as a .tds file and loaded into Tableau.

Tableau integration in Metriql UI
Integrating Metriql and Tableau using a Python Flask REST API:

To cut down on manual processes and facilitate the connection with Tableau, we made use of REST API connectivity. To simplify the implementation, we used Python Flask to connect with the Metriql data source.

The following steps were followed to achieve this:

  1. Initialize a Flask API[6]

  2. Import trino’s python package.

  3. Give the Metriql connection details.

  4. Run it on the local server; the API is now hosted on the local system.

  5. Make an ODBC connection with the REST URL.

  6. The ODBC connection set up for the metrics data shows up in Tableau's data sources.

  7. With that connection selected, we can sign in to load the data into Tableau.
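The steps above can be sketched as a minimal Flask app. This is an illustration only: it assumes Metriql's Trino-compatible endpoint is on localhost:5656 (the port configured earlier), and the route, user, and dataset names are hypothetical. The Trino client is imported lazily so the app can be created and tested without a live Metriql server.

```python
from flask import Flask, jsonify


def fetch_rows(sql):
    """Query Metriql through its Trino-compatible interface (requires `pip install trino`)."""
    from trino.dbapi import connect  # lazy import: only needed when actually querying
    conn = connect(host="localhost", port=5656, user="metriql", catalog="metriql")
    cur = conn.cursor()
    cur.execute(sql)
    columns = [d[0] for d in cur.description]
    # Return rows as a list of {column: value} dicts, ready for JSON serialization.
    return [dict(zip(columns, row)) for row in cur.fetchall()]


def create_app(query_fn=fetch_rows):
    """Build the Flask app; query_fn is injectable so it can be tested without a server."""
    app = Flask(__name__)

    @app.route("/metrics/<dataset>")
    def metrics(dataset):
        # Each Metriql dataset is exposed as a queryable table.
        return jsonify(query_fn(f'SELECT * FROM "{dataset}"'))

    return app


if __name__ == "__main__":
    create_app().run(host="0.0.0.0", port=5000)
```

Tableau's ODBC/web-data connection can then point at a URL such as http://localhost:5000/metrics/sales (dataset name assumed). Injecting query_fn also keeps the SQL-building and JSON-shaping logic testable in isolation.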

Preview of data loaded into Tableau:

Metriql, a metric definition tool, can help create metrics for your organization's data at zero implementation cost. If you are a dbt user, modeling your data effectively with Metriql is a piece of cake. All its services are developed in Kotlin, and it is open for public contribution. The centralized and standardized metrics obtained through Metriql make your data more organized and simplify your work when creating any new visualization.


For more details please refer:






