Author: Akshaya Jayakumar
This article discusses a new strategy of maintaining stand-alone metrics in an organization, where all downstream applications can pull metrics from a single centralized framework called the Metrics Layer. Let’s get to know it better.
Why did the Metrics Layer gain the limelight?
Every organization uses different BI tools for its analytical purposes, and various dashboards are built from its ever-growing data. Traditionally, all the business KPI metrics are defined individually in each BI tool for the end visualization dashboards. Without standard measures and centralized KPIs, redefining the same metrics becomes a repetitive chore every time a new dashboard is built or an existing one is modified.
The idea emerged to have an intermediate abstraction layer where key business measures like Revenue, Transactions, Sales, etc., are made centrally available. Any data consumption tool could then connect directly to this layer and make the best use of the defined measures, following the principle of Don’t Repeat Yourself (DRY).

What is a Metrics Layer?
The abstraction layer referred to above is known as the Metrics Layer.
A metrics layer is a semantic layer where data teams can centrally define and store their business metrics, or Key Performance Indicators (KPIs), as code. In a typical architecture, this layer sits between the data storage and the data-consuming tools. This addition to the tech stack shifts the Single Source of Truth for your organization’s metrics from the data warehouse to the metrics layer.
A few companies have already acted on this idea and developed metrics tools that can be used readily. Metriql, Transform, Supergrain, Trace, LightDash, Looker, Minerva, etc., are some of the tools available in the market to bridge this gap. Here we have considered Metriql in our discussion.
Metriql
Data warehouses contain voluminous data that is consumed by different people in an organization, for example:
Historical business and finance data, used by business analysts.
Real-time production and management data, used by product managers and clients.
Analytical data, used by data analysts.
Custom data for automation, used by data scientists.
Metriql helps all of these professionals consume the data they need, with standardized measures, from the same place.

Metriql, a metrics layer tool, works as an extension of DBT sources and DBT models [1]. If you are already using DBT to transform your data into a clean, required format, Metriql is an excellent option for your metrics layer implementation.
Otherwise, you can still use Metriql by defining the data source in DBT and using it to create measures.
Using Metriql to connect a Snowflake data warehouse and Tableau
Metriql supports many databases and provides a good number of integration options. In this article, we walk through setting up Metriql with a Snowflake data warehouse and Tableau for visualization.
The prerequisites for Metriql installation (on Windows):
Docker
Windows 10 or more
Python 3.9
DBT Core or DBT Cloud (enterprise version)
Database or BI tool
Data warehouses like BigQuery or Snowflake
Dataset
Follow the steps below to set up Metriql on your local system [2]:
Install DBT
Create DBT project
Configure DBT profiles
Run DBT list
Install and run metriql
1. DBT Installation [3]:
a. Install DBT Core on your local system, which requires Python as a prerequisite. Python 3.9 is the optimal version, as it is compatible with both the DBT and Docker requirements.
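For example, DBT Core can be installed with pip (shown unpinned here; pin a version compatible with your Python and Docker setup if needed):
pip install dbt-core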


b. Install the Snowflake DBT adapter package for connecting to Snowflake as a data source.
pip install dbt-snowflake

2. DBT project creation:
Once the DBT Core and dbt-snowflake packages are installed, create a DBT project from your CLI.
dbt init <project_name>
3. Configuring the Snowflake account in DBT:
To connect to a Snowflake account, we need to provide Snowflake connection details in the DBT profile, such as the following (a sample sketch follows this list):
Type of Database [Snowflake]
Snowflake account [locator]
Authentication type [password, key pair, SSO]
Authentication key [password, private key]
Role name
Warehouse name
Database name
Schema name
Number of concurrent threads [1 or more]
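A minimal profiles.yml sketch for such a Snowflake connection might look like the following (the profile name, account locator, credentials, and object names are placeholders, not values from this setup):

my_dbt_project:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: ab12345.us-east-1    # Snowflake account locator
      user: DBT_USER
      password: "<password>"        # or key-pair / SSO authentication
      role: TRANSFORMER
      warehouse: COMPUTE_WH
      database: ANALYTICS
      schema: PUBLIC
      threads: 4                    # number of concurrent threads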

Once the source account is set up, we can create a source.yml file in the models folder and start defining our metrics for our dataset, for example as sketched below. Models built on that dataset can then be created inside the models directory.
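A bare-bones source.yml sketch pointing at a Snowflake table could look like this (the source, database, schema, and table names are hypothetical):

version: 2

sources:
  - name: sales
    database: ANALYTICS
    schema: PUBLIC
    tables:
      - name: orders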

4. Run dbt list:
dbt list
The command lists all the resources (models, sources, seeds, and tests) defined in the configured project and is a quick way to verify that the project and profile are set up correctly.
5. Install Metriql:
To install Metriql, we need to pull Metriql’s Docker image, for which we need Docker installed on our system.
The following hardware prerequisites are required to successfully run Docker on Windows 10 or Windows 11 [4]:
64-bit processor with Second Level Address Translation (SLAT)
4GB system RAM
BIOS-level hardware virtualization support must be enabled in the BIOS settings.
Docker pull command:
docker pull buremba/metriql:latest

Set your project folder, DBT profile path, and port number before running Docker, using the following commands:
export DBT_PROJECT_DIR=${PWD}
export DBT_PROFILES_DIR=${HOME}/.dbt
export METRIQL_PORT=5656
NOTE: Use ‘set’ instead of ‘export’ for Windows machines.
docker run -it -p "${METRIQL_PORT}:5656" -v "${DBT_PROJECT_DIR}:/root/app" -v "${DBT_PROFILES_DIR}:/root/.dbt" -e METRIQL_RUN_HOST=0.0.0.0 -e DBT_PROJECT_DIR=/root/app buremba/metriql serve
Once the Metriql image is pulled and the Docker container is started, the Metriql UI is available on localhost at 127.0.0.1:5656.

Creating Metriql Datasets [5]:
A dataset represents a table (or a model built on top of one) in your data warehouse. Metriql automatically creates datasets from your DBT models, sources, and seeds. A dataset has three important properties, illustrated in the sketch after this list:
dimension lets you drill down into the dataset; it can be used to filter query results. It can be either a column in your table or an SQL expression that processes a single row and returns a value.
measure is a field that uses an SQL aggregate function, such as count, sum, or average. Measures can be used to aggregate dimensions. For example, measures for a Sales model might include total items sold (a count), total sales price (a sum), and the average sales price (an average).
relation defines a join between datasets. Metriql automatically generates the SQL join logic to bring in all the fields from another dataset correctly when the user analyzes a specific dataset.
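A sketch of how dimensions and measures might be declared on a DBT model using Metriql’s meta block (the model, column, and metric names here are hypothetical, and relations are declared similarly; see [5] for the exact schema):

models:
  - name: sales
    meta:
      metriql:
        dimensions:
          country:
            column: country_code      # drill down / filter by country
        measures:
          total_items_sold:
            aggregation: count        # count of rows
          total_sales_price:
            column: price
            aggregation: sum          # sum of the price column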
Create datasets from DBT sources:
If you want to create datasets that point directly to tables in your database, you can make use of DBT's source properties, as in the sketch below:
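This is a hypothetical example mirroring the model-based definition above; the source and metric names are assumptions:

version: 2

sources:
  - name: sales
    database: ANALYTICS
    schema: PUBLIC
    tables:
      - name: orders
        meta:
          metriql:
            measures:
              total_orders:
                aggregation: count
              total_revenue:
                column: amount
                aggregation: sum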

If you're analyzing time-series data, you can also define mappings so that Metriql understands your data better and lets you access specific features such as funnel and retention analysis, as sketched below:
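A sketch of such mappings on a DBT model, pointing Metriql at the event timestamp and user identifier columns (the column names are hypothetical; see [5] for the supported mapping keys):

models:
  - name: events
    meta:
      metriql:
        mappings:
          event_timestamp: occurred_at   # column holding the event time
          user_id: customer_id           # column identifying the user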

Direct Tableau integration:
Once the datasets are ready with the metric definitions, we can integrate them with Metriql-supported tools. As of now, the Tableau integration is in the beta stage.
In the dataset tab, we can see all our Metriql sources, which can be downloaded as .tds files and loaded into Tableau.

Integrating Metriql and Tableau using a Python Flask REST API:
To cut down on manual steps and to facilitate the connection with Tableau, we made use of REST API connectivity. To simplify the implementation, we used Python Flask to connect to the Metriql data source.
The following steps were followed to achieve this:
1. Initialize a Flask API [6].
2. Import trino's Python package.
3. Provide the Metriql connection details.
4. Run it on the local server; the API is now hosted on our local system (a minimal sketch follows this list).
5. Make an ODBC connection with the REST URL.
6. The ODBC connection set up for the metrics data shows up in Tableau's data sources.
7. With that connection selected, we can sign in to load the data into Tableau.
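The sketch below shows what such a Flask service could look like, using the trino Python client to query the Metriql server started earlier on port 5656. The endpoint path, dataset and column names, and connection parameters (user, catalog, schema) are assumptions to illustrate the idea, not the exact implementation used here:

# Hypothetical sketch: expose Metriql data over a local REST endpoint.
from flask import Flask, jsonify
from trino.dbapi import connect   # pip install flask trino

app = Flask(__name__)

def metriql_connection():
    # Metriql serves a Trino-compatible interface on the port configured earlier (5656).
    # The user/catalog/schema values below are assumptions; match them to your deployment.
    return connect(
        host="127.0.0.1",
        port=5656,
        user="metriql",
        catalog="metriql",
        schema="public",
    )

@app.route("/metrics/sales")
def sales_metrics():
    conn = metriql_connection()
    cur = conn.cursor()
    # 'sales', 'country', and 'total_revenue' are hypothetical dataset/field names.
    cur.execute("SELECT country, total_revenue FROM sales GROUP BY 1")
    columns = [col[0] for col in cur.description]
    return jsonify([dict(zip(columns, row)) for row in cur.fetchall()])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)   # local REST URL for the ODBC/Tableau step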

Conclusion:
Metriql, a metric definition tool, can help you create metrics for your organization’s data at zero licensing cost. If you are a DBT user, modeling your data effectively with Metriql is a piece of cake. All its services are developed in Kotlin, and it is open for public contribution. The centralized and standardized metrics obtained through Metriql make your data more organized and simplify the work of creating any new visualization.
References:
For more details, please refer to:
[1]. https://metriql.com/introduction/intro
[2]. https://metriql.com/tutorial/for-starters
[3]. https://docs.getdbt.com/dbt-cli/install/overview
[4]. https://docs.docker.com/desktop/windows/install/
[5]. https://metriql.com/introduction/creating-datasets
[6]. https://www.analyticsvidhya.com/blog/2022/01/rest-api-with-python-and-flask/