SnapLogic ETL Tool

Updated: Mar 16

Author: Navanil Roy


Introduction

In today’s world, the enterprise that leverages the value of its data the fastest leads. SnapLogic ETL lets us visually source and transform data at scale inside a data lake. With the same drag-and-drop functionality, SnapLogic ELT lets us quickly load data into cloud data warehouses and then transform it in place. SnapLogic’s low-code/no-code platform makes it easy for data teams to quickly hydrate data lakes, create data pipelines, and empower business teams with the insights needed for decisions that improve customer experiences. SnapLogic lets data teams work quickly so they can focus on the details of the data.


Features in SnapLogic
Multi-point integration

  • Simple and complex orchestration use cases

  • Java SDK for rapid custom cloud connector development

Self-service user experience

  • Artificial intelligence / machine learning

  • Advanced monitoring

Modern architecture

  • Parity between on-premises and cloud deployments

  • Elastic scale-out architecture

  • Multi-tenancy

  • Event-based and real-time integration

  • Batch or scheduled integration

  • Big data integration

  • IoT integration

Enterprise-ready

  • Bulk data movement

  • High availability

  • End-to-end audit trails

  • Messaging

  • Self-upgrading on-premises and cloud software

  • Standards-based security


Sign Up

Sign up from Snowflake Partner Connect. Alternatively, we can sign up directly.



Building a Pipeline in SnapLogic

After we log in, we will see a screen similar to the one below, with an empty canvas.



The left pane contains all the Snaps that are supported by default, and we can search for the required connector. For example, searching for MySQL lists all the available MySQL connectors.



Let us jump right into building our first pipeline.


The objective is to fetch data from MySQL and load it into Snowflake.


1. Search for the MySQL Select Snap and drag it onto the canvas as shown above. Edit the Snap and click on Account to configure your account connection.

2. Search for Snowflake Bulk Load or Snowflake Insert, depending on the requirement, and configure the account so that SnapLogic can connect to Snowflake.

3. Search for the Mapper Snap, which is required to map the columns of the source and the sink. Below you can see what a typical Mapper looks like, where Expression is the source column and Target is the sink column. You can also apply transformations to the source columns before loading them into the sink, if required. (A conceptual sketch of the whole flow follows these steps.)
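To make the data flow explicit, the three Snaps above behave conceptually like the minimal Python sketch below. SnapLogic itself is visual and needs none of this code; all connection parameters, table names, and column mappings here are hypothetical placeholders, not values from this walkthrough.

```python
# Conceptual sketch of the MySQL Select -> Mapper -> Snowflake flow.
# Hosts, credentials, tables, and columns are hypothetical placeholders.
import mysql.connector                      # pip install mysql-connector-python
import snowflake.connector                  # pip install snowflake-connector-python

# 1. MySQL Select: fetch rows from the source table.
src = mysql.connector.connect(
    host="mysql.example.com", user="etl_user", password="...", database="sales"
)
cur = src.cursor(dictionary=True)
cur.execute("SELECT id, first_name, order_total FROM orders")
rows = cur.fetchall()

# 2. Mapper: map each source column (Expression) to a sink column (Target),
#    applying an optional transformation along the way.
mapped = [
    {
        "ID": r["id"],
        "FIRST_NAME": r["first_name"].upper(),   # example transformation
        "ORDER_TOTAL": r["order_total"],
    }
    for r in rows
]

# 3. Snowflake Insert: load the mapped rows into the target table.
sink = snowflake.connector.connect(
    account="myaccount", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
sink.cursor().executemany(
    "INSERT INTO ORDERS (ID, FIRST_NAME, ORDER_TOTAL) "
    "VALUES (%(ID)s, %(FIRST_NAME)s, %(ORDER_TOTAL)s)",
    mapped,
)
src.close()
sink.close()
```

Note that the Snowflake Bulk Load Snap stages the data and issues a COPY INTO behind the scenes, which is far more efficient for large volumes than the row-by-row INSERT used above to keep the sketch short.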



After completion, the pipeline will look like the one below.



Configuring Error Notifications

Another important feature a pipeline should have is the ability to send out notifications. In this scenario we will configure them using Amazon SNS, but there are many other ways to do this.


Getting an email when a particular Snap fails is easy. Click on the component, select Views from the top pane, and apply the configuration below.



After that, the Snap will have an error output, as shown below.



Finally, link the error output to the SNS Snap as shown below.
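For context, what the SNS Snap does when an error document arrives is essentially a publish call against an SNS topic; email subscribers to that topic then receive the alert. A minimal boto3 sketch of that publish step, assuming a hypothetical topic ARN and message:

```python
import boto3

# Publish an error notification to an SNS topic; email addresses subscribed
# to the topic receive the alert. The ARN and message text are hypothetical.
sns = boto3.client("sns", region_name="us-east-1")
sns.publish(
    TopicArn="arn:aws:sns:us-east-1:123456789012:pipeline-alerts",
    Subject="SnapLogic pipeline error",
    Message="MySQL-to-Snowflake pipeline failed; see the error view output for details.",
)
```

Creating the topic and subscribing the email addresses is a one-time setup on the AWS side; the pipeline then only needs the topic details and credentials.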


Conclusion

To wrap up, we introduced SnapLogic and went over some of the most important features it comes with.


We then built our first pipeline and successfully loaded data from the OLTP system (MySQL) into the OLAP system (Snowflake). We also configured email notifications to receive alerts from the pipeline.


As we can see, SnapLogic has a rich set of features that help us implement ETL use cases faster and more easily. It has all the essentials we need, along with a few capabilities that are exclusive to the platform.

