Author: Manohar Perabathula
Overview
When we want to load data from a cloud provider such as AWS, Azure, or GCP, we usually create an external stage and connect it to the provider by supplying credentials such as a secret key or access token. Data is loaded into the stage and then copied from there into a Snowflake table. This blog describes how to establish a secure connection between a Snowflake external stage and an Azure container using a storage integration, which removes the need to pass cloud provider credentials such as access tokens or secret keys. In other words, we will load data into a Snowflake external stage from a cloud provider without providing credentials.
Why Storage Integration?
A storage integration is a configurable object that lives inside Snowflake. It stores a generated identity and access management (IAM) entity for your external storage, along with an optional set of allowed or blocked storage locations. A single storage integration can support multiple external stages. Once the storage integration is configured, we can use it to create external stages without having to supply a SAS token every time, which also reduces the risk of data exfiltration.
Demo
1. First, log in to your Snowflake account and run the query below to create a storage integration.
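A minimal sketch of the storage integration DDL, assuming a hypothetical integration name azure_int and placeholder values for the tenant ID, storage account, and container:

    -- Create a storage integration that delegates authentication to Azure
    CREATE OR REPLACE STORAGE INTEGRATION azure_int
      TYPE = EXTERNAL_STAGE
      STORAGE_PROVIDER = 'AZURE'
      ENABLED = TRUE
      AZURE_TENANT_ID = '<your-azure-tenant-id>'
      STORAGE_ALLOWED_LOCATIONS = ('azure://<storage-account>.blob.core.windows.net/<container>/');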

2. To integrate the Snowflake account with Azure, we need the tenant ID. This ID is available in Azure Active Directory.

3. To get a container name, create a container in your Azure storage account and upload a feed file to it.

4. To view the properties of the storage integration, run the query below in your Snowflake account.
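For example, assuming the integration created above is named azure_int, the DESCRIBE command returns its properties, including the AZURE_CONSENT_URL (open it in a browser to grant Snowflake consent) and the AZURE_MULTI_TENANT_APP_NAME used later for the role assignment:

    -- Inspect the storage integration; note AZURE_CONSENT_URL and AZURE_MULTI_TENANT_APP_NAME
    DESC STORAGE INTEGRATION azure_int;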

5. To create a file format and an external stage in Snowflake, run the queries shown below.
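A sketch of the file format and stage definitions, assuming a CSV feed file and the hypothetical names my_csv_format and azure_stage (the URL placeholders are the storage account and container from step 3):

    -- File format describing the uploaded feed file
    CREATE OR REPLACE FILE FORMAT my_csv_format
      TYPE = 'CSV'
      FIELD_DELIMITER = ','
      SKIP_HEADER = 1;

    -- External stage that authenticates through the storage integration (no SAS token)
    CREATE OR REPLACE STAGE azure_stage
      STORAGE_INTEGRATION = azure_int
      URL = 'azure://<storage-account>.blob.core.windows.net/<container>/'
      FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');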

6. Next, we need to assign the role that grants Snowflake access to the container. For this, open the storage account in the Azure portal, click on Access Control (IAM), and then click on "Add role assignment."

7. Then, select the role that needs to be assigned, such as Storage Blob Data Reader (read-only) or Storage Blob Data Contributor (read and write).

8. After selecting the required role, click on "Select members" and search for the AZURE_MULTI_TENANT_APP_NAME value returned by the DESC STORAGE INTEGRATION command in step 4.

9. Finally, the Snowflake integration with Azure is established. Now create a table and execute the COPY INTO command.
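A sketch of the table creation and load, assuming a hypothetical table customer_feed whose columns match the sample CSV file, along with the stage and file format created in step 5:

    -- Target table; the columns are illustrative and should match your feed file
    CREATE OR REPLACE TABLE customer_feed (
      id   NUMBER,
      name STRING,
      city STRING
    );

    -- Load the staged file(s) into the table through the external stage
    COPY INTO customer_feed
      FROM @azure_stage
      FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');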

10. Then, in the Snowflake account, run the query below to view the data loaded into the table.
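For example, with the hypothetical table from step 9:

    -- Verify that the staged data was copied into the table
    SELECT * FROM customer_feed;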

Conclusion
In summary, storage integration in Snowflake is key to secure access to data files stored in Microsoft Azure containers. It avoids the need to supply a SAS token every time data is loaded and reduces the risk of data exfiltration.
References
https://docs.snowflake.com/en/user-guide/data-load-azure-config.html