Author: Gaurav Bhatt
Why is DevOps needed for Snowflake?
DevOps plays an important role in automating the development process, smoothing releases, and easing software maintenance. It is all the more important for managing changes in a Snowflake environment, which deals with the new oil of our age: data.
In this blog, we will walk through the step-by-step process of automating release management for Snowflake using GitHub Actions, managing database objects with a lightweight Python-based tool named schemachange.
Prerequisites
Github Account ( Create Github Account )
Snowflake Account ( Create Snowflake trial account )
IDE Installed on your system ( We are using Visual Studio Code )
How does schemachange work?
It is a Python-based tool that helps manage Snowflake objects. Combined with a CI/CD tool and a version control system, it lets us deploy database changes through a defined pipeline.
Types of Scripts in schemachange
Versioned Scripts
These scripts are executed only if their version is greater than that of the scripts already deployed.

Fig Source: schemachange github page
Once we have defined a version string convention, we need to use the same convention for all versioned scripts. Below are a few valid version strings:
1.1
1_1
1.2.3
1_2_3
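Whatever convention you pick, schemachange compares the numeric parts of the version string to decide which scripts are "newer" than what has already been deployed. A minimal Python sketch of that idea (illustrative only, not schemachange's actual implementation):

```python
import re

def version_key(version: str):
    """Split a version string like '1.2.3' or '1_2_3' into a tuple of ints."""
    return tuple(int(part) for part in re.split(r"[._]", version))

# A script is deployed only if its version sorts above the last deployed one.
deployed = version_key("1.1")
candidates = ["1.1", "1.2.3", "1_1", "2.0"]
to_deploy = [v for v in candidates if version_key(v) > deployed]
print(to_deploy)  # → ['1.2.3', '2.0']
```

Note that under this comparison "1.1" and "1_1" are the same version, which is why mixing conventions in one repository is a bad idea.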
Repeatable Scripts
These scripts are executed on every run in which the file has changed.
Scripts that need to be applied in their entirety, such as stored procedures, are good candidates for repeatable scripts.
It should be noted that repeatable scripts are executed in the order of their description, and they are applied after all versioned scripts.

Fig Source: schemachange github page
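Under the hood, schemachange detects that a repeatable script "changed" by comparing a checksum of the file contents with the one recorded for the previous run. A simplified Python sketch of that mechanism (assumed behavior for illustration, not the tool's actual code):

```python
import hashlib

def checksum(script_text: str) -> str:
    """Hash the script body; any edit produces a different digest."""
    return hashlib.sha256(script_text.encode()).hexdigest()

# Checksum recorded after the last deployment of R__one_repeat.sql.
last_run = checksum("create or replace function pi_udf() returns float as '3.14::FLOAT';")

# Unchanged file → same checksum → the script can be skipped.
unchanged = checksum("create or replace function pi_udf() returns float as '3.14::FLOAT';")
print(unchanged == last_run)  # → True

# Edited file → different checksum → the script is re-applied.
edited = checksum("create or replace function pi_udf() returns float as '3.1415::FLOAT';")
print(edited == last_run)  # → False
```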
Always Scripts
These scripts are executed on every run of schemachange.
They are helpful for environment-related activities.
An important point to note: always scripts are executed last.
Example: A__description_name.sql
Change History Table
All activities performed during script execution are recorded in a table, by default METADATA.SCHEMACHANGE.CHANGE_HISTORY.
The structure of the CHANGE_HISTORY table can be checked here: Change History Table
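To give a feel for what gets recorded, here is the rough shape of one CHANGE_HISTORY row as a Python dict (column set per the schemachange documentation; the values are made up for illustration):

```python
# One row of the CHANGE_HISTORY table after deploying a versioned script.
row = {
    "VERSION": "1.1.1",                    # only populated for versioned (V) scripts
    "DESCRIPTION": "first file",           # taken from the part after the double underscore
    "SCRIPT": "V1.1.1__first_file.sql",    # the file that was executed
    "SCRIPT_TYPE": "V",                    # V, R, or A
    "CHECKSUM": None,                      # populated for repeatable (R) scripts
    "EXECUTION_TIME": 840,                 # how long the script took to run
    "STATUS": "Success",
    "INSTALLED_BY": "DEPLOY_USER",         # hypothetical deployment user
    "INSTALLED_ON": "2023-01-01 00:00:00",
}
print(row["SCRIPT_TYPE"])  # → V
```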
Required Parameters to Run schemachange
To run schemachange, the parameters below must be passed (we will see how to pass them via the GitHub Actions workflow in the next section):
snowflake-account
snowflake-user
snowflake-role
snowflake-warehouse
Implement schemachange
Cloning Repository
Once we have the repository created, clone it into VS Code: from the left menu open “Source Control” or press “Ctrl+Shift+G”.
Paste the git clone URL and clone the repository.

Creating SQL files
Create a folder named root for this demo. Inside the root directory, create the files below and add the following code to test schemachange:
V1.1.1__first_file.sql
use database demo_db;
use schema public;
create table hello_world
(
first_name varchar
,last_name varchar
);
R__one_repeat.sql
use database demo_db;
use schema public;
create or replace function pi_udf()
returns float
as '3.14::FLOAT';
A__one_always.sql
use database demo_db;
use schema public;
insert into hello_world(first_name,last_name) values('first','last');
Note: Make sure the database and schema are created before the above scripts are executed.
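Putting the three file types together: schemachange picks up scripts by their filename prefix and runs versioned files first, then repeatable files, then always files. A rough Python sketch of that discovery and ordering logic (illustrative only):

```python
import re

# Prefix conventions: V<version>__, R__, A__ (double underscore before the description).
PATTERNS = {
    "versioned": re.compile(r"^V(.+?)__.+\.sql$"),
    "repeatable": re.compile(r"^R__.+\.sql$"),
    "always": re.compile(r"^A__.+\.sql$"),
}

def script_type(filename: str) -> str:
    """Classify a script by its schemachange filename prefix."""
    for kind, pattern in PATTERNS.items():
        if pattern.match(filename):
            return kind
    raise ValueError(f"not a schemachange script: {filename}")

# Versioned scripts run first, then repeatable, then always.
RUN_ORDER = {"versioned": 0, "repeatable": 1, "always": 2}

files = ["A__one_always.sql", "V1.1.1__first_file.sql", "R__one_repeat.sql"]
for f in sorted(files, key=lambda name: RUN_ORDER[script_type(name)]):
    print(f)
# → V1.1.1__first_file.sql, then R__one_repeat.sql, then A__one_always.sql
```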
Configure Actions Secrets in GitHub
To securely store the parameters used by schemachange, create secrets in GitHub:
Navigate to repository and click Settings
From left menu, select “Secrets and variables” → Actions
Click “New repository secret” and create the secrets below with the required values:

GitHub Actions Workflow
In this step, we will create a workflow to run schemachange and deploy our changes to Snowflake.
In the GitHub repository, create the directory:
.github/workflows
Inside the workflows directory, create the file below and update it with the following code:
Devops-workflow.yml
name: devops-workflow

# Controls when the action will run.
on:
  push:
    branches:
      - main
    paths:
      - 'root/**'

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

jobs:
  deploy-snowflake-changes-job:
    runs-on: ubuntu-latest
    steps:
      # Checks out your repository under $GITHUB_WORKSPACE, so your job can access it
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Use Python 3.8.x
        uses: actions/setup-python@v2.2.1
        with:
          python-version: 3.8.x

      - name: Run schemachange
        env:
          SF_ACCOUNT: ${{ secrets.SF_ACCOUNT }}
          SF_USERNAME: ${{ secrets.SF_USERNAME }}
          SF_ROLE: ${{ secrets.SF_ROLE }}
          SF_WAREHOUSE: ${{ secrets.SF_WAREHOUSE }}
          SF_DATABASE: ${{ secrets.SF_DATABASE }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SF_PASSWORD }}
        run: |
          echo "GITHUB_WORKSPACE: $GITHUB_WORKSPACE"
          python --version
          echo "Step 1: Installing schemachange"
          pip install schemachange
          echo "Step 2: Running schemachange"
          schemachange -f $GITHUB_WORKSPACE/root -a $SF_ACCOUNT -u $SF_USERNAME -r $SF_ROLE -w $SF_WAREHOUSE -d $SF_DATABASE -c $SF_DATABASE.SCHEMACHANGE.CHANGE_HISTORY --create-change-history-table
Commit the file to GitHub.
Before executing the workflow, let’s look at how schemachange is executed within it:
- name: Run schemachange
  env:
    SF_ACCOUNT: ${{ secrets.SF_ACCOUNT }}
    SF_USERNAME: ${{ secrets.SF_USERNAME }}
    SF_ROLE: ${{ secrets.SF_ROLE }}
    SF_WAREHOUSE: ${{ secrets.SF_WAREHOUSE }}
    SF_DATABASE: ${{ secrets.SF_DATABASE }}
    SNOWFLAKE_PASSWORD: ${{ secrets.SF_PASSWORD }}
In this step, we fetch the secrets configured in GitHub and set them as environment variables. Note that schemachange itself reads the password from the SNOWFLAKE_PASSWORD environment variable, which is why it is not passed as a command-line argument.
run: |
  echo "GITHUB_WORKSPACE: $GITHUB_WORKSPACE"
  python --version
  echo "Step 1: Installing schemachange"
  pip install schemachange
  echo "Step 2: Running schemachange"
  schemachange -f $GITHUB_WORKSPACE/root -a $SF_ACCOUNT -u $SF_USERNAME -r $SF_ROLE -w $SF_WAREHOUSE -d $SF_DATABASE -c $SF_DATABASE.SCHEMACHANGE.CHANGE_HISTORY --create-change-history-table
This is the command that executes schemachange with the input parameters set in the environment. To explicitly create the CHANGE_HISTORY table in our database, we pass the --create-change-history-table argument.
Execute Github Workflow
Let’s run the workflow we created and deploy the scripts to Snowflake.
Log in to the GitHub repository and open the “Actions” tab.
Click “devops-workflow” in the Actions panel on the left and click “Run workflow”.
Select the “main” branch and run the workflow.

Wait for the workflow to complete.


Once the workflow has executed successfully, let's validate the deployed scripts in Snowflake.
We can check that the CHANGE_HISTORY table was created at the desired location and that all scripts were executed.


For the V1.1.1__first_file.sql script, the HELLO_WORLD table is created.

For the R__one_repeat.sql script, the pi_udf function is created.

For the A__one_always.sql script, we can check the inserted records in the HELLO_WORLD table.

Give it a try!
Hope this blog helped you get started with schemachange using GitHub Actions. Try running the workflow with the changes below and observe what is reflected in Snowflake:
What do you observe in the CHANGE_HISTORY table, if you run workflow again?
What happens if you update R__one_repeat.sql and run workflow?
I updated the V1.1.1__first_file.sql content, but I don’t see the changes in Snowflake. What could be the reason?
References
schemachange GitHub: https://github.com/Snowflake-Labs/schemachange
Quickstart: https://quickstarts.snowflake.com/guide/devops_dcm_schemachange_github/#0