Author: Fakiha Amber

Introduction
The pipeline runs in a Dockerized Jenkins instance and fetches SQL files committed to the Bitbucket repository. Once a SQL file is committed, a Jenkins job runs to execute the changes. schemachange lets Jenkins deploy the changes defined in the loaded SQL file to the configured Snowflake account.

Prerequisites:
1. Snowflake
A Snowflake account, a database (DEMO_DB), and a user created with the appropriate permissions. This user needs permission to create objects in the database (DEMO_DB); an example setup script is shown after this list.
2. Bitbucket
A Bitbucket account.
A repository on Bitbucket. If the repository has not yet been created, create one. (For the time being, ignore the README, .gitignore, and license.)
3. Any IDE of your choice
Recommended: VS Code with Git integration.
Clone the project repository to your computer using VS Code, using the HTTPS link copied from the repository.

4. Docker
Download Docker Desktop to run Jenkins as a container and install it by following the Docker setup instructions.
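If the Snowflake database and user do not exist yet, they can be created up front. The following is a minimal sketch in Snowflake SQL, assuming a role with account-level privileges (such as ACCOUNTADMIN); the role, user, warehouse names, and password below are hypothetical examples to adjust for your environment.
-- Minimal setup sketch; names and password are examples, adjust to your environment.
CREATE DATABASE IF NOT EXISTS DEMO_DB;
CREATE ROLE IF NOT EXISTS DEMO_ROLE;
GRANT ALL PRIVILEGES ON DATABASE DEMO_DB TO ROLE DEMO_ROLE;
CREATE USER IF NOT EXISTS DEMO_USER PASSWORD = 'replace-with-a-strong-password' DEFAULT_ROLE = DEMO_ROLE;
GRANT ROLE DEMO_ROLE TO USER DEMO_USER;
GRANT USAGE ON WAREHOUSE DEMO_WH TO ROLE DEMO_ROLE;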
STEPS
Create Your First Database Migration
Open up your cloned repository in your preferred IDE and create a folder (e.g., migrations).
In the new folder, create a script named "V1.1.1__initial_objects.sql" (make sure there are two underscores after the version number) with the following contents:
CREATE SCHEMA DEMO;
CREATE TABLE EMPLOYEE
(
FIRST_NAME VARCHAR,
LAST_NAME VARCHAR
);
Commit the new script and push the changes to your Bitbucket repository.
git add --all
git commit -m "<commit message>"
git push -u origin main
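Later changes follow the same versioned naming convention; schemachange applies only scripts whose version number is higher than the latest one recorded in its change history table. For example, a hypothetical follow-up migration could be committed as "V1.1.2__add_email_column.sql" with contents such as:
ALTER TABLE DEMO.EMPLOYEE ADD COLUMN EMAIL VARCHAR;
Push the new file the same way and re-run the pipeline; scripts that have already been applied are skipped.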
Build and Run a Docker Image
To deploy into Jenkins, create a custom Docker image and run it locally. First, create a file named "Dockerfile" in the root of your Bitbucket repository with the following contents and commit it.
FROM jenkins/jenkins:lts
# Install required applications
USER root
RUN apt-get update
RUN apt-get install -y docker.io
# Come back to the regular jenkins user
USER jenkins
To create the custom Docker image, run the following command from the shell:
docker build -t jenkins .
docker run -p 8080:8080 -v /var/run/docker.sock:/var/run/docker.sock --name jenkins jenkins
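With the command above, Jenkins configuration lives only inside the container. If you want it to survive the container being removed, a named volume can be mounted for the Jenkins home directory (a variant of the same command, assuming the image's default home of /var/jenkins_home; the volume name jenkins_home is just an example):
docker run -p 8080:8080 -v jenkins_home:/var/jenkins_home -v /var/run/docker.sock:/var/run/docker.sock --name jenkins jenkins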
The initial admin password for your Jenkins installation will be displayed in the Docker output. Copy it for later usage.
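If the password scrolls out of view, it can also be read from the running container:
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword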
Once the Jenkins container is running, give Jenkins access to the Docker engine with the following command:
docker exec -it -u root jenkins bash -c 'chmod 666 /var/run/docker.sock'
Configure Jenkins
To access the Jenkins UI, open localhost:8080 in the web browser of your choice.
On the first screen, titled "Unlock Jenkins," enter the admin password saved in the previous step and click Continue.
Now you need to install the plugins: you can either install the suggested plugins or select specific plugins to install, as shown below.

Once the plugins are installed, you will be taken to the "Create First Admin User" page.
Enter "admin" for the "Username" and "Full name," enter a password and enter a valid email address. Confirm that your screen looks like the image below and click on the Save and Continue buttons.

In the "Getting Started" screen, leave the "Instance Configuration" Jenkins URL as "http://localhost:8080 " and click on the Save and Finish button; later, click on the "Start using Jenkins" button.
Now you are ready to use Jenkins.
If directed to the login page (like below), fill in the credentials to sign in.

Now, we need to install the Docker Pipeline plugin in Jenkins. It will allow Jenkins Pipeline Projects to build and test using Docker images.
Click on the "Manage Jenkins" in the left navigation bar and then on the "Manage Plugins" under the "System Configuration" tab, as shown below.

From the "Plugin Manager," click on the "Available" tab and search for "docker pipeline." Check the box under the "Install" column next to the plugin and then click on "Install without restart."

On the results page, you should see a set of green check marks with a "Success" status. Jenkins is now set up and ready to use.
Creating a Jenkins Pipeline
Creating the Jenkinsfile
In this step, we will create the Jenkins Pipeline. A Jenkins Pipeline orchestrates long-running activities that can span multiple build agents.
A Jenkins Pipeline is defined in a file known as a "Jenkinsfile": a text file that contains the definition of the Pipeline and is committed into source control.
For more details on the "Jenkinsfile," refer to the Using a Jenkinsfile documentation.
First, create a Jenkinsfile and commit it to the repository: create a file named "Jenkinsfile" in the root of your Bitbucket repository using the IDE with the following contents and commit it.
pipeline {
    agent {
        docker {
            image "python:3.8"
            args '--user 0:0'
        }
    }
    stages {
        stage('Run schemachange') {
            steps {
                sh "pip install schemachange --upgrade"
                sh "schemachange -f migrations -a ${SF_ACCOUNT} -u ${SF_USERNAME} -r ${SF_ROLE} -w ${SF_WAREHOUSE} -d ${SF_DATABASE} -c ${SF_DATABASE}.SCHEMACHANGE.CHANGE_HISTORY --create-change-history-table"
            }
        }
    }
}
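One thing the Jenkinsfile above does not show is how schemachange authenticates: it reads the Snowflake password from the SNOWFLAKE_PASSWORD environment variable. A minimal sketch of one way to supply it, assuming the password is stored as a Jenkins "Secret text" credential with the hypothetical ID snowflake-password, is to add an environment block inside the pipeline block (at the same level as agent and stages):
environment {
    // Hypothetical credential ID; exposes the secret to schemachange as SNOWFLAKE_PASSWORD
    SNOWFLAKE_PASSWORD = credentials('snowflake-password')
}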
After committing Jenkinsfile, the repository should look like this:

Creating the Pipeline
Now that the pipeline definition has been created in the Jenkinsfile and pushed to our repository, we are ready to build the Pipeline in Jenkins.
Click on "New Item" in the left navigation bar of the main Jenkins dashboard.
For the item name, enter <snowflake-pipeline>, click on the Pipeline item type to select it, and click OK.

We have now created a new job in Jenkins. First, we need to tell it where the Jenkinsfile definition lives. On the job configuration page, click on the Pipeline tab and change the "Definition" to "Pipeline script from SCM." Under the SCM section, select Git and paste the Bitbucket repository URL into the Repository URL field. Click on the Advanced button and then enter these values for the parameters below:
Name: origin
Refspec: +refs/pull/*:refs/remotes/origin/pr/*
Branches to build: leave blank
Repository browser: (Auto)
Additional Behaviors: Wipe out repository & force clone
Script Path: Jenkinsfile
Uncheck "Lightweight checkout."
NOTE: Do not miss the "Additional Behaviors" setting; click on the "Add" button under "Additional Behaviors" and then select "Wipe out repository & force clone".
This step ensures that your Jenkins pipeline will always work with the latest version of your repository.

Click on the save button to save changes.
Tip - If you are using a private Bitbucket repository, enter the credentials for your Bitbucket repository. If not, leave the credentials blank.
Now add the Pipeline parameters. Parameters allow Jenkins to store the values/variables used in CI/CD pipelines and make them available inside the Pipeline. Let us create a parameter for each of the arguments used by schemachange.
Open the snowflake-pipeline job and click on Configure in the left navigation bar. Under the General settings, check the "This project is parameterized" option, as shown below.

For each parameter listed below, click on "Add Parameter" and enter the parameter name given below with the appropriate value.
NOTE: Adjust the values according to your account parameters.
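For reference, the Jenkinsfile above uses the following five parameters (plain string parameters are sufficient); the descriptions and example values are placeholders to adjust for your account:
SF_ACCOUNT - your Snowflake account identifier (e.g., xy12345.us-east-1)
SF_USERNAME - the Snowflake user created for this pipeline
SF_ROLE - the role that user should assume
SF_WAREHOUSE - the warehouse used to run the migrations
SF_DATABASE - DEMO_DB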

When you have entered all the parameters, click on the save button.
Run the Pipeline
In this step, manually run the new Jenkins Pipeline to deploy the first database migration script that was created. From the job overview page, click on the "Build with Parameters" option in the left navigation bar, as shown below:

Verify that all the parameters and values are correct, then click the Build button to start the Pipeline. If all goes well, it displays, “The build was successful."

Confirm Changes Deployed to Snowflake
Now that the database migration is deployed to Snowflake, log into your Snowflake account and confirm the Database Objects.
You should now see a few new objects in your DEMO_DB database, as follows:
A new schema DEMO and table EMPLOYEE (created by the first migration script)
A new schema SCHEMACHANGE and table CHANGE_HISTORY (created by schemachange to track the deployed changes)
Explore the contents of the CHANGE_HISTORY table to see where/how schemachange keeps track of state.
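For example, a quick query against the change history table (using the database and schema names from this walkthrough) shows one row per deployed script:
SELECT * FROM DEMO_DB.SCHEMACHANGE.CHANGE_HISTORY;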

CONCLUSION
In this blog, we saw how to set up a CI/CD pipeline using a Bitbucket repository and a Dockerized Jenkins environment. When a SQL script file is committed to the repository, the Jenkins pipeline runs that script in Snowflake.
Using this method, we can easily run SQL files from outside Snowflake to perform tasks in the Snowflake environment. We can also use it to deploy Snowflake scripts from a development environment to a production environment: place all the scripts in the repository, use the Jenkins pipeline parameters to specify the target Snowflake account details, and then run the pipeline.