Deployment Automation via SSH With Python Fabric: How It Works

by Aswin G, April 17th, 2021

Too Long; Didn't Read

Python Fabric is an open-source Python library used to execute commands remotely over SSH. It is compatible with Python 2 and 3. This article shows how to deploy a basic project to a remote server through Gitlab CI using Fabric. Because Fabric uses a library called Paramiko under the hood for handling the SSH side of things, connection details such as the username and key can be passed straight through to Paramiko's SSH client.

One of the most basic ways in which a project gets deployed is by SSHing into a remote host and executing a few commands. Apart from deployment, this is also useful for running any command you want on a remote host, such as when a CI/CD pipeline is triggered. In this article I'll take a look at how to deploy a basic project to a remote server through Gitlab CI using Python Fabric.

What is Fabric?

Fabric is an open-source Python library used to execute commands remotely over SSH.

Fabric is compatible with Python 2 and 3. To use it with Python 3, install it with:

pip3 install fabric

Since the end goal here is to use Fabric in an automated deployment pipeline, you don't actually need to have Fabric installed on your local machine (unless that is where your pipeline runs).

Installing Fabric also installs the fab binary stub, which essentially allows Fabric to read and execute tasks defined in a file called the fabfile.

Creating a fabfile

Create a file named fabfile.py and start out with the following code:

from fabric import task


@task
def deploy(ctx):
    print("Inside the task!")

This defines a task called "deploy" which can be passed as an argument to the fab binary for execution. The @task decorator (a decorator is a higher-order function: a subroutine that takes another subroutine as a parameter and returns a new one in its place) is used to convert the deploy function into a task that can be executed by the fab binary. The function must take a context argument, which is named ctx in this case.
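
To make the decorator mechanics concrete, here is a tiny standalone illustration. This is not Fabric's actual implementation, just the general shape of a decorator:

# A toy decorator for illustration only - not Fabric's real @task.
def my_task(func):
    def wrapper(ctx):
        print(f"Running task {func.__name__}")
        return func(ctx)
    return wrapper

@my_task
def greet(ctx):
    print("Inside the task!")

greet(None)  # Prints "Running task greet" followed by "Inside the task!"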

On executing this by running fab deploy from the same directory as fabfile.py, "Inside the task!" will be printed as the output.

Now let's use this to do what Fabric is intended for: connecting to a remote host and executing commands on it.

from fabric import Connection, task


@task
def deploy(ctx):
    with Connection("HOST") as c:
        with c.cd("/home/project/path/"):
            c.run("docker-compose down")
            c.run("git pull origin master --recurse-submodules --rebase")
            c.run("docker-compose up --build -d")

Here, replace HOST with the name or IP address of the host to which you are establishing a connection. From there, you can execute any command that the user you are logged in as has permission to run. c.cd() is used to ensure the remaining commands are executed from that particular folder. The commands I have given after this with c.run() are just examples; replace them with what you need to execute.
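
Each call to c.run() also returns a result object you can inspect, which is handy for aborting a deployment early when a command fails. Here is a small sketch using the same placeholder host and path; warn=True keeps a failing command from raising immediately so we can handle it ourselves:

from fabric import Connection, task


@task
def deploy(ctx):
    with Connection("HOST") as c:
        with c.cd("/home/project/path/"):
            # warn=True means a non-zero exit code doesn't raise an exception,
            # so we can check the result and decide what to do.
            result = c.run("git pull origin master", warn=True)
            if not result.ok:
                print(result.stderr)
                raise SystemExit("git pull failed, aborting deployment")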

In this example I'm making use of Python context managers with the help of with blocks, but there are many other ways to do this as well. You can read up on the Fabric documentation to explore other options.
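
For example, a Connection doesn't have to be used as a context manager at all; you can call its methods directly and close it yourself. A rough equivalent of the snippet above, with the same placeholder values:

from fabric import Connection, task


@task
def deploy(ctx):
    c = Connection("HOST")
    # Without the c.cd() context manager, the directory change can simply be
    # part of the command string itself.
    c.run("cd /home/project/path/ && docker-compose up --build -d")
    c.close()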

By now you might have noticed that just the host name is often not enough to establish a connection. What about the username and the private key needed for an SSH connection? This isn't immediately obvious, but because Fabric uses a library called Paramiko under the hood for handling the SSH side of things, we can pass arguments to Paramiko's SSHClient.connect method by using the connect_kwargs argument of Fabric's Connection. This looks like the following:

from fabric import Connection, task


@task
def deploy(ctx):
    with Connection(
        "HOST",
        user="USERNAME",
        connect_kwargs={"key_filename": "~/.ssh/your_key"}
    ) as c:
        with c.cd("/home/project/path/"):
            c.run("docker-compose down")
            c.run("git pull origin master --recurse-submodules --rebase")
            c.run("docker-compose up --build -d")

Paramiko's SSHClient.connect takes a key_filename argument, which specifies the path of the key file to be used. Now you are passing all the information required to establish a connection via SSH.

You can also pass the SSH key as an instance of Paramiko's pkey.PKey class, or make use of a number of other options, which you can find in Paramiko's documentation.
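
For instance, if you'd rather keep the key out of the filesystem entirely, you could load it from an environment variable and hand Paramiko a key object instead of a path. A hedged sketch, assuming a variable named DEPLOY_KEY holds an RSA private key in PEM format:

import io
import os

import paramiko
from fabric import Connection, task


@task
def deploy(ctx):
    # Build a Paramiko key object from the key text stored in DEPLOY_KEY
    # (an assumed variable name) instead of pointing at a file on disk.
    key = paramiko.RSAKey.from_private_key(io.StringIO(os.environ["DEPLOY_KEY"]))
    with Connection("HOST", user="USERNAME", connect_kwargs={"pkey": key}) as c:
        c.run("whoami")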

Of course, it's much safer to load all of these as environment variables. When using Gitlab CI, while setting the environment variables you can choose to make the key available as a file instead of a string, which is needed to make it work with the key_filename argument.

Here is a screenshot of the Variables section under Gitlab CI/CD settings with the hostname added as a string and the key added as a file.

You might want to load these values in other ways depending on how your team is organized and as per your security requirements.

With these environment variables in place, the fabfile changes to this:

import os
from fabric import Connection, task


@task
def deploy(ctx):
    with Connection(
        os.environ["HOST"],
        user="USERNAME",
        connect_kwargs={"key_filename": os.environ["DEPLOY_KEY_FILE"]},
    ) as c:
        with c.cd("/home/project/path/"):
            c.run("docker-compose down")
            c.run("git pull origin master --recurse-submodules --rebase")
            c.run("docker-compose up --build -d")

Setting up Gitlab CI

Now that the fabfile is ready, all that's left is to execute it with the fab command. This can be done in any way you want - manually, using GitHub Actions, etc. Here I have an example of running it with Gitlab CI.

To do this, create a .gitlab-ci.yml file at the project root. An example is given below:

image: "python:3.6"

stages:
  - deploy

deploy_to_production:
  stage: deploy
  script:
    - pip3 install fabric
    - fab deploy
  only:
    - master

This is a basic configuration file that should be easy to understand if you're familiar with Gitlab CI. It has a single job called deploy_to_production that uses a Python 3 image as its base and runs two commands: pip3 install fabric to install Fabric, and fab deploy, which makes Fabric read our fabfile.py and execute the task named "deploy" in it.

Conclusion

Fabric has quite a lot more options than what is outlined here, such as sending the sudo password to the remote host when it's required, dealing with multiple hosts, etc. It's worth going through the documentation if you're planning to make use of it.
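
As a taste of those options, here is a rough sketch; the host names and the SUDO_PASSWORD variable are placeholders, so check the Fabric docs for the exact configuration you need:

import os

from fabric import Config, SerialGroup

# Supply the sudo password through Fabric's configuration system.
config = Config(overrides={"sudo": {"password": os.environ["SUDO_PASSWORD"]}})

# SerialGroup targets several hosts one after the other; each item behaves
# like a regular Connection.
for connection in SerialGroup("web1.example.com", "web2.example.com", config=config):
    connection.sudo("systemctl restart my-service")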

In conclusion, this is a really simple and effective way to programmatically build scripts that are executed over SSH and to trigger them through common mechanisms such as a CI/CD tool.

I hope you found this post useful. You can find me on Twitter and LinkedIn.