I think most of you know the moment when you say to yourself: >> Hey, I'm doing this thing all the time. I should have a script for this! <<
The first thing you do is have a look at your collection of scripts in order to copy ’n’ paste all the boilerplate code you need to write a kickass script. If you are a Pythonista, there is a better way.
The title is not chosen without reason: I use Fabric for task automation all the time. First of all, let's install Fabric to get going. Just install it using pip:

```shell
pip install fabric
```
The following basic examples are also available on Runnable. You can use the shell at the bottom to try them yourself. We will utilize a quote by bon vivant Walter Sobchak in our examples, because learning from Walter means learning to succeed.
Let's start with a little piece of code to get familiar with fabric.
```python
def walter(name='Smokey'):
    """Tell smokey it isn't Nam."""
    print '%s, this is not Nam. This is bowling. There are rules.' % name
```
Nothing special here, just a normal Python function with a keyword argument.
So how much extra code do you need to execute this function from the command line? Actually zero lines, just put the code into a fabfile.py and run it.
```shell
# cd into fabfile.py folder
> cd /location/of/your/fabfile.py

# list available fabric tasks
> fab -l
Available commands:

    walter  Tell smokey it isn't Nam.

# run a fabric task
> fab walter
Smokey, this is not Nam. This is bowling. There are rules.

Done.

# pass arguments to tasks
> fab walter:Luke  OR  fab walter:name=Luke
Luke, this is not Nam. This is bowling. There are rules.

Done.

# chain tasks
> fab walter walter:Luke
Smokey, this is not Nam. This is bowling. There are rules.
Luke, this is not Nam. This is bowling. There are rules.

Done.
```
This was easy, wasn't it? And blazing fast to create. That's the reason why I use fabric all the time. When you spend most of your day writing Python code, using Fabric is a no-brainer.
What we have learned so far is that a fabric task is basically a normal Python function you define in your fabfile, and that you can execute that task using the fab command.
Remote execution? No problem!
So far, so boring! We haven't seen any fabric-specific code yet. Wouldn't it be nice to perform a task on a different machine?
That is where fabric will blow your mind!
The real magic of fabric comes from its extremely powerful set of operations that let you create complex tasks that execute across machines.
Fabric has SSH built in and makes it ridiculously easy to connect to remote hosts and execute shell commands. Let me tell you about some operations fabric has on board to achieve this.
- local (fabric.operations.local) - Runs a command on a local machine.
- run (fabric.operations.run) - Runs a command on a remote host.
- sudo (fabric.operations.sudo) - Runs a command on a remote host with sudo privileges.
- get (fabric.operations.get) - Copies a file from the remote host to the local machine.
- put (fabric.operations.put) - Copies a file from the local machine to the remote host.
- cd (fabric.context_managers.cd) - Changes the current working directory on the remote host.
- lcd (fabric.context_managers.lcd) - Changes the current working directory on the local machine.
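To build some intuition for what these operations do, here is a rough sketch of what local boils down to. This is explicitly not Fabric's real implementation, just an illustration that it wraps a shell command and hands the output back to your Python code.

```python
import subprocess

# Conceptual sketch only: Fabric's local() is, at its core, a thin
# wrapper around running a shell command and capturing what it prints.
# This is NOT Fabric's actual implementation.
def local(command, capture=True):
    result = subprocess.run(command, shell=True,
                            capture_output=capture, text=True)
    # With capture=True, hand the command's output back to the caller.
    return result.stdout.strip() if capture else None

print(local('echo "Walter, this is not Nam."'))
```

The remote counterparts (run, sudo) do the same thing over an SSH channel instead of a local subprocess.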
You are probably asking yourself: >> How the heck does fabric know which remote hosts to execute tasks on, and how does authentication work? <<
We need one more piece for this: the fabric env. env is the place where the global fabric configuration lives.
The most important env variables to get going are:
- env.hosts - Fabric will execute a task on every host in this list, one after another.
- env.user - Fabric will use this username to connect to the host. If not provided, the current username will be used.
- env.password - Fabric will use this password to connect to the host, or prompt for a password if it is not set.
A complete list of env variables can be found here.
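For intuition: env behaves like a dictionary whose entries can also be read and written as attributes. Here is a minimal stand-in for that behaviour, not Fabric's actual class, just a sketch of the access pattern you will use in fabfiles.

```python
# Minimal stand-in for fabric's env object (NOT the real implementation):
# a dict whose entries can also be accessed as attributes.
class AttributeDict(dict):
    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError:
            raise AttributeError(key)

    def __setattr__(self, key, value):
        self[key] = value

env = AttributeDict(hosts=[], user=None, password=None)
env.user = 'deploy'          # attribute style...
env['hosts'].append('web1')  # ...and dict style address the same data
print(env.user, env.hosts)
```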
Let's create a little more advanced fabfile to illustrate this.
```python
from fabric.api import env, run, task, hide

env.user = 'valid_username'
env.password = 'valid_password'

def get_hostname():
    with hide('output', 'running'):
        return run('hostname')

@task
def localhost():
    env.hosts.append('localhost')

@task
def remotehost():
    env.hosts.append('re.mote.i.p')

@task
def fab_walter():
    """Tell hostname it isn't Nam."""
    run('echo "%s, this is not Nam. This is bowling. '
        'There are rules."' % get_hostname())
```
Some things I want to point out here. Have a look at the @task decorator. As soon as you use it, only decorated functions become fabric tasks and are visible to the fab command. This allows you to define helper functions in your fabfile that cannot be called directly but can be used inside tasks. That's why the get_hostname function will not show up when you run fab -l.
Also note that the only thing the localhost and remotehost tasks do is add a string (a hostname or an IP) to the env.hosts list. This way the fab_walter task will be executed on these hosts when it is called in a chain. As long as the username and password provided are valid for SSH authentication, the fab_walter task should succeed on both hosts.
```shell
> fab localhost remotehost fab_walter
[localhost] Executing task 'fab_walter'
[localhost] run: echo "Local_Hostname, this is not Nam. This is bowling. There are rules."
[localhost] out: Local_Hostname, this is not Nam. This is bowling. There are rules.

Done.
Disconnecting from localhost... done.

[re.mote.i.p] Executing task 'fab_walter'
[re.mote.i.p] run: echo "Remote_hostname, this is not Nam. This is bowling. There are rules."
[re.mote.i.p] out: Remote_hostname, this is not Nam. This is bowling. There are rules.

Done.
Disconnecting from re.mote.i.p... done.
```
Right now we are only calling the hostname command on the targeted host, but I hope you can see the power in this. It means you can execute arbitrary commands on any remote host as if they were normal Python functions and use the output in your task logic. Furthermore, you can easily execute commands on your local machine and on remote hosts in the same task, or even copy files between them.
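The execution model behind the chained call above can be sketched in a few lines of plain Python. This is a conceptual model only, not Fabric's real code: tasks run left to right, and each task runs once per entry in env.hosts (or just once if no hosts have been set yet), which is why the host-setting tasks must come first in the chain.

```python
# Conceptual model of `fab localhost remotehost fab_walter`
# (NOT Fabric's actual implementation).
current_host = None
hosts = []   # stands in for env.hosts
log = []

def localhost():
    hosts.append('localhost')

def remotehost():
    hosts.append('re.mote.i.p')

def fab_walter():
    log.append('fab_walter on %s' % current_host)

def fab(*tasks):
    global current_host
    for task in tasks:
        # Iterate over a snapshot so host-setting tasks can grow the list;
        # a task with no hosts yet simply runs once.
        for host in (list(hosts) or [None]):
            current_host = host
            task()

fab(localhost, remotehost, fab_walter)
print(log)
```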
Cleartext Passwords in the Fabfile? Seriously?
Not really. We should not do this! I use SSH public-key authentication on all my machines. This means you can log into a machine without a password as long as your public key exists on the remote host. So there is no need to define a password in your fabfile.
If you want to take this even further, add this line to your fabfile to enable SSH agent forwarding.

```python
env.forward_agent = True
```
What this does is forward your local SSH agent to the remote server, so that you can establish SSH connections from the remote server using the private key of your local machine. For example, you could clone one of your private GitHub repositories to the remote machine without providing any credentials.
```python
from fabric.api import cd, run

def clone_private_repo():
    with cd('~/repos'):
        # Use the SSH clone URL so the forwarded agent key is used
        # (the anonymous git:// protocol would not authenticate at all).
        run('git clone git@github.com:your-user/private-repo.git '
            'local-private-repo')
```
The task above will succeed on the remote host as long as your local private key is authorized to access your GitHub account.
How I use Fabric
I hope you have a feeling by now for what fabric is and why it's great. I have a fabfile in the root path of most of my repositories. I use it to bundle all the actions related to a project.
Be it packaging, deployment, or just a series of commands that I don't want to remember. When I resume working on a project I haven't touched for a while, I find all the related tasks neatly grouped in my fabfile, so I can instantly get back to work.
Real-World Use Case
Let me outline a workflow I created for myself and use every day to deploy applications. It is heavily inspired by a blog post by Hynek Schlawack.
- [developer machine] push latest changes from local repository to remote
- [buildslave] clone the remote repo based on the local repo we just pushed
(possible because of agent-forwarding)
- [buildslave] determine current version based on git tags of remote repo
- [buildslave] create an RPM package based on the determined version
- [buildslave] upload new package to yum repository hosted on Amazon S3
- [all hosts in environment] update to package version we just created
This is what I have to do for this:
```shell
> cd /root/of/repo
> fab testing package push deploy
```
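A hypothetical skeleton of the fabfile behind that chain might look like the sketch below. The task names mirror the workflow steps above; the bodies are stubbed out, since the real versions use fabric's run/local/put against my own build and target hosts.

```python
# Hypothetical skeleton of the deployment fabfile (task bodies stubbed;
# real ones would use fabric.api's run/local/put).
def testing():
    """Point env.hosts at every host in the testing environment."""
    return 'hosts: testing'

def package():
    """On the buildslave: clone the repo, read the version from git tags, build an RPM."""
    return 'rpm built'

def push():
    """Upload the freshly built package to the yum repository on S3."""
    return 'package pushed'

def deploy():
    """Update every host in env.hosts to the new package version."""
    return 'deployed'

# `fab testing package push deploy` simply runs the tasks in this order:
steps = [t() for t in (testing, package, push, deploy)]
print(steps)
```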
Pretty powerful stuff, isn't it?
Fabric and Chef
For bigger server environments I use Chef for configuration management. The problem with Chef is that it lacks support for deployment orchestration. Fabric is actually pretty good at this. In order to deploy with Fabric, I need to query Chef for all hosts in an environment. I use PyChef to get these hosts.
```python
from fabric.api import env, task
import chef

api = chef.autoconfigure()

def get_hosts_in_env(environment):
    """Finds all hosts in a Chef environment."""
    hosts = []
    hostname_attr = ['cloud.public_hostname', 'fqdn']
    for row in chef.search.Search('node', 'chef_environment:%s' % environment):
        if row:
            if callable(hostname_attr):
                hosts.append(hostname_attr(row.object))
            else:
                for attr in hostname_attr:
                    try:
                        hosts.append(row.object.attributes.get_dotted(attr))
                        break
                    except KeyError:
                        continue
                else:
                    raise chef.exceptions.ChefError(
                        'Cannot find a usable hostname attribute for node %s'
                        % row.object)
    return hosts

@task
def testing():
    """set environment to testing"""
    env.hosts = get_hosts_in_env('testing')

@task
def deploy():
    """deploy the application"""
    pass
```
This is actually a more advanced version of the concept we saw before. We query Chef for all hosts in an environment and write that information to the env.hosts list. This way, all the tasks called after the testing task in the fab command will run on every host in the testing environment.
I want to finish my introduction to Fabric with a simple piece of advice:
Start playing around with Fabric!
You will realize it is a great productivity enhancer. The barrier to entry is quite low, and in a little while you will find yourself doing things with fabric that you wouldn't ever have thought of writing a bash script for. At least this was the case for me.
No excuses, start using Fabric!