Advanced Fabric Usage: Tasks Between Servers

Posted on 2012-12-27 13:45:00+00:00

Fabric is an excellent tool for deploying your sites and projects to servers. Though it's written in Python, you can use it to deploy just about anything. Typically, Fabric's execution model revolves around a remote server and your own local machine. But what if you want to perform an operation between two servers? This post will show you one approach.

First, as usual, our assumptions:

  1. You're using a recent version of Fabric. At least 1.3.
  2. You've defined two roles. For example: stage and production.
  3. You use a single host per role. Basically, you're using roles as named hosts.
  4. You're using SQLite for your database. This keeps the tutorial simple; you can use any database you like, and the technique remains the same.

With that out of the way, let's start.

For this tutorial, we're going to define a very basic backpublishing task. That means we're going to copy a database from a production server to a staging server. So create a fabfile.py and let's sketch out its skeleton:

from fabric.api import env, execute, get, put, task

env.roledefs = {
    'stage': ['stage.example.com'],
    'production': ['production.example.com'],
}

def get_database():
    pass

def put_database():
    pass

@task
def backpublish(source_role, target_role):
    pass

As you can see, our actual task is split into two parts, and therein lies the key to this technique: you split the operation into the parts that need to be executed on each server. Let's fill those parts out:

def get_database():
    # Runs on the source host: download its database to a local temp file.
    get('database.db', '/tmp/backpublish-database.db')

def put_database():
    # Runs on the target host: upload the local temp file as its database.
    put('/tmp/backpublish-database.db', 'database.db')

Remember: we don't combine these two parts into one function because they need to be executed on different servers. Let's fill out the main task to show how it's done:

@task
def backpublish(source_role, target_role):
    # Point each sub-step at its role, then let execute() run it
    # on the matching host(s).
    get_database.roles = (source_role,)
    execute(get_database)

    put_database.roles = (target_role,)
    execute(put_database)

And that's it! Now the different parts will be executed on different servers, without any low-level hacks or having to split your workflow into separate Fabric tasks. Of course, since we've pulled the role selection into the task itself, you'd call it from your shell like so: fab backpublish:production,stage
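To make the mechanism concrete, here's a toy, stdlib-only sketch of the dispatch idea (the names roledefs, execute, and get_database mirror the fabfile, but the implementation is mine, not Fabric's): execute reads the task's roles attribute, resolves it against the role definitions, and runs the function once per matching host.

```python
# A toy, stdlib-only model of the dispatch mechanism -- an illustration,
# NOT Fabric's actual implementation. (Real Fabric tasks take no host
# argument; the current host is exposed via env.host_string instead.)

roledefs = {
    'stage': ['stage.example.com'],
    'production': ['production.example.com'],
}

def execute(task):
    """Run ``task`` once per host resolved from its ``roles`` attribute."""
    hosts = [h for role in getattr(task, 'roles', ()) for h in roledefs[role]]
    return {host: task(host) for host in hosts}

def get_database(host):
    # Stands in for get('database.db', ...) running against ``host``.
    return 'downloaded from %s' % host

# Attaching ``roles`` before calling execute() steers the dispatch:
get_database.roles = ('production',)
print(execute(get_database))
# -> {'production.example.com': 'downloaded from production.example.com'}
```

Because the role lookup happens at call time, swapping the roles tuple between the two execute() calls is all it takes to run each half of the task against a different server.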

For reference, here's the entire fabfile.py:

from fabric.api import env, execute, get, put, task

env.roledefs = {
    'stage': ['stage.example.com'],
    'production': ['production.example.com'],
}

def get_database():
    # Runs on the source host: download its database to a local temp file.
    get('database.db', '/tmp/backpublish-database.db')

def put_database():
    # Runs on the target host: upload the local temp file as its database.
    put('/tmp/backpublish-database.db', 'database.db')

@task
def backpublish(source_role, target_role):
    # Point each sub-step at its role, then let execute() run it
    # on the matching host(s).
    get_database.roles = (source_role,)
    execute(get_database)

    put_database.roles = (target_role,)
    execute(put_database)
