Bootstrap Django applications with Fabric
For full-stack developers, there are many laborious, repetitive tasks standing between you and the fun stuff, and carried out daily they add up to a lot of "non-productive" time and effort. If you are like me, you are probably tired of the process of setting up, deploying or releasing projects, or worse, not just tired of it but scared of it. I was like that until I found Fabric, which made all of these chores a breeze.
Setting up web projects is hard
I am using Django as the example for this post because I work with it almost every day, but the problems it poses are common across web development. When I start a new Django project, I need to carry out a number of activities before I can actually start developing the application. Some of these are listed here:
- Create a new virtualenv.
- Pip install the Python libraries.
- Create a database if needed. (I actually need this 100% of the time)
- Run syncdb.
Also, if I am working on a project which I did not create, besides all the steps above, I also need to:
- Run south migration.
- Import some testing data.
- Run existing tests.
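Strung together, the manual workflow above looks roughly like this; the project and database names here are hypothetical placeholders, and each string is a shell command I would otherwise type by hand:

```python
# Each entry is one manual bootstrap step from the lists above.
manual_steps = [
    'virtualenv env_myproject',                           # create a new virtualenv
    'env_myproject/bin/pip install -r requirements.txt',  # pip install the libraries
    'createdb myproject_db',                              # create a database
    'python manage.py syncdb',                            # run syncdb
    'python manage.py migrate',                           # run the south migrations
    'python manage.py loaddata testing_data.json',        # import some testing data
    'python manage.py test',                              # run existing tests
]

for step in manual_steps:
    print(step)
```

It is exactly this kind of command list that I wanted to turn into a single repeatable command.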
From the Fabric documentation
Fabric is a Python library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
Fabric does two things: it runs tasks, and it can SSH into other machines to run those tasks there. So it does not just solve our project-bootstrapping problem but also the deployment problem, which I will discuss at a later date. A shell script might be able to do the same, but I bet I would need to write a lot more to achieve the same result.
To install Fabric:
pip install fabric
You then need to create a Python file called fabfile.py. You can put any Python code in there; at the end of the day it's just a Python file. But obviously we'd like to use some Fabric functions to make bootstrapping a new web project easier. For a detailed step-by-step tutorial, please consult the Fabric site; I will only show you some useful commands and how I use them to bootstrap my Django projects. For example, to install PostgreSQL on a Debian Linux machine:
```python
from fabric.api import env, local, run, settings, task

env.hosts = ['18.104.22.168']  # Some remote host's IP

@task
def install_postgres_locally():
    with settings(user='root'):
        local('apt-get install postgresql postgresql-client')

@task
def install_postgres_remotely():
    with settings(user='root'):
        run('apt-get install postgresql postgresql-client')
```
Then you can run `fab install_postgres_locally`. This will install PostgreSQL for you locally as root; it will ask you for the root password, and once you enter it you will see the normal output as PostgreSQL installs. If you instead run `fab install_postgres_remotely`, Fabric will SSH into the remote server set via env.hosts; after that, it's the same as installing the package locally.
This is just a simple command to show what Fabric is capable of; it has many more features than I can list here. But as you can see, it's quite simple: just Python functions, and those functions can run shell commands. Another library I want to introduce here is fabtools. It's built on top of Fabric and makes some frequently used commands even easier. In the example above, apt-get install will print a lot of output you probably don't want to see, and you have to keep typing Y to continue; to keep it quiet you can run apt-get install -y -q instead. But what if you don't remember all those flags? fabtools makes this easy; you can write a task like this:
```python
from fabric.api import settings, task
from fabtools import deb

@task
def install_postgres():
    with settings(user='root'):
        deb.install('postgresql postgresql-client')
```
So you don't even need to remember apt-get. More importantly, it makes installing packages on different Linux distros easier. I use Arch Linux daily, but some of my colleagues use Debian, openSUSE or CentOS; fabtools supports the most popular Linux distros, so for my laptop I can do:
```python
from fabric.api import settings, task
from fabtools import arch

@task
def install_postgres():
    with settings(user='root'):
        arch.install('postgresql postgresql-client')
```
So do you know how to install packages on Arch Linux?
You don't have to; fabtools has got your back (it's pacman -S if you're curious; how cool is it that a Linux distro's package manager is called pacman?). Fabric has an environment dictionary, env, which contains some default values. From the documentation:
While it subclasses dict, Fabric’s env has been modified so that its values may be read/written by way of attribute access. In other words, env.host_string and env['host_string'] are functionally identical.
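A minimal sketch of how such a dict subclass can work; this illustrates the idea, not Fabric's actual implementation:

```python
class AttributeDict(dict):
    """A dict whose entries can also be read and written as attributes,
    mirroring how Fabric's env object behaves."""

    def __getattr__(self, key):
        try:
            return self[key]
        except KeyError:
            raise AttributeError(key)

    def __setattr__(self, key, value):
        self[key] = value


env = AttributeDict()
env.host_string = 'example.com'             # attribute-style write
assert env['host_string'] == 'example.com'  # dict-style read of the same entry
```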
So for example, if you don't set env.user, its default value is the user you are currently logged in as on your machine, but you can reassign any value to it. I bring this up because it's very useful for making your fabfile tasks more generic. I don't want to rewrite my fabfile functions for every project; why else would I use it in the first place? My way of reusing fabfile tasks may not be the "True Way", but I find it useful and easy. What I've done is create another Python file which contains everything specific to the project, import it into the fabfile, and then overwrite env with the values set in that file. I named this file server_conf.py; in it I have:
```python
# server_conf.py
SYSTEM_PACKAGES_NEEDED = "python-pip git"
USER = "my_project_user"
HOSTS = ["my_project_server_ip"]
PROJECT_NAME = "project_name"
REPO_URL = "repo_url"
REQUIREMENT_FILE = "requirements.txt"
DB_NAME = "project_db_name"
DB_USER_NAME = "project_db_user_name"
DB_PASSWORD = "project_db_password"
PROJECT_PATH = ""
VIRTUALENV_PATH = ""
```

```python
# fabfile.py
from fabric.api import env

import server_conf

env.hosts = server_conf.HOSTS
env.user = server_conf.USER if server_conf.USER else env.user
env.system_packages = (server_conf.SYSTEM_PACKAGES_NEEDED
                       if server_conf.SYSTEM_PACKAGES_NEEDED else "")
env.project_name = server_conf.PROJECT_NAME if server_conf.PROJECT_NAME else ""
env.repo_url = server_conf.REPO_URL if server_conf.REPO_URL else ""
env.db_name = server_conf.DB_NAME if server_conf.DB_NAME else ""
env.db_user_name = server_conf.DB_USER_NAME if server_conf.DB_USER_NAME else ""
env.db_password = server_conf.DB_PASSWORD if server_conf.DB_PASSWORD else ""
env.requirement_file = (server_conf.REQUIREMENT_FILE
                        if server_conf.REQUIREMENT_FILE else "")
env.project_path = (server_conf.PROJECT_PATH if server_conf.PROJECT_PATH
                    else "/home/%s/%s/" % (env.user, env.project_name))
env.virtualenv_path = (server_conf.VIRTUALENV_PATH if server_conf.VIRTUALENV_PATH
                       else "%svirtual_env_%s" % (env.project_path, env.project_name))
```
So the only file I need to edit for my projects would be the server_conf.py file. As you can see I don't need to set everything, in the fabfile.py I set some default values if the variable does not exist in the server_conf.py file. Actually I don't have to set everything as attributes of the env, but I'd like the variables to be consistent, so when I use them in the tasks, I don't have to think if it's in the env or not. Also, Fabric env is a global singleton, so if in some circumstances I need to use it in other files, I can use the values I previously set by importing env.
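As an aside, the repetitive `X if X else default` pattern in the fabfile above could be collapsed with a small helper built on getattr. A sketch, with `server_conf` stubbed out by a hypothetical namespace so it runs standalone:

```python
import types

# Stand-in for server_conf.py, with hypothetical values for illustration.
server_conf = types.SimpleNamespace(USER='my_project_user', PROJECT_NAME='')


def conf(name, default=''):
    """Read a setting from server_conf, falling back to `default`
    when the setting is missing or empty."""
    value = getattr(server_conf, name, None)
    return value if value else default


print(conf('USER'))                   # set in server_conf, so used as-is
print(conf('PROJECT_NAME', 'demo'))   # empty string, so the default wins
print(conf('DB_NAME', 'project_db'))  # absent entirely, so the default wins
```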
How about bootstrapping projects?
Before running any tasks, first I will show you the imports.
```python
from fabric.api import env, task, settings, run, cd
from fabric.contrib.files import exists, append
from fabtools import require, deb, arch
from fabtools.python import virtualenv, install_requirements
from fabtools.require import postgres

import server_conf

# ...followed by the same env assignments from server_conf shown earlier.
```
Install packages on a Debian machine.
```python
@task
def debian_install(packages):
    """Install packages on Debian.

    `packages` is a space-separated string of package names,
    e.g. 'nginx python-pip'.
    """
    with settings(user='root'):
        deb.install(packages.split(' '))
```
Create new directories.
```python
@task
def create_directories(directories_names):
    """Create directories.

    `directories_names` is a space-separated string of directory names,
    e.g. 'project1 project2'.
    """
    for directory in directories_names.split(' '):
        run('mkdir %s' % directory)
```
Create a new virtualenv.
```python
@task
def create_virtualenv(virtualenv_path):
    require.python.virtualenv(virtualenv_path)
```
Use the virtualenv we just created to install the Python libraries listed in the requirements file. This does not need a task of its own, because fabtools handles it.
Set up the database; here again I use some of the helpers fabtools provides.

```python
with settings(user='root'):
    postgres.server()
    require.postgres.user(env.user, password=env.db_password)
    postgres.database(env.db_name, owner=env.user)
```
Run syncdb and the migrations for my database.
```python
with cd(env.project_path):
    with virtualenv(env.virtualenv_path):
        run('python manage.py syncdb')
        run('python manage.py migrate')
```
Lastly, I've created a method called bootstrap_debian to call all these tasks.
```python
@task
def bootstrap_debian():
    debian_install(env.system_packages)
    with cd('/home/%s' % env.user):
        create_directories(env.project_name)
    with cd(env.project_path):
        svn_check_out(env.repo_url)
        create_virtualenv(env.virtualenv_path)
        with virtualenv(env.virtualenv_path):
            install_requirements(env.project_path + env.requirement_file)
    with settings(user='root'):
        postgres.server()
        require.postgres.user(env.user, password=env.db_password)
        postgres.database(env.db_name, owner=env.user)
    with cd(env.project_path):
        with virtualenv(env.virtualenv_path):
            run('python manage.py syncdb')
            run('python manage.py migrate')
```
All the tasks above are in fabfile.py; now, to bootstrap a new project, I just need to call `fab bootstrap_debian`.
Bootstrapping a project this way makes my life a lot easier. You may have noticed that I didn't use any local commands; I could be wrong, but I've found the local command is not as powerful as run. So if I need to bootstrap a project on my local machine, I just put my own machine's IP into the server_conf.py file. This actually keeps my fabfile cleaner, because I don't need to write separate tasks for my local machine. The only problem (not for me) is that you need an SSH daemon running on your machine.
There are many APIs in Fabric and fabtools that I did not cover, because there are a lot of them and I will probably never need most. I'd also like to grow my fabfile.py with more tasks, such as auto-detecting the current Linux distro and then using the appropriate command to install packages. You can find my fabfile.py on my GitHub; feel free to point out mistakes or send me a pull request with your tasks.
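For the distro auto-detection idea, one approach is to read /etc/os-release (present on most modern distros) and dispatch on its ID field. A sketch of just the parsing part, with the dispatch to deb/arch/etc. left out:

```python
def detect_distro(os_release_text):
    """Return the distro ID (e.g. 'debian', 'arch') from the contents
    of /etc/os-release, or 'unknown' if no ID line is found."""
    for line in os_release_text.splitlines():
        if line.startswith('ID='):
            return line.split('=', 1)[1].strip().strip('"')
    return 'unknown'


# In a Fabric task this text would come from run('cat /etc/os-release');
# here we parse sample contents directly.
print(detect_distro('NAME="Debian GNU/Linux"\nID=debian'))
print(detect_distro('NAME="Arch Linux"\nID=arch'))
```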
One thing I'd like to point out: I have never used SaltStack or Ansible, and I believe both can do what Fabric offers and are even more powerful, but they're also more complicated. They're definitely worth checking out though 🙂 Hopefully this blog post helps you bootstrap your projects a little more easily 🙂