Docker in Production at RustProof Labs
Warning: This post is outdated. It is here for reference purposes only.
RustProof Labs has been using Docker in production since May 2015. The switch to Docker hasn't been completely pain-free, but it has already proven to be a major improvement. This post attempts to cover, at a high level, how Docker has been used to save time and resources. The three main services I currently run in Docker containers on my main VPS are:
- RustProof Content
- PostgreSQL
- OpenVPN Server
At this point, Nginx and SSH are the only two main services running on the base VPS; everything else runs inside Docker containers. RustProof Labs' production servers run on entry-level Digital Ocean droplets, ringing in at $5/month. The base OS is currently Debian 7, but it could be any operating system that supports Docker.
RustProof Content is always running on the VPS in two Docker instances. Nginx (on the base server) acts as a load-balancing proxy between the two containers. Once a day a simple bash script rebuilds each container from source, one after the other. If I want to push new content to the site, I simply push to my private Git repository and manually trigger the build script.
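A rebuild script along these lines could look like the sketch below. The image name, container names, paths, and ports are all assumptions for illustration, not the actual script:

```shell
#!/usr/bin/env bash
# Hypothetical daily rebuild script; image/container names, paths, and
# ports are assumptions. Containers are replaced one at a time so Nginx
# always has at least one healthy backend to proxy to.
set -euo pipefail

rebuild() {
  local name="$1" port="$2"
  # Rebuild the image from the latest source in the private Git repo.
  docker build -t rustproof-content /srv/rustproof-content
  # Tear down the old container and start a fresh one on the same port.
  docker rm -f "$name" 2>/dev/null || true
  docker run -d --name "$name" -p "127.0.0.1:${port}:5000" rustproof-content
}

# Invoked once a day by cron (or manually after a content push):
#   rebuild content-a 8001
#   rebuild content-b 8002
```

Because the two containers are replaced sequentially, the site stays up throughout the rebuild.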
Presto, updated site. If something goes awry, I can bring my old containers back.
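For reference, the load-balancing side of this on the base VPS could be an Nginx upstream along these lines (server name, ports, and headers are assumptions, not the actual config):

```nginx
# Hypothetical upstream; the ports must match wherever the two
# containers publish their web ports on the host.
upstream rustproof_content {
    server 127.0.0.1:8001;
    server 127.0.0.1:8002;
}

server {
    listen 80;
    server_name rustprooflabs.com;

    location / {
        # Nginx alternates requests between the two containers, so one
        # can be rebuilt while the other keeps serving traffic.
        proxy_pass http://rustproof_content;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```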
Rewind to early 2015, when I decided to write my own blog platform using Python. I quickly discovered headaches with deploying Python 3+ applications to Debian-based machines. I ran into kernel issues, Python version issues, path issues, and more. Much of this is because the Python package management systems are fractured and incomplete. I won't go into more detail; it's already been said elsewhere. The point is, I've gone through the headache of getting Python (Flask) bundled with uWSGI and all the dependencies I need.
The user interface I'm developing for the Garden Tracking database I wrote about earlier this year uses the same RustProof Content Docker image; it simply swaps in a different Flask application for that project. Deployment is identical to RustProof Content, essentially just a different Git repository. Huge win there for using Docker.
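As a sketch of how that reuse could work, the project-specific image only has to change which application source it pulls in. The base image name, paths, and uWSGI entry point below are assumptions, not the actual Dockerfile:

```dockerfile
# Hypothetical Dockerfile; base image name, paths, and the uWSGI entry
# point are assumptions. Only the application source changes per project.
FROM rustproof/flask-uwsgi-base

# Swap this COPY (or a git clone) for whichever Flask project is deployed.
COPY . /app
WORKDIR /app
RUN pip install -r requirements.txt

CMD ["uwsgi", "--ini", "uwsgi.ini"]
```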
In 2014 I worked very hard to migrate all of RustProof Labs exclusively to PostgreSQL databases. Since June 2015 my PostgreSQL needs have been completely moved to Docker. This was a big challenge for me because I was trying to embrace the "throwaway" nature of containers, but it was tough to silence the "Data is sacred!" voice. I experimented with data volumes, data-only containers, and some other (terrible) ideas of my own. None of those seemed to jibe with me, though, and instead I have come to practice and trust database backups even more.
I use Jenkins CI to run a job that's mostly just a typical PostgreSQL backup. Granted, it is from a database that just happens to be running inside a Docker container, but that fact doesn't change the process. The cool part: because a new, clean PostgreSQL instance is just a matter of `sudo docker run...`, Jenkins goes on to create a brand new PostgreSQL instance identical to the one it just backed up from. The backup is then tested immediately as part of the success criteria of the job. At a bare minimum this process ensures the backup will restore successfully, and I sleep better at night. That was not so easy to do while using a single, shared PostgreSQL instance.
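The job boils down to something like the following sketch. The container name, ports, image tag, and database name are assumptions for illustration, not the actual Jenkins job:

```shell
#!/usr/bin/env bash
# Hypothetical backup-and-verify job. Any failure (dump, docker run,
# restore) makes the Jenkins job fail, so a backup that passes has
# already been restored successfully at least once.
set -euo pipefail

backup_and_verify() {
  local src_port="$1" dbname="$2"
  local dump="/tmp/${dbname}.dump"

  # 1. Typical PostgreSQL backup (custom format) from the live container.
  pg_dump -h 127.0.0.1 -p "$src_port" -Fc -f "$dump" "$dbname"

  # 2. A clean PostgreSQL instance is just one `docker run` away.
  docker run -d --name verify-pg -p 127.0.0.1:5999:5432 postgres
  sleep 10  # crude wait for the fresh server to accept connections

  # 3. Restore the dump into the throwaway instance; a nonzero exit
  #    anywhere fails the job.
  createdb -h 127.0.0.1 -p 5999 "$dbname"
  pg_restore -h 127.0.0.1 -p 5999 -d "$dbname" "$dump"

  # 4. Success: tear the verification instance back down.
  docker rm -f verify-pg
}

# Jenkins would call something like: backup_and_verify 5432 rustproof
```

The verification instance is pure cattle: it exists only long enough to prove the backup restores, then gets thrown away.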
My most recent Docker addition to my production environment is a VPN service using OpenVPN Server. I don't have much else to say about this yet since I've only been using it for a short time, but I will hopefully write more about it in the future. Before I set it up I was a bit worried that routing from Colorado through New York would make daily work over the VPN unusable, but I've experienced the exact opposite! I've been running Pandora, downloads to virtual machines, and Citrix all while connected, and it has been a great experience so far.
Docker in "Almost" Production
Outside of the three items above, having IPython Notebook servers deployed with Docker has become almost indispensable. When I'm developing scripts, programs, or just playing around, the interactive interface of an IPython Notebook can't be beat. They're configured to require HTTPS connections, set with a secure default password, and have all the Python modules I regularly use preinstalled.
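Standing one of these up could look roughly like the sketch below. The image name, environment variable, cert paths, and port are assumptions, not the actual setup:

```shell
# Hypothetical launcher for a password-protected, HTTPS-only notebook
# server; image name, env var, cert paths, and port are assumptions.
start_notebook() {
  docker run -d --name notebook \
    -p 443:8888 \
    -e "PASSWORD=${NOTEBOOK_PASSWORD:-changeme}" \
    -v /srv/notebooks:/notebooks \
    -v /etc/ssl/notebook:/certs:ro \
    ipython/notebook
}
```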
Docker Container Lifespan
I've seen the comparison of Cattle vs Kittens a number of times, and my Docker usage represents a mix of both. I have Cattle in the form of RustProof Content, which was developed to be built from source into a Dockerized environment. I rebuild that daily just because. I obviously don't update my site daily, but it's nice knowing that my site was built less than 24 hours ago.
I also have Docker kittens, which are my database instances. I have a couple containers that are about 2 months old now. I might let those go another month or two before I migrate them.
Docker is a great product. There are still a lot of people who consider it too immature for production use, but at least for me it has been immensely beneficial. The biggest benefit has been that my virtual servers only require a base installation with Docker and SSH. Everything else is done in containers, and dependency management is much easier. While troubleshooting issues in Docker isn't necessarily delightful, it shifts the pain into a one-time pain.
Published September 29, 2015
Last Updated June 20, 2018