DevOpsing at Home

I remember the days when SysAdmins bragged about server uptimes that were sometimes measured in years. I have been out of the SysAdmin world for quite a while, focusing on software development, and somewhere along the way, a small revolution happened. Here at Threat Stack, our DevOps team embraces immutable infrastructure, which allows us to spin down problematic servers and spin up brand new clean instances in a matter of minutes. Impressed with this approach, I started to look for a way to bring some of these concepts home.

What is DevOps?

Most likely there are fancy academic definitions of what DevOps is precisely, and you can get a sense of it at a glance from this Venn diagram from Wikipedia:

[Figure: DevOps Venn diagram from Wikipedia]

But to me, DevOps is a methodology for automating SysAdmin work while reducing the friction needed to accomplish tasks. It might be somewhat of a stretch, but I find it closely related (in spirit) to Test Driven Development (TDD), of which I have been a huge supporter.

The Problem

Like many others, I have a dedicated server running at home for basic home computing tasks — file sharing, backups, CCTV, and home automation. It’s nothing fancy — just a basic off-the-shelf, headless PC with RAID1. It runs on a recent Ubuntu Server, and that takes care of all my home computing needs. But there’s a small, yet annoying problem with potentially devastating consequences.

It’s very easy to let a home server become stale, out-of-date, and vulnerable. Scarily enough, that happens to too many production servers as well — forgotten and unmonitored — easy targets to become soldiers in a DDoS army or just proxies in clandestine activities (unless you’re using a platform like Threat Stack Cloud Security Platform®). Unlike production server SysAdmins, I have many excuses for neglecting updates — it’s easy to convince myself that no one cares about my server: if it ain’t broke, don’t fix it, etc. Of course, those are just sad excuses.

But maintaining a system without the need for constant intervention is a real challenge. I realized that it only takes me a few months to forget why, how, and where everything is set up. Upgrading environment-specific software (as opposed to OS updates) via a package manager becomes an annoying chore, taking far more time than I want to spend, which leads to neglect.

The Solution

Here at Threat Stack we use Chef and AWS for our immutable infrastructure. So I started thinking about how I could bring something like that home. It would be nice not only to configure something once, but also to have a record of how it was done, along with the ability to easily repeat those steps for the specific applications I run on my server. After some research, I found that Docker and Docker Compose fit the bill quite well.

If you are not familiar with Docker, it's a platform that runs pre-configured, immutable software containers side by side while keeping them isolated from each other (like virtual machines, but without the overhead of a separate OS for each). You can create your own containers or use existing ones (possibly with your own configuration to fit your needs), which are oftentimes provided by the original developers or maintainers.
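
As a quick illustration (the image name and ports here are just placeholders, not part of my actual setup), running an existing container from Docker Hub is a one-liner:

```
# Run the official nginx image in the background, mapping host port 8080
# to the container's port 80. The image and ports are stand-ins for
# whatever service you actually run at home.
docker run -d --name web -p 8080:80 nginx:stable
```

Everything the service needs ships inside the image, so nothing leaks into the host's package manager.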

[Figure: Docker Compose config]

Docker Compose is a tool that automates the management and configuration of Docker containers: all of the containers are defined in a central file and can be started with a single command. I use a mix of ready-made and custom-configured containers.
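
To give a rough idea of what that central file can look like, here is a minimal sketch of a docker-compose.yml. The service name, image, ports, and host paths are illustrative stand-ins rather than my actual configuration:

```
# docker-compose.yml -- a minimal sketch; the service, image, and host
# paths are illustrative, not my actual setup.
version: "2"

services:
  web:
    image: nginx:stable                    # any existing image from Docker Hub
    restart: unless-stopped                # come back up after reboots or crashes
    ports:
      - "8080:80"                          # host port : container port
    volumes:
      - ./web/conf:/etc/nginx/conf.d:ro    # config kept alongside this file
```

With a file like this in place, running docker-compose up -d starts (or recreates) every container it defines.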

Moving to immutable infrastructure at home has been refreshingly awesome. For example, I use Home Assistant, an open-source service for managing home automation devices (similar to purpose-built hubs like Samsung SmartThings and Wink Hub). It's under active development and is somewhat tricky to configure on Ubuntu because it uses cutting-edge versions of Python packages and libraries. With Docker, all of that custom configuration is encapsulated within the container, and, even better, the container is already provided by the Home Assistant developers. All I have to do is provide my own configuration file and tell Docker which ports to keep open. Better still, I can keep all of my software config files in one central location (under version control) and point the containers at them.
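
For Home Assistant itself, the relevant slice of my compose file looks roughly like the sketch below. The image name, port, and /config mount follow the Home Assistant Docker documentation as I understand it, while the host paths are placeholders for wherever the version-controlled config directory actually lives:

```
# Sketch of a Home Assistant service entry; host paths are placeholders
# for wherever the version-controlled config directory really is.
version: "2"

services:
  homeassistant:
    image: homeassistant/home-assistant:latest
    restart: unless-stopped
    ports:
      - "8123:8123"                         # Home Assistant's web UI port
    volumes:
      - ./config/home-assistant:/config     # my configuration.yaml lives here
      - /etc/localtime:/etc/localtime:ro    # keep the container clock in sync with the host
```

Because the configuration is just a directory on the host, it can sit in the same version-controlled location as the compose file itself.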

If a new version of the software becomes available, I can just pull the new image, and Docker will run any custom setup I may have specified, load my custom config, and bring up a clean-slate instance without any of the old crud.
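
In practice, an upgrade boils down to a couple of commands (assuming a Compose service named homeassistant, as in the sketch above):

```
# Fetch the newer image and recreate the container from it; the old
# container is discarded, while the mounted config directory survives.
docker-compose pull homeassistant
docker-compose up -d homeassistant
```

The old container is thrown away, the new one starts from the fresh image, and the mounted config directory carries over untouched.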

Final Thoughts . . .

I am very excited to be able to bring immutable infrastructure home. It's a great way to improve a home server's security and maintainability while cutting down the time it takes to keep it that way. And I've got no more excuses for not maintaining my home server.