
An Introduction to Docker Through Story

by Elliot Forbes, December 28th, 2017

My previous few story-based articles seem to have really resonated with readers and I’ve had some very good feedback from you all. As such, I thought I’d continue the series and start introducing more granular topics such as containerization and Docker.

We’ll again be following our senior developer Gemma as she works on improving the codebase of the online comic-book viewer/store. If you are interested in seeing how this story started then I recommend you check out:


An Introduction to Microservice Based Architecture Through Story — Part 1: “The drive towards Microservice based architecture these days seems to make sense for a lot of big, distributed…” (hackernoon.com)

If you wish to read up more on Docker then I highly recommend checking out the book Docker: Up & Running: Shipping Reliable Containers in Production.

The Issues

During the transition to a microservice-based approach, Gemma oversaw a couple of new faces joining her team. There’s always a learning curve when joining a new team, and a transition period during which overall effectiveness dips while the newcomers get everything set up on their work machines.

No newcomer should feel stranded, so Gemma paired some of her senior engineers with the newcomers in an attempt to get them up and running with the codebase as quickly as possible.

Dependency and Environment Hell

Originally there were 4 people in Gemma’s team. They worked as a tightly knit unit and were able to quickly fix any dependency or environment issues on each other’s machines without too much time wasted.

However, as the team expanded, the number of differing environments became an issue. One team member might be running CentOS, another macOS and a third Windows 10. With their comic-book viewer composed of 3 separate services, this meant they had to ensure every machine had the correct dependencies installed and environment variables set in order to run them.

One team member may have had Python 3.1 installed on their machine whilst the account service actually required asyncio in order to work. The asyncio module only became available in version 3.4 of the language, and thus the service would fail to start on their machine.
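As a rough illustration, the kind of code that creates this requirement might look like the sketch below (the handler and its names are hypothetical, not from the actual account service). On any interpreter older than 3.4 the import alone fails, and the async/await syntax shown here actually requires 3.5 or newer:

import asyncio  # raises ImportError on Python < 3.4, where asyncio does not exist

async def get_account(account_id):
    # stand-in for a real asynchronous database lookup
    await asyncio.sleep(0)
    return {"id": account_id}

loop = asyncio.get_event_loop()
print(loop.run_until_complete(get_account(42)))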

Now whilst this version mismatch may have been somewhat trivial to solve, it still represented time spent getting everyone onto the right version. As the codebase grew in complexity, so too did the environment configuration needed in order for it to run.

III. Config — 12 Factor Applications

If you have worked on making applications cloud-friendly then you may have encountered the twelve factors. Gemma and her team were trying to ensure that every service within their application adhered to all of these factors so that they could scale and deploy their services with little to no manual effort.

The third factor is that you store all config in the environment. In Python, this typically means code such as:

import os

# read the database password from an environment variable
db_pass = os.environ["DB_PASS"]

Code like this retrieves database passwords and other secrets from the environment. It does, however, make running the application on your local machine rather difficult: all of these variables have to be set before the program will run. You could do this with the likes of a PowerShell or bash script, but making that cross-platform compatible takes time and effort, and any changes become difficult to keep in sync.
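For example, the bash flavour of such a script might look like the sketch below (every variable besides DB_PASS is illustrative, not from the original setup), with a separate PowerShell equivalent having to be kept in lockstep:

#!/usr/bin/env bash
# set-dev-env.sh — export development config before starting the service
# (load it into the current shell with: source set-dev-env.sh)
export DB_PASS="dev-password"   # development database password
export DB_HOST="localhost"      # illustrative value
export LOG_LEVEL="debug"        # illustrative value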

The Issue

The combination of setting up the appropriate dependencies and environment variables for every newcomer was starting to become a burden, and there were times when tests were run against production databases because a developer had forgotten to set their environment variables back to development values.

With plans for further team expansion in the future this issue needed to be resolved.

The Solution — Docker

This is when Gemma discovered the wonderful world of containerization and Docker. With containerization, Gemma and her team were able to define everything their application needed to run within a Dockerfile and then build and run their newly containerized app with a couple of simple commands.

They started by converting their account service, which was written in Python, to use Docker, and came up with something that looked like this to start with:

# base the image on the exact Python version the service needs
FROM python:3.6.3

# set the working directory inside the container
WORKDIR /app

# copy the service's source code into the image
ADD . /app

# install all of the Python dependencies
RUN pip install -r requirements.txt

# the port the underlying Python application listens on
EXPOSE 80

# development database password (a default; see the note on overriding it below)
ENV DB_PASS dolphins

# start the service
CMD ["python", "app.py"]

They specified the precise Python version they wanted their app to run on, set the working directory of their application, and ran pip install -r requirements.txt to install all the necessary Python dependencies. Finally, they exposed port 80, the port the underlying Python application was running on.

You’ll see on the second-last line that they were also able to specify their DB_PASS environment variable, which would subsequently allow them to connect to their development database.
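It’s worth noting that a value baked in with ENV acts only as a default: it can be overridden when the container is started, using Docker’s -e flag, for example (the password here is a placeholder):

docker run -e DB_PASS=some_other_password py36-account-service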

Running Their App

The real beauty of defining this Dockerfile was that everyone in the team, after installing Docker on their local machine, would be able to build and start the application using the following 2 commands:

# build the image from the Dockerfile in the current directory
docker build -t py36-account-service .

# start a container from that image, mapping host port 9000 to container port 80
docker run --name "py36-account-service" -d -p 9000:80 py36-account-service

The first command would go away, download all of the requirements needed and build the Docker image, which could subsequently be run.

[Screenshot: Gemma running the build command locally]

After this point she was able to run the application by calling the second Docker command specified above.

[Screenshot: the account service running]
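At this point it’s easy to verify that everything is up. docker ps lists the running container, and, assuming the underlying application serves HTTP on the exposed port (the article doesn’t show its endpoints, so the URL below is illustrative), a quick curl against the mapped host port exercises it:

# confirm the py36-account-service container is running
docker ps

# hit the service through the host port mapped by -p 9000:80
curl http://localhost:9000/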

By migrating the app to this new format, anyone new to the team could simply be told to install Docker, if they hadn’t already, and run these two commands, and they’d have a working account-service on their machine. This would ultimately save a lot of time in the long run.

This worked across all of the developers’ operating systems with minimal fuss.

Dealing With Multiple Environments

So whilst the above structure allowed developers to get up and running with one environment of their application, it didn’t solve the problem of developers accidentally running application tests against a production environment.

To combat this, we can specify an --env-file for docker run to load. Within these environment files we can keep our dev, test and production credentials separately. We could then start our application against the development database like so:

docker run --env-file .dev my_app
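An env file here is just a list of KEY=value pairs, one per line. A .dev file for the account service might look something like this (DB_PASS matches the development password from the story above; the other entry is purely illustrative):

# .dev — development credentials
DB_PASS=dolphins
# illustrative entry, not from the original article
DB_HOST=dev-db.local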

Conclusion

Through the utilization of docker, Gemma and her team were able to minimize the barrier to entry for new developers coming into the team and greatly reduce the time taken to get someone up and running with a working development environment.

Docker-izing their application also made deployments simpler, as they were able to leverage AWS services such as the Elastic Container Service (ECS) and not have to worry so much about the underlying operating systems. This ultimately saved them time and effort, and the costs were the same as if they were running it on a normal EC2 instance.

Hopefully you found this story both entertaining and enlightening! I’m quite enjoying this new style of writing, and I hope it makes a change from the more verbose and dry style that most technical posts fall into.

I’m hoping this helped to highlight the benefits of Docker and how you can utilize it in everyday applications to improve developer workflow. It shouldn’t just be used for the likes of microservices; it provides a lot of value even for larger monolithic applications.

If you found this useful then please let me know in the comments section or by tweeting me @elliot_f. I’d be more than happy to hear any feedback!