How we happily dockerized our development environment (part 1/2)

Written by aherve | Published 2016/03/10
Tech Story Tags: docker | devops | startup


In this post I'll share some of the tips we use at Hunteed to boost our productivity thanks to a fully dockerized development environment.

This is part 1 of 2, where we describe a basic configuration for a dockerized development environment. In part 2 we'll explain how to put this configuration on steroids, so that you never have to wait for an npm install or gem install again.

1. Why should I change anything about my workflow?

With a properly set up docker configuration, we now enjoy the following features:

  • One-click install of a work-ready environment. From scratch. Yep.

  • No local installation of specific versions/dependencies. As a freelancer you might work on several projects, each of them using different versions of node/mongo/elasticsearch, etc. That should not be an issue at all.
  • If it works at home, it works in prod. CI and production servers should work EXACTLY the same as my dev environment.
  • No knowledge of the infrastructure is needed to work on the project. Front-end devs, designers and integrators can have only partial knowledge of the infrastructure. We want an integrator to be able to work on our project without having to know how to install a mongodb server. In fact, he/she should not even have to care whether there is a database at all.

In addition to these features, we want what has now become a standard in any modern coding configuration:

  • Continuous testing: changing the code should trigger a sequence of tests whose results are directly visible to the developer
  • Hot reloading: a change in your code should appear quickly in your browser, without needing to reboot everything manually
  • Continuous integration is plugged in, and every commit fires a complete test suite on the CI server.

Yep. So we will basically get rid of:

  • Welcome to Hunteed, let's spend 3 days together with the CTO hacking your computer until you can actually work on the project
  • I don't get it, it works on my machine!
  • How do I change my installed version of mongodb/mysql/elastic-search/node/phantom/ruby/whatever so it is compliant with your stack?
  • npm install / gem install / whatever install didn't work on my machine. What are the required dependencies again?
  • I don't get it, it works on my machine! (again!?)
  • I don't get it, it does NOT work on my machine! (yeah, that too)

All right, let's slaughter the monster!

2. Make it work

Here you can find a tiny demo project:

git clone https://github.com/aherve/simple-dockerized-dev-env

Let's have a look at what you can do with it before we go into the details:

  • Build the entire environment:

docker-compose build

This will download the proper images so you get an out-of-the-box install of node and mongodb. It will also run an npm install for you.

Note also that if you run this command again, docker will use its cache, so the npm install and image downloads are not performed twice.

If you change the package.json file, then the npm install command will be run again. Exactly what we need!

  • One-time testing:

docker-compose run test

This will run the tests. If you launched this command for the first time, did you notice how it installed everything for you?

Note that a db_1 instance is also launched: your test suite is actually connected to a mongodb instance, although we don't use it here.

  • Launch the server/script

docker-compose run hello
> hello is loaded !

  • Continuous testing

docker-compose run autotest

Unlike the simple test command, this one will start a watcher that tracks changes in the code. Try changing the /src/spec.js file and watch the test fail instantly!

Note for Mac users: docker runs inside a virtual machine on OS X, so file changes are not detected out of the box. Fortunately, the dinghy project takes care of this problem for you. Simply install dinghy and use it as your docker virtual machine; file watching will then work out of the box.
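For reference, the watcher itself is typically just an npm script. Here is a minimal sketch of what the test scripts in package.json could look like, assuming mocha is the test runner (the actual demo repo may use a different watcher):

{
  "scripts": {
    "test": "mocha src/spec.js",
    "autotest": "mocha --watch src/spec.js"
  }
}

The autotest service then simply runs npm run autotest as its default command.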

  • Run any arbitrary command

docker-compose run hello npm list
docker-compose run test env | grep NODE_ENV

docker-compose run hello <something> will simply override the default command instruction of our docker-compose file.

3. How does it work?

All the docker-related mechanics are located in two files:

  1. Dockerfile:

This file describes the instructions for building a container that will be able to run our code.

In this case, it consists of a pre-installed node image (alpine-node), on which we create a working directory (/app) where we put our package.json file. Then we run npm install so that the javascript libraries are available from within the container. Rubyists can compare npm install to a gem install command, and the package.json file to a Gemfile.

Note that our source code is not (yet) included in the container.
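As a sketch, a Dockerfile following this description could look like the following (the exact base image tag is an assumption):

# Start from a pre-installed node image (the exact tag is an assumption)
FROM mhart/alpine-node:5

# Create the working directory
WORKDIR /app

# Add only the dependency manifest, then install.
# Docker caches this layer as long as package.json is unchanged.
ADD package.json /app/package.json
RUN npm install

# The source code itself is NOT added here:
# it will be mounted as a volume by docker-compose.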

  2. docker-compose:

This file defines and orchestrates our containers.

In this example we use our main Dockerfile 3 times to build very similar services: hello, autotest and test. You'll notice that these actually run the same image, but with different environment variables and different default commands.

The volumes instruction mounts our local directory into the container. More importantly, any change in your local src directory has an instant impact on the running container. As a matter of fact, they are literally the same files.

Unlike what is located in /src, the package.json has been added by an ADD instruction before running the install command. This means you will have to rebuild the container if the package.json changes.

Finally, the last service, db, is a running mongodb image. This image is downloaded directly from docker-hub and starts a working db server out of the box. The other containers can access the db thanks to the links instruction.
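Putting it all together, a minimal docker-compose.yml along these lines could look like this sketch (service names match the article; the exact commands and paths are assumptions):

# Three services built from the same Dockerfile, plus a mongodb service
hello:
  build: .
  command: node src/hello.js      # default command, overridable at run time
  volumes:
    - ./src:/app/src              # local changes are instantly visible inside
  links:
    - db                          # mongodb is reachable at the hostname "db"

test:
  build: .
  command: npm test
  environment:
    - NODE_ENV=test
  volumes:
    - ./src:/app/src
  links:
    - db

autotest:
  build: .
  command: npm run autotest       # watch mode, see the continuous testing section
  environment:
    - NODE_ENV=test
  volumes:
    - ./src:/app/src
  links:
    - db

db:
  image: mongo                    # official image from docker-hub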

4. Bonus: CI servers also know about docker!

Since you can install the environment regardless of your setup… then so can the CI server \o/

At Hunteed we use the docker-based infrastructure of codeship for both continuous integration & deployment.

Good news: they can read a docker-compose file just as well as you can.

Simply add a codeship-steps.yml file where you tell codeship to run your tests:
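A minimal sketch of such a file could be (the step name and service are assumptions matching the compose sketch above):

# codeship-steps.yml: run the test service as a CI step
- name: test
  service: test
  command: npm test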

And codeship will run your tests with the EXACT configuration you had. How awesome is that?

5. Conclusion

So far we have a working environment that allows any contributor to onboard in one click, regardless of his/her personal computer configuration.

The only drawback of this particular setup is that docker-compose build will run a fresh npm install every time you change the package.json file. This can take a lot of time, as all the libraries are re-installed from scratch. In part 2 we will improve our configuration to significantly speed up this step.

Happy coding !

Go to part 2/2 to see how to put your workflow on steroids!

Further reading: Dockerized dev environment for building a node-express api using typescript
