In this post I'll show some tips we used at Hunteed to boost our productivity thanks to a fully dockerized development environment.

This is part 1/2, where we describe a basic configuration for a dockerized development environment. In part 2 we'll explain how to put this configuration on steroids so that you never wait for an npm install/gem install anymore.

1. Why should I change anything in my workflow ?

With a properly set up docker configuration, we now enjoy the following features:

- One-click install of a work-ready environment. From scratch. Yep.
- No local installation of specific versions/dependencies. As a freelancer you might work on several projects, each of them using different versions of node/mongo/elastic-search… etc. That should not be an issue at all.
- CI and production servers work EXACTLY the same as my dev environment. If it works at home, it works in prod.
- Front-devs, designers and integrators can work with only partial knowledge of the infrastructure. We want an integrator to be able to work on our project without having to know how to install a mongodb server. Well, he/she should not even have to care whether there is a database at all. No knowledge of the infrastructure is needed to work on the project.

In addition to these features, we want what has now become standard in any modern coding setup:

- Continuous testing: changing the code should trigger a sequence of tests whose results are directly visible to the developer.
- Hot reloading: a change in your code should appear quickly in your browser, without needing to reboot everything manually.
- Continuous integration: every commit fires a complete test suite on the CI server. Yep.

So we will basically get rid of:

- "Welcome to Hunteed, let's spend 3 days together with the CTO hacking your computer until you can actually work on the project"
- "I don't get it, it works on my machine !"
- "How do I change my installed version of mongodb/mysql/elastic-search/node/phantom/ruby/whatever so it is compliant with your stack ?"
- "Npm install / gem install / whatever install didn't work on my machine. What are the required dependencies again ?"
- "I don't get it, it works on my machine ! (again !?)"
- "I don't get it, it does NOT work on my machine ! (yeah, that too)"

All right, let's slaughter the monster !

2. Make it work

You can find a tiny demo project here:

git clone https://github.com/aherve/simple-dockerized-dev-env

Let's have a look at what you can do with it before we go into the details.

Build the entire environment:

docker-compose build

This will download the proper images so you get an out-of-the-box install of node and mongodb. It will also run an npm install for you. Note that if you run this command again, docker will use its cache, so the npm install and image downloads are not performed twice. If you change the package.json file, then npm install will be run again. Exactly what we need !

One-time testing:

docker-compose run test

This will run the tests. If you launched this command for the first time, did you notice how it installed everything for you ? Note that a db_1 instance is also launched: your test suite is actually connected to a mongodb, although we don't actually use it here.

Launch a server/script:

docker-compose run hello
> hello is loaded !

Continuous testing:

docker-compose run autotest

Unlike the simple test command, this one starts a watcher that tracks changes in the code. Try changing the /src/spec.js file and watch the test fail instantly !

Note for mac-users: docker runs in a virtual environment on osX, therefore file changes are not detected out of the box. Fortunately, these guys took care of this problem for you: simply install dinghy and use it as your docker virtual machine. It will then work out of the box.

Run any arbitrary command:

docker-compose run hello npm list
docker-compose run test env | grep NODE_ENV

run hello <something> will simply override the default command instruction of our docker-compose file.

3. How does it work ?

All the docker-related mechanics are located in two files:

1. Dockerfile: this file describes the instructions for building a container that is able to run our code. In this case, it consists of a pre-installation of node (alpine-node), on which we create a working directory (/app) where we put our package.json file. Then we run npm install so that the javascript libraries are available from within the container. Rubyists can compare npm install to a gem install command, and the package.json file to a Gemfile. Note that our source code is not (yet) included in the container.

2. docker-compose: this file defines and orchestrates the containers. In this example we use our main Dockerfile 3 times to build very similar images: hello, autotest and test. You'll notice that these are actually the same containers, but we pass them different environment variables as well as different default commands.

The volumes instruction mounts our local src directory into the container. More importantly, any change in your local directory has an instant effect on the running container: as a matter of fact, they are literally the same files.

Unlike what is located in /src, the package.json has been added by an ADD command before running npm install. This means you will have to rebuild the container if the package.json were to change.

Finally, the last service, db, is a running mongodb image. This image is directly downloaded from docker-hub and starts a working db server out of the box. The other containers can access the db thanks to the links instruction.
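To make this more concrete, here is a minimal sketch of the kind of Dockerfile described above. It is not copied from the demo repository: the base image tag and paths are assumptions, so refer to the repo for the real file.

```dockerfile
# Minimal sketch of the Dockerfile described above (not the actual file from
# the demo repo; the base image tag is an assumption).
FROM mhart/alpine-node:6

# Create the working directory and add package.json only, so that the
# npm install layer stays cached as long as package.json does not change.
WORKDIR /app
ADD package.json /app/package.json
RUN npm install

# Note: the source code is deliberately NOT added here; docker-compose will
# mount it as a volume at run time.
```

The important design choice is the order of the instructions: because package.json is added before npm install, docker can reuse the cached install layer for every build that does not touch the dependencies.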
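Similarly, here is a rough sketch of what the docker-compose file could look like. The service names match the ones used in this post, but the exact commands, paths and environment variables are assumptions; the demo repository contains the real configuration.

```yaml
# Rough sketch of a docker-compose file matching the description above
# (commands, mount paths and env vars are assumptions).
hello:
  build: .
  command: node src/hello.js
  volumes:
    - ./src:/app/src
  links:
    - db
  environment:
    - NODE_ENV=development

test:
  build: .
  command: npm test
  volumes:
    - ./src:/app/src
  links:
    - db
  environment:
    - NODE_ENV=test

autotest:
  build: .
  command: npm run autotest
  volumes:
    - ./src:/app/src
  links:
    - db
  environment:
    - NODE_ENV=test

# The db service is pulled directly from docker-hub; the other services can
# reach it through the "db" link.
db:
  image: mongo
```

The three node services are built from the same Dockerfile; only the default command and the environment differ, exactly as described above.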
4. Bonus : CI servers also know about docker !

Since you can install the environment regardless of your setup… then so can the CI server \o/

At Hunteed we use the docker-based infrastructure of codeship for both continuous integration & deployment.

Good news: they can read a docker-compose file as well as you can. Simply add a codeship-steps.yml file where you tell codeship how to run your tests (a rough sketch of such a file is given at the very end of this post), and codeship will run your tests with the EXACT configuration you had. How awesome is that ?

5. Conclusion

So far we have a working environment that allows any contributor to get on board in one click, regardless of his/her personal computer configuration.

The only drawback of this particular setup is that docker-compose build will run a fresh npm install every time you change the package.json file. This can take a lot of time, as all the libraries are re-installed from scratch. In part 2 we will improve our configuration to significantly speed up this step.

Happy coding !

Go to part 2/2 to see how to put your workflow on steroids !

Further reading: Dockerized dev environment for building a node-express api using typescript
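As promised in section 4, here is a rough sketch of what a codeship-steps.yml could look like. The step and service names are assumptions, and the exact syntax depends on your codeship plan, so double-check their documentation before copying this.

```yaml
# Hypothetical codeship-steps.yml: run the test suite inside the "test"
# service defined by the docker-compose configuration.
- name: tests
  service: test
  command: npm test
```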