This is the introduction to a series that was originally published on tech.osteel.me. Only the introduction was brought to Hacker Noon — links to other parts will take you to that other website.
The first version of Docker was released in 2013 and, since then, it has worked its way up to eventually becoming the industry standard for containers. Among developers, exposure to Docker ranges from having vaguely heard of the technology to using it on a daily basis, the latter category singing its praises while the former is sometimes still struggling with the sheer concept of containers.
Wherever you are on your journey, as a developer there are many reasons why you might want to delve into this technology, including, but not limited to:
"I've read quite a bit about Docker and I am now looking for something more hands-on";
"I use a Vagrant-based solution like Homestead and it starts to feel like it's getting in the way";
"I already use a pre-configured Docker environment like Laradock and I want a deeper understanding of how things work under the hood";
"I want greater control over my development environment";
"I want to better understand application architecture";
"The project I am working on uses microservices";
"I am supposed to migrate a Wordpress website and I need a distraction".
Whatever your motive is, you might notice the emergence of a theme, revolving around understanding and being in control: in my opinion, Docker is about empowering the developer, from providing a safe environment to try out technologies easily, without messing up your local environment (yes, even Wordpress), to opening the gates to the world of Ops.
The aim of this tutorial series is to demystify a technology that can feel daunting at times, all the while providing most of the tools a developer might need to build a web project locally with Docker. It strictly focuses on the development environment and does not cover deploying Docker applications whatsoever, even though the end result can be used as part of a deployment process.
No prior knowledge of Docker is required, but being comfortable around a terminal will help.
We will start our journey with a very basic LEMP stack and gradually increase the environment's complexity as we work our way through the different parts of the series, covering more and more use cases along the way. While the focus is mostly put on PHP on the server side, the overall architecture can be adapted for any language.
You might find yourself stopping at a certain point because all of your needs have been addressed already, only to come back later when you need more. Wherever that is, each part of the series comes with its own branch from this repository, which you can fork or download and use as a starting point for your own projects.
That's it for the elevator pitch. Convinced already? Great, let's move on to the first part (or head over here if you intend to run Docker on Windows). If not, let's try to address some of the doubts you might still have, starting with a little background check.
I am a backend developer with more than a decade of experience, having worked with various types of companies in three different countries. I witnessed the evolution of the development environment landscape over time, from the early days of using WAMP and Notepad++ to migrating to full-fledged IDEs and custom virtual machines, before moving on to Vagrant and eventually, Homestead.
Back in 2015, I explored the possibility of using Docker locally as a development environment and wrote about it in a couple of articles that were picked up by the official Docker newsletter, but eventually dropped the idea, mainly because of performance issues.
At the beginning of 2018, however, I was forced to revisit that thought because of Vagrant's shortcomings vis-à-vis the growing complexity of the project I was working on at the time (more on that in the next section), and realised progress had been made both in terms of performance and adoption.
After a few days of tinkering and successive iterations, I ended up with a decent development environment running on Docker, capable of meeting the application's new scope and greatly simplifying the onboarding of new developers.
Since then, I successfully implemented and improved upon a similar setup in another company.
While the approach taken in this series is certainly not the only one and is most likely perfectible, it has proven to be reliable and a significant improvement compared to previous Vagrant-based setups.
Homestead is great. I used Laravel's pre-packaged Vagrant virtual machine (VM) for years, both for personal and clients' projects. Homestead features pretty much everything you need for a PHP application, and allows you to spin up new websites very quickly. So why did I make the switch?
The short answer is microservices.
Back at the beginning of 2018, I was tasked with exposing an API from a legacy monolith to serve a new SPA, and with progressively phasing out the monolith by extracting all of its business logic into microservices.
All of a sudden, I had to manage some legacy code running on PHP 5.x on the one hand, and some microservices running on PHP 7.x on the other. I initially got both versions of the language running on the same VM but it involved some dirty workarounds that made the overall user experience terrible. Besides, I would eventually end up with multiple microservices with different stacks, and managing them all on the same VM wasn't a realistic long-term solution.
I briefly tried to give each microservice its own Vagrant box, but running everything together was far too heavy for my machine and managing things like intra-VM communication felt very cumbersome.
I needed something else, and that something else was Docker. But how does it help the situation?
One of the promises of Docker is to provide isolated environments (containers) running on a single virtual machine that boots in about five seconds. In my case, that meant replacing all of my heavy, Vagrant-based virtual machines with a single, super-fast virtual machine running Docker, and running all of my microservices on top of it, each in its own isolated container.
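To make that concrete, here is a minimal Docker Compose sketch of the situation described above — the service names, image tags and port mappings are purely illustrative, but the images are the official PHP images from Docker Hub:

```yaml
# docker-compose.yml — hypothetical sketch: a PHP 5.x legacy monolith and a
# PHP 7.x microservice running side by side, each in its own container,
# on the same machine
version: "3"

services:
  legacy:
    image: php:5.6-apache        # official PHP 5.x image
    ports:
      - "8080:80"                # legacy monolith on localhost:8080

  microservice:
    image: php:7.4-apache        # official PHP 7.x image
    ports:
      - "8081:80"                # new microservice on localhost:8081
```

A single `docker-compose up` then starts both containers — two PHP versions coexisting without dirty workarounds or a second virtual machine.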
Using Docker's very own logo in an attempt to illustrate this, it would be the equivalent of having a single whale per container instead of the same whale carrying all of the containers. If you replace "whale" with "virtual machine" and "container" with "microservice" in the previous sentence, you should get the idea.
Imagine if every single container in the logo needed its own whale: beyond the ridiculous amount of plankton that would require, would it look efficient?
Overly simplistic as that explanation is, it is what made it click for me: as a developer, this use case made perfect sense because it related to the work I do every day, far more than trying to understand how virtualization or shared operating systems work.
The answer is yes and no. The more complex the application and the more moving parts it is composed of, the more likely you are to need a solution like Docker. But even if your application is rather simple, starting and stopping a few containers is much faster than booting and halting a virtual machine, and should your application evolve in unexpected ways (as applications always do), adopting Docker from the get-go gives you the confidence that your setup is future-proof.
Besides, instead of using a pre-packaged virtual machine like Homestead, featuring way more tools than any given application has any use for, using Docker the way I suggest ensures that you only ever install what you actually need, in a proactive way. You regain control of your environment.
There are quite a few Docker-based development environments out there, and it seems that every few weeks a new one pops up. Laradock was arguably the first to get some traction, and is to Docker what Homestead is to Vagrant: a pre-packaged environment for PHP applications. While I don't personally use it, I have heard a lot of good things and it might just be enough for your needs. Their motto is Use Docker First - Then Learn About It Later, which is an excellent approach in my opinion.
There's also Takeout, DDEV, and I'm sure plenty of others that are very good at what they do. The purpose of this series, however, is to give you a deeper understanding of Docker so you can build your own environment, one that perfectly matches any given project's requirements. If all you want for the moment is to use a Docker-based solution without the hassle of setting it up yourself, by all means give one of the aforementioned projects a try, and feel free to come back whenever you want to create your own, tailored setup.
All in all, I see this explosion of Docker-based environments as a good sign – a confirmation that this is the right tool for the job, one that will be embraced more and more in the future.
Some predict that serverless will eventually make containers obsolete; others argue that the comparison is moot because the two technologies serve completely different purposes.
I don't know what the future is made of. When it comes to technology, I like to consider myself a pragmatist: being a developer requires an awful lot of effort to stay up to date, and while I would love to explore the many technologies that arise pretty much on a daily basis, I simply don't have the time to do so. While the serverless movement has certainly been gaining momentum, it is also still very much in its infancy, and I would like to see it mature further before I consider taking the plunge.
If serverless somehow ends up "eating the stack" and that is where all the jobs go, fine, I will make the switch. But I don't see such a market shift happening for another few years (if ever), and for the time being I'd rather focus on what is likely to make my CV attractive today.
Companies are already slow to migrate to containers despite high interest, and the industry only recently rallied behind Kubernetes, which is poised to become the standard for container orchestration. Serverless represents an even greater paradigm shift that is nowhere near such consolidation: thinking it will flip the market at the snap of a finger is foolish in my opinion.
That doesn't mean I won't use a Lambda function where it seems relevant, or that serverless shouldn't be on your watchlist. On that note, if you wish to explore the subject further, I found this article by ZDNet to be informative.
It used to be terrible. Now it's just not great.
This is the one real trade-off to using Docker Desktop as a development environment for the entire stack, in my opinion (Docker on Linux is fine). It is mostly felt on macOS, where the underlying osxfs filesystem offers only average performance (and the recently introduced gRPC FUSE doesn't seem to make a big difference). There are ways to make it better, however, which are mentioned throughout this series.
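One such mitigation, as a rough sketch — assuming your code lives in a local `./src` directory that is bind-mounted into the container — is to relax the mount's consistency guarantees with the `cached` (or `delegated`) flag that Docker Desktop for Mac accepts on volume definitions:

```yaml
# docker-compose.yml excerpt — hypothetical paths; the :cached flag tells
# Docker Desktop for Mac that the host's view of the files is authoritative,
# trading strict consistency for noticeably faster reads inside the container
services:
  app:
    image: php:7.4-fpm
    volumes:
      - ./src:/var/www/html:cached
```

On Linux these flags are accepted but have no effect, so the same file can be shared across platforms safely.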
You're unlikely to get the same kind of speed as Homestead or something like Valet, but it will be acceptable. And I trust it will only get better over time: Windows now has Windows Subsystem for Linux version 2 (WSL 2), and there are ongoing discussions around performance on macOS.
As previously mentioned, with the unveiling of WSL 2 it seems that running Docker inside Windows is now perfectly fine. But if you can't use it for some reason, running Docker on Microsoft's OS can be a bit of a pain. The good news is that most of what is covered in this series has been successfully tested on Windows 10 at some point, even though it does require a few tweaks here and there.
To help you get started, I came up with a special appendix which I recommend you read after this introduction.
A caveat: I work on a Mac and don't have regular access to a Windows machine, so if any Microsoft-related trickery seems to be at play I'm afraid I won't be of much assistance (this is also valid for Linux distributions, although from experience these don't tend to cause any trouble).
Finally, some developers simply dismiss Docker as not being their problem. This is a fair argument, as Docker sits somewhere beyond the realm of pure application development. If anything, a sysadmin or DevOps engineer should set it up for you, right?
Well, yes, you are entitled to feel that way. But if DevOps can refer to individuals with an interest in both system administration and coding, effectively acting as bridges between the two, it can also be interpreted as people from both sides meeting halfway.
In all fairness, you don't need to know Docker to be a good developer. My point is simply that by ignoring it, you are missing out on an opportunity to get better. It won't improve your syntax or teach you new design patterns, but it will help you to understand how the code you are writing fits into the bigger picture. It is a bit like playing an instrument without learning to read sheet music: you might sound good, but you are unlikely to write a symphony. Taking a step back to reflect on the application's architecture and understand how the different pieces fit together will give you invaluable perspective that will influence the technical choices you make for your application.
Whatever your specialty – backend, frontend or fullstack – and to whatever extent you feel concerned with the DevOps movement, I promise that learning about the cogs that make your applications tick is worth your time, and that you will be a better developer for it.
In the end, you may not fully embrace the final result of this series. Be it for performance reasons as mentioned earlier, or because you feel some aspects of the setup are too complicated or just don't work for you, this development environment may not entirely satisfy you in its suggested form.
But that is not the true purpose of this series. While you can definitely take my approach at face value, my true objective is to give you enough confidence to integrate Docker into your development workflow in any way you see fit – to provide enough practical knowledge to adapt it to your own use cases.
And I can guarantee that by the end of this series, you will think of ways to use Docker for your own benefit.
This story was originally published on tech.osteel.me.