When you first look into Docker, what it does, and how it works, it appears to be a neat tool to help with application packaging and deployment. It's not until you start using it, however, that some of the other benefits that developers love so much show themselves. So, to discover why this tool has become so popular, here are Ten Reasons Why Developers Love Docker.
The traditional deployment methodology involves pushing an application artifact onto a server and then running it. This also includes making sure to set up whatever libraries and file structure the application requires.
Docker flips this on its head. A Docker image should include everything required to run an application: the application server, dependencies, and file structure, as well as the application itself. This empowers developers to be responsible for more than just development, which makes sense as it's their code after all.
Docker images are built once and run anywhere. Or more specifically, anywhere that you can run Docker.
This makes deployment more predictable, as the same image gets deployed to your production and test environments. This takes any surprises out of the equation, as we can be sure that the application and its dependencies will be deployed in the same way.
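To make that concrete, here's a minimal sketch of the same immutable image being run in two environments (the registry hostname, image name, tag, and environment variable are all hypothetical):

# The exact same image artifact runs everywhere; only configuration differs
docker run -e APP_ENV=test registry.example.com/myapp:1.4.2
docker run -e APP_ENV=production registry.example.com/myapp:1.4.2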
When someone publishes a Docker image, not only is it available for anyone with access to run, but it can also be extended to add more functionality. Developers love this because it promotes reuse and reduces duplication, key principles they apply when writing code.
As an example, let's say we wanted to use the popular lightweight Linux distribution Alpine Linux to run the curl command to fetch a specific URL. By default, Alpine Linux doesn't include curl, but we can easily extend the image to add it in, like so:
FROM alpine
# --no-cache fetches the package index on the fly, so no separate apk update is needed
RUN apk add --no-cache curl
If we built and ran this image, we'd get all the functionality of Alpine Linux with the addition of curl.
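For instance, we could build and run it like this (the image tag and URL are just examples):

# Build the image from the Dockerfile in the current directory
docker build -t alpine-curl .
# Run the container, using curl to fetch an example URL
docker run --rm alpine-curl curl -s https://example.com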
If you think of an open-source application that you want to run in Docker, the likelihood is that a Docker image for it is already available to download. This would normally be from Docker Hub, one of the most popular central registries of Docker images.
This makes developers' lives easier because they can quickly get everything they need to run an application started in their local development environment.
Take the popular database technology Postgres, for example. All you have to do is run "docker run postgres" and you have an instance of Postgres available, ready for your application to connect to.
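In practice you'll usually want to publish the port so your application can reach it, and recent versions of the official image also require a superuser password to be set. A minimal sketch (the password value is just an example):

# Run Postgres in the background, published on the default port
docker run -d -p 5432:5432 -e POSTGRES_PASSWORD=secret postgres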
It might sound obvious, but one of the huge advantages of Docker for developers is the containerized nature of the applications themselves. Broadly, this means that each container is isolated: it ships with its own dependencies and file system, it doesn't pollute the host machine, and it can be stopped and removed without leaving anything behind.
All of these points add up to the feeling that when you're using Docker, the containers are very much self-contained and throwaway. If you want a brand-new development environment, just stop the containers and start again.
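For example, resetting your whole local environment can be as simple as the following (the container and image names are hypothetical):

# Throw away the current environment...
docker stop my-app my-db
docker rm my-app my-db
# ...and start fresh from the same images
docker run -d --name my-db postgres
docker run -d --name my-app myapp:latest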
A Docker Registry is a service used for distributing Docker images. It's kind of like what Maven Central is to Maven artifacts. One of the best-known Docker registries is Docker Hub, where you can find all sorts of publicly available images. Alternatives include Amazon ECR, Google Container Registry, and JFrog Artifactory.
Of course, all of these options allow you to store private images that you only want authorized users to have access to. This means that developers can have their application Docker images built once, published to a registry, then deployed to whatever environments are required.
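A typical publish flow looks something like this (the registry hostname and image coordinates are hypothetical):

# Tag the locally built image for your registry
docker tag myapp:1.0.0 registry.example.com/team/myapp:1.0.0
# Authenticate, then push the image so any environment can pull it
docker login registry.example.com
docker push registry.example.com/team/myapp:1.0.0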
We've already touched on the fact that Docker makes packaging applications easier, but this has several other implications for developers.
One of the main ones is that if there's a problem with the production application, they can have the same application running in their development environment in seconds, by pulling the relevant versioned image from whatever Docker Registry it's been published to. Obviously the Docker container they're running also includes all the same dependencies and file structure as production, meaning less chance of those pesky production-only bugs.
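Reproducing a production issue locally might look like this (the image name, version, and port are hypothetical):

# Pull the exact image version that's running in production
docker pull registry.example.com/team/myapp:1.2.3
# Run it locally to reproduce the issue
docker run -p 8080:8080 registry.example.com/team/myapp:1.2.3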
On the flip side, this also means that when the application gets tested during its path to production, it's less likely that we'll have a bug in production that didn't manifest in the test environment.
Docker Compose is a tool distributed with Docker that allows you to start whatever applications you need using a simple YAML specification.
You define the different applications that should be started up, along with other configuration such as networking. So, if an application you're developing depends on a database, or various other applications, getting it all running is a one-step process.
Here's an example Docker Compose file that defines an instance of Prometheus and Grafana, the popular monitoring and graphing applications:
version: "3"
services:
  prometheus:
    image: prom/prometheus:latest
    ports:
      - 9090:9090
  grafana:
    image: grafana/grafana
    ports:
      - 3000:3000
    depends_on:
      - prometheus
To start these applications, you would just run "docker-compose up", and you'd have an instance of Prometheus running on http://localhost:9090 and Grafana on http://localhost:3000. All of this can be committed into your version control system, making the developer's environment a lot more deterministic and predictable.
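It's just as easy to run everything in the background and tear it all down when you're finished:

# Run the whole stack in the background instead of the foreground
docker-compose up -d
# Stop and remove the containers (and their network) when you're done
docker-compose down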
One of developers' main pain points is the length of time it takes from writing code to delivering it into production. This can sometimes be down to slow, cumbersome continuous integration (CI) processes that take ages to run tests.
Docker can help here, as a long-running test suite can be split up and run across several Docker containers. CI tools such as Jenkins make this easy, as they can be configured to spawn jobs on popular Docker orchestration frameworks such as Kubernetes and Amazon ECS.
As an example, if your testing process includes independent stages, such as unit, integration, and browser tests, you could run them all in parallel in Docker, as sketched below.
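Outside of any particular CI tool, the idea can be sketched with plain Docker commands (the image name and test scripts are hypothetical):

# Start each test stage in its own container, all in parallel
docker run -d --name unit-tests myapp-tests ./run-unit-tests.sh
docker run -d --name integration-tests myapp-tests ./run-integration-tests.sh
docker run -d --name browser-tests myapp-tests ./run-browser-tests.sh
# Block until all of them finish, printing each container's exit code
docker wait unit-tests integration-tests browser-tests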
Once we move to a production environment, high-availability and scalability become big concerns. It's all well and good running a single Docker container for our application in a test environment, but your customers probably won't appreciate it in production.
Fortunately, there are several Docker container orchestration frameworks available that take care of all the heavy lifting when it comes to scaling, deploying, and managing your application, including Kubernetes, Docker Swarm, and Amazon ECS.
Features like auto-scaling, rolling deployments, and rollbacks mean that developers can sleep a little better at night, knowing that if problems do happen, the framework can take care of things to some extent.
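With Kubernetes, for example, these operations are exposed as simple commands (the deployment name myapp is hypothetical):

# Watch a rolling deployment progress
kubectl rollout status deployment/myapp
# Something's wrong? Roll back to the previous version
kubectl rollout undo deployment/myapp
# Let the cluster scale the app automatically based on CPU usage
kubectl autoscale deployment myapp --min=2 --max=10 --cpu-percent=80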
Hopefully you can now understand a developer's point of view a little better when they tell you how much they love Docker.
It's no surprise, then, that in a recent survey of its customers by Datadog, approximately 25% were already using Docker. With popular cloud providers such as Amazon and Google offering ever more seamless integration with Docker, we can expect the uptake to rise higher still.