
Why We Love Docker and Best Practices for DevOps

by Pavan Belagatti, October 14th, 2019

Too Long; Didn't Read

Docker makes it easy for developers to develop and deploy apps inside neatly packaged, containerized environments. PayPal has converted over 700 applications into container-based applications and runs 150,000 containers, boosting dev productivity by 50%. MetLife cut VM costs by 70% by managing more applications with fewer operating systems. Docker is a form of virtualization, but unlike virtual machines, resources are shared directly with the host. This allows you to run many Docker containers where you may only be able to run a few virtual machines.


Every company is becoming a software company these days, and there is so much happening around making software development occur at record speeds.

In today's cloud market, new DevOps tools and methodologies emerge every day. There are so many options to choose from that competition has peaked, which in turn puts pressure on software firms to constantly deliver products and services better than their competitors'.

As the cloud approach gains popularity, many firms are starting to embrace cloud practices and concepts like containerization, meaning DevOps tools like Docker are in high demand. In this article, we are going to look at some facts about Docker that are useful for developers and architects.

Virtual machines and the evolution of Docker:

Long ago, before the introduction of Docker and containers, big firms would go and buy many servers to make sure their services and business didn’t go down. This process usually meant that firms bought more servers than needed, which was extremely expensive. But they needed to do this because, as more and more users hit their servers, they wanted to make sure they could scale well without any downtime or outage.

Then we had VMware and IBM (there is still a debate on who introduced it first) bringing in virtualization, which allowed us to run multiple operating systems on the same host. This was a game-changer, but running multiple kernels and full operating systems was still very expensive.

Fast forward to modern-day containerization, and we have a company called Docker that solves a lot of these problems.

Why do developers like Docker?

Docker makes it easy for developers to develop and deploy apps inside neatly packaged virtual containerized environments. This means apps run the same no matter where they are and what machine they are running on.

Docker containers can be deployed to just about any machine without any compatibility issues, so your software stays system agnostic, making software simpler to use, less work to develop, and easy to maintain and deploy. Simply put, the days of ‘It is working on my machine’ are long gone.

A developer will usually start by accessing Docker Hub, an online cloud repository of Docker images, and pull one containing a pre-configured environment for their specific programming language, such as Ruby or Node.js, with all of the files and frameworks needed to get started. Docker is one such tool that genuinely lives up to its promise of Build, Ship, and Run.
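As a sketch of that workflow (assuming Docker is installed; the Node.js image and tag below are examples, and any language stack on Docker Hub works the same way):

```shell
# Pull a pre-configured Node.js environment from Docker Hub
docker pull node:18-alpine

# Run a throwaway container from that image and check the runtime inside it
docker run --rm node:18-alpine node --version

# Or start an interactive shell inside the same environment
docker run --rm -it node:18-alpine sh
```

The `--rm` flag removes the container when it exits, which keeps these exploratory runs from piling up on disk.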

Worldwide and across the industry, many companies and institutions are using Docker to speed up their development activities. PayPal has over 700 applications now, and they have converted them all into container-based applications. They run 150,000 containers, and this has helped them boost their dev productivity by 50%.

MetLife, another great example, made huge savings on infrastructure because they were able to use fewer operating systems to manage more applications. This freed up a lot of hardware and saved them a great deal of money. After moving to Docker, MetLife saw a 70% reduction in VM costs, 67% fewer CPUs, 10x average CPU utilization, and a 66% overall cost reduction. That's the power of Docker for you.

Why has Docker become so popular?

> Lightweight

> Portable

> Fast

> No hypervisor

Docker is a form of virtualization, but unlike the virtual machines, the resources are shared directly with the host. This allows you to run many Docker containers where you may only be able to run a few virtual machines.

A virtual machine has to cordon off a fixed amount of resources (disk space, memory, processing power), emulate hardware, and boot an entire operating system. The VM then communicates with the host computer through a translator application running on the host operating system called a 'Hypervisor.'

On the other hand, Docker communicates natively with the system kernel, bypassing the middleman, on Linux machines and even on Windows 10, Windows Server 2016, and above.

This means you can run the userland of any Linux distribution in a container and it will run natively, since containers share the host's kernel. Not only this, Docker uses less disk space too.
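You can see the shared kernel for yourself (assuming Docker is installed on a Linux host; the images below are standard Docker Hub images):

```shell
# The host's own kernel version
uname -r

# Both containers report the SAME kernel version as the host,
# because containers share the host kernel rather than booting their own;
# only the userland (Alpine vs. Ubuntu) differs
docker run --rm alpine uname -r
docker run --rm ubuntu:22.04 uname -r
```

All three commands print the same kernel release string, which is exactly the point: no second kernel, no hypervisor.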

Virtualization vs. containerization:

Source: This image is everywhere on the internet but I first saw it on Nick Janetakis's tutorial

In virtualization, the base layer is the infrastructure: the bare-metal server, which could be your laptop or desktop. On top of that we have the host operating system, something like Windows Server or, on a personal laptop, macOS or a Linux distribution.

In virtualization, we have something known as a Hypervisor. Because we are running virtual machines, which are basically isolated desktop environments inside of a file, the Hypervisor is what understands how to read that file. This is what a virtual machine image is, and common Hypervisors like VMware and VirtualBox know how to interpret these images and run the operating systems inside them.

On top of that, we have the actual guest OS. Each of these guest OSes has its own kernel, and this is where things start getting a little expensive from a resource allocation perspective.

On top of the OS is where we would actually install our binaries, libraries, and then finally we could copy over all of our files on to this operating system that actually makes up our application that we want to deploy to the server.

Now let's contrast this with containerization. Here we still have the infrastructure and the OS, but no Hypervisor. Instead, a process known as the Docker Daemon runs directly on the operating system, and it facilitates and manages things like running containers on the system, the images, and all of the command-line utilities that come with Docker.

The applications that we run within these images basically run directly on the host machine. What happens is we create images that are like copies of the application that we want to distribute, and a running instance of an image is what’s known as a container.

Containerization basically kills the ‘It works on my machine but not theirs’ drama.

Docker terminologies:

Image: an executable package that has everything needed to run an application, including configuration files, environment variables, the runtime, and libraries.

Dockerfile: This contains all the instructions for building the Docker image. It is basically a simple text file with instructions to build an image. You can also refer to this as the automation of Docker image creation.

Build: Creating an image snapshot from the Dockerfile

Tag: a version label for an image. Every image has a tag name.

Container: A lightweight software package/unit created from a specific image version.

DockerHub: Image repository where we can find different types of images.

Docker Daemon: Docker daemon runs on the host system. Users cannot communicate directly with Docker daemon but only through Docker clients.

Docker Engine: The system that allows you to create and run Docker containers.

Docker Client: the primary user interface to Docker (the docker binary). It receives docker commands from users and handles communication to and from the Docker daemon.

Docker registry: Docker registry is a solution that stores your Docker images. This service is responsible for hosting and distributing images. The default registry is the Docker Hub.
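To see how these terms fit together, here is a minimal sketch of a typical workflow; the application, file names, and `myuser/myapp` account are hypothetical, invented for illustration:

```shell
# Dockerfile -- a plain text file with instructions to build an image:
#
#   FROM node:18-alpine          # base image pulled from Docker Hub
#   WORKDIR /app
#   COPY . .
#   RUN npm install
#   CMD ["node", "server.js"]

# Build: create an image from the Dockerfile, naming it with a tag (name:version)
docker build -t myuser/myapp:1.0 .

# Container: a running instance of that image version
docker run -d -p 8080:8080 --name myapp myuser/myapp:1.0

# Registry: push the image to Docker Hub, the default registry
docker push myuser/myapp:1.0
```

The `docker` commands are issued by the Docker client, carried out by the Docker daemon, and the Docker Engine ties the whole pipeline together.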

Embracing DevOps with Docker:

Docker as a tool fits perfectly well in the DevOps ecosystem. It is built for the modern software firms that are keeping pace with the rapid changes in technology. You cannot ignore Docker in your DevOps toolchain; it has become a de facto tool and almost irreplaceable.

The things that make Docker so good for DevOps enablement are its use cases and advantages that it brings to the software development process by containerizing the applications that support the ease of development and fast release cycles.

Docker can solve most Dev and Ops problems, and by solving the main one, 'It works on my machine,' it enables both teams to collaborate effectively and work efficiently.

According to the RightScale 2019 State of the Cloud Report, Docker is already winning the container game with impressive year-over-year adoption growth.

With Docker, you can make immutable dev, staging, and production environments. You will have a high level of control over all changes because they are made using immutable Docker images and containers. You can always roll back to the previous version at any given moment if you want to.
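Because images are immutable, a rollback is just running the previous image tag again; the image names below are hypothetical:

```shell
# Deploy the new version
docker run -d --name web myuser/myapp:1.1

# Something is wrong -- roll back by stopping the new container
# and starting the previous tag. Images are immutable, so
# myapp:1.0 is byte-for-byte what ran before.
docker stop web && docker rm web
docker run -d --name web myuser/myapp:1.0
```

No rebuild, no reconfiguration: the old environment still exists as an image and can be started at any given moment.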

Development, staging, and production environments become more alike. With Docker, it is guaranteed that if a feature works in the development environment, it will work in staging and production, too.

Datadog took a sampling of its customer base, representing more than 10,000 companies and 700 million containers. Its report shows that at the beginning of April 2018, 23.4 percent of Datadog customers had adopted Docker, up from 20.3 percent one year earlier. Since 2015, the share of customers running Docker has grown at a rate of about 3 to 5 points per year.

Source: Datadog

Docker best practices:

Before approaching Docker, you should know some best practices to reap the benefits of this tool to the fullest extent. Here are some Docker best practices to keep in mind.

Docker is all about speed:

Source: Docker

Containers are the next once-in-a-decade shift in infrastructure that we all need to take part in. The hardest part in any IT industry whenever new tools are developed is migration: we have to learn the new tools and workflows, understand the terminology, and much more.

But the nicest thing about Docker is that it was created with developers, sysadmins, test engineers, Ops people, and IT architects in mind. According to Gartner research, more than 50% of global organizations will be running containers in production.

Without containers today, organizations get into something called the 'Matrix from Hell' problem, where you have different types of applications, dependencies, and environments, and all these things need to work together to make your software work efficiently. That really is hell.

This problem has been fixed by Docker. Docker is all about speed, and it helps to develop fast, build fast, test fast, deploy fast, update fast, and recover faster.

Docker is a fantastic piece of technology with a high level of adoption, making it a default tool when it comes to embracing DevOps practices. Docker has initiated the digital transformation at various firms.

Millions of users rely on Docker, downloading 100M container images a day, or maybe even more (as per their blog), and over 450 companies have turned to Docker Enterprise Edition, including some of the largest enterprises in the world.

With such vast adoption, the range of stories to tell and the diversity of use cases continue to grow.