How To Get Better At Testing Automations in Docker

Written by flamine | Published 2020/06/23
Tech Story Tags: docker | testing | dockerfile | automation-testing | running-tests-with-docker | coding | docker-top-story | hackernoon-top-story

TLDR Docker is a great tool for build automation and app launching, and it’s difficult for me to imagine working without it and not suffering performance losses. In this article, I will describe my experience with Docker and provide you with some tricks to make your life easier with its help: use Docker Compose instead of plain Docker to be able to use relative paths to results folders, and be sure that even if you need to run the container on a different operating system, everything will work exactly how you planned.

I do test automation for ONLYOFFICE document editors, and our team uses Docker for lots of tasks. It’s difficult for me to imagine how to work without it and not suffer performance losses.
In this article, I will describe my experience with Docker and provide you with some tricks to make your life easier with its help.

Docker for build automation and app launching

Docker is a great tool for build automation and app launching. Even if it seems you could just write a simple bash script, try Docker instead. This way you will be sure that even if you need to run the container on a different operating system, everything will work exactly how you planned. With a bash script, that’s not always possible.
One example of build automation is our Java test example. (By default, our editors don’t include any document management system, so we created several simple DMS apps in different languages for us and our users to test with. They are called test examples.)
Of course, we could have written a bash script, but taking a ready-made image with all the necessary dependencies and executing one command within it is much easier. For this, we use Docker Compose instead of plain Docker to be able to use relative paths to the results folders.
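To give an idea of what this looks like, here is a minimal sketch of such a compose file. The image, working directory, command, and the test-results folder are assumptions for illustration, not our actual configuration:
```
version: '3'
services:
  java-example-build:
    # a ready-made image with the JDK and Maven preinstalled
    image: maven:3-openjdk-11
    working_dir: /app
    volumes:
      # project sources mounted into the container
      - ./:/app
      # test results end up in a relative folder next to the compose file
      - ./test-results:/app/target/surefire-reports
    command: mvn test
```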

Creating a dockerfile

I won’t repeat all the basic things about Docker and Docker-compose from the official documentation. But I have some tips from my experience. Pay attention to these things while creating a dockerfile:
Excluding unnecessary files
The `COPY` / `ADD` commands copy all the files from your directory (see the difference between these commands in Docker Best Practices).
If your service was written with the use of npm, and the packages are stored in the same folder as your project, they will be copied too. In the best-case scenario, it will just occupy more space and waste a little bit of your time (see build cache). The worst that could happen is an error while building the container for production. Use `.dockerignore` to exclude the files you don’t need; see the `.dockerignore` documentation for details.
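For a Node-based service, a `.dockerignore` could look roughly like this; the exact entries depend on your project layout:
```
# dependencies are reinstalled inside the image, no need to copy them
node_modules
npm-debug.log

# local build output and test results
dist
test-results

# VCS and editor leftovers
.git
.gitignore
```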
Using links to a git repo
Docker Compose allows building services from a link to a git repo (see the documentation). You will be able to make use of it on two conditions:
  1. You create a dockerfile for project building in the project’s root folder
  2. You don’t change its name to a custom one.
For instance, the following docker-compose script will use the image built from the Dockerfile located in the root of the project https://github.com/no-flamine/docker-in-root-example.git:
```
version: '3'
services:
  test-service:
    build:
      context: https://github.com/no-flamine/docker-in-root-example.git
```
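To build and start it, something like `docker-compose up --build test-service` should do: Compose clones the repo, builds the image from the Dockerfile in its root, and runs the container.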
Using databases in containers
If your app needs a database, a web server, or something of the kind, use a separate container for each of them.
You have probably read that keeping a database in a container is a bad idea. That’s true, because Docker was created for processing, not for storing. But during development and testing, it’s totally OK to use it, as you don’t have much important data stored there. You can kill this database anytime you want, add a new one, and load it with any throwaway data you need for testing. Later, for production, you’ll add the option to connect a perfectly safe and stable external database for the actual data.
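As a rough sketch (the service names, image, and credentials here are placeholders, not a recommended setup), a test database can live in its own service next to the app:
```
version: '3'
services:
  app:
    build: .
    environment:
      # placeholder connection settings pointing at the test database below
      DB_HOST: db
      DB_USER: test
      DB_PASSWORD: test
    depends_on:
      - db
  db:
    image: postgres:12
    environment:
      POSTGRES_USER: test
      POSTGRES_PASSWORD: test
      POSTGRES_DB: test
```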
Restart policy
Don’t forget to use the `--restart always` parameter (see the documentation for more info) when you start your containers for production. It’s rather unpleasant when services crash, but it’s way worse when they don’t come back to life.
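With plain `docker run` that looks like the line below (the image name is just a placeholder); in a compose file, the equivalent is the `restart: always` key on the service:
```
docker run -d --restart always my-image
```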

Running tests with Docker

Docker significantly simplifies running autotests, because you only need two commands (build/run).
For example, we keep all the tests for our DocBuilder tool in this repo. DocBuilder is a JS-to-OOXML (docx, xlsx, pptx) converter for generating documents from code.
Each time we update the tests, we check the new version against the stable version of the product. To do so, we just need to build an image and start the container using the dockerfile from the project’s root.
Checking the updated tests on the stable version of the product is important; otherwise, the newly added changes could break the master branch.
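Locally, that boils down to the same two commands the CI job below runs (the image tag is arbitrary):
```
docker build -t doc-builder-testing .
docker run doc-builder-testing
```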
To run the tests regularly, we use GitHub Actions. Just look at how neat it is in the task configuration file:
```
name: check
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Running test inside doc-builder-testing
        run: |
          docker build -t doc-builder-testing .
          docker run doc-builder-testing
```
We have only two lines in the run section: build and run.
We also have dockerfiles for testing DocBuilder installation on supported systems. When we add support for more systems, we’ll create dockerfiles for them as well.
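Such an installation-check dockerfile can be as simple as taking a clean base image of the target system and installing the package in it. The package URL and file name below are placeholders for illustration, not the actual ONLYOFFICE artifacts:
```
# Ubuntu stands in for one of the supported systems; each system gets its own dockerfile
FROM ubuntu:20.04

ENV DEBIAN_FRONTEND=noninteractive

# placeholder package location; replace with the real build artifact
ARG PACKAGE_URL=https://example.com/documentbuilder.deb

RUN apt-get update && \
    apt-get install -y wget && \
    wget -O /tmp/documentbuilder.deb "$PACKAGE_URL" && \
    apt-get install -y /tmp/documentbuilder.deb
```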
That’s all! Feel free to share your thoughts and ask your questions in comments. Your Docker experience stories are very welcome as well.

