What is docker, in simple words?
- Klevis Cipi
- Aug 2, 2024
- 2 min read

Docker is an application-level virtualization platform that allows you to package software and its dependencies into isolated, self-contained units called containers.
Basically, docker helps us create different environments on a single machine/host.
Let’s take a real example without too many technical details to better understand docker:
Two projects developed with different PHP versions, one with PHP 7.4 and one with PHP 8.2, must be published on a single server machine.
This machine has PHP 7.2, and we cannot upgrade the PHP package because other projects depend on this version.
At this point, we have three options:
1. Add a new machine for each project.
2. Use Docker.
3. Upgrade the projects to a newer PHP version.
You can choose option one (1) if you expect very high traffic or heavy use of resources such as RAM and disk space.
But that is not the case here: the two projects are for internal use and do not require many resources.
Option three (3) requires analysis time and human resources, which we cannot spare yet.
So let’s go for option two (2) and leverage the power of Docker by creating two virtual environments dedicated to the projects mentioned above.
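As a sketch of what option two looks like in practice, the two projects could each run in their own container built from the official PHP images. The container names, host paths, and port numbers below are illustrative assumptions, not part of the original example:

```shell
# Run the PHP 7.4 project in its own container, mapped to host port 8074.
docker run -d --name project-php74 -p 8074:80 \
  -v /var/www/project-a:/var/www/html php:7.4-apache

# Run the PHP 8.2 project alongside it, mapped to host port 8082.
docker run -d --name project-php82 -p 8082:80 \
  -v /var/www/project-b:/var/www/html php:8.2-apache

# Both containers share the same host, but each has its own PHP version.
docker ps
```

The host keeps its own PHP 7.2 untouched; each project only sees the PHP version inside its container.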
Explaining this type of architecture takes more than one story, so I will publish a dedicated article about it later.
But what tools does Docker offer for creating these virtual environments?
Docker offers two installation modes:
- Docker Engine, known as Docker CE, the command-line runtime.
- Docker Desktop, a graphical application that bundles the engine.
Docker itself as a virtual environment is made up of these macro components, or rather what we can call ‘artefacts’:
- Image: the recipe, or rather the blueprint, used to initialize containers.
- Container: a running instance of an image, which, based on our previous example, we can define as “the PHP 8.2 project live in production in a Docker environment”.
- Volume: a mechanism for persisting data outside a container’s lifecycle.
- Network: allows communication between containers and with the outside world.
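Each of these artefacts has its own set of CLI commands. A few illustrative ones (assuming Docker is installed; the names `app-data`, `app-net`, and `my-container` are made up for the example):

```shell
docker image ls                               # list images (the blueprints)
docker ps -a                                  # list containers, running and stopped
docker volume create app-data                 # create a named volume for persistent data
docker network create app-net                 # create a user-defined network
docker network connect app-net my-container   # attach an existing container to it
```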
Docker Desktop helps us inspect and manipulate these artefacts through a graphical interface, while the Docker Engine CLI is what we use to generate them.
The image, or rather the blueprint as I like to call it, is generated from lines of instructions written in a specific file that we call a:
Dockerfile
Dockerfile example:
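A minimal sketch for the PHP 8.2 project from the example above (the copied path and exposed port are assumptions for illustration):

```dockerfile
# Start from the official PHP 8.2 image with Apache.
FROM php:8.2-apache

# Copy the project sources into Apache's web root.
COPY . /var/www/html/

# Document the port Apache listens on.
EXPOSE 80
```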

What is the deployment flow from Dockerfile to a live container in production?
Before answering, I would like to add another term: Image Storage Registry.
Docker offers its own service, Docker Hub, but there are other registries as well, such as Amazon ECR (Elastic Container Registry).
The deployment flow has this order:
1. Dockerfile creation.
2. Image build and tagging.
3. Publication of the image to a registry (Docker Hub, ECR, or another provider).
4. Pulling the image onto the host machine.
5. Container execution starting from the image.
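The steps above map to a handful of CLI commands. This is a sketch assuming Docker Hub as the registry; the repository name `myorg/php82-project` and the tag `1.0` are illustrative:

```shell
# 2. Build the image from the Dockerfile in the current directory and tag it.
docker build -t myorg/php82-project:1.0 .

# 3. Publish the image to the registry (requires `docker login` first).
docker push myorg/php82-project:1.0

# 4. On the production host, pull the published image.
docker pull myorg/php82-project:1.0

# 5. Start a container from the image.
docker run -d --name php82-project -p 80:80 myorg/php82-project:1.0
```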
Conclusion:
Docker is a very powerful tool that helps us organize more than one project hosted on a single server machine.