What Is Docker? The Docker Container Concept Explained.
Docker is an open source tool that enables developers to build, deploy, run, update and manage containers.
So in this blog post, we'll talk about container concepts and one of their most popular implementations, Docker. Topics covered in this article:
● What a container is and what problems it solves
● What a container repository is - basically storage for container images
● How containers make the development process easier and more efficient
● How containers solve some of the problems in the application deployment process
So let's dive right into what a container is.
What Is a Container and What Problems Does It Solve? 📦
A container is a way to package an application with everything it needs inside that package, including all its dependencies and all the necessary configuration.
The package is portable, just like any other artifact, and it can easily be shared and moved around within a development team, or between development and operations teams.
The portability of containers, plus everything packaged in one isolated environment, gives it some advantages that make the development and deployment process more efficient. ✅
We'll see some examples of how that works later on.
Container Repository - Where Are Containers Stored?
As I mentioned, containers are portable, so there must be some storage for them so you can share them and move them around.
So containers are stored in a container repository, which is a special type of storage for container images.
Private vs. Public
Many companies have their own private repositories, where they host all their container images and push every image they build.
There is also a public repository for Docker containers. It is where you can browse and find any application container you want.
So let's head to the browser and see what that looks like:
So if I search for Docker Hub, which is the name of the public repository for Docker, I will see the official website:
If you scroll down here, you will see that more than 100,000 container images of different applications are hosted in this Docker repository.
Here you just see some examples; for every application, there is this official Docker container image:
But if you are looking for something else, you can search for it here:
I see there's an official image for Jenkins. But there are also a lot of non-official container images that individual developers, or even the Jenkins project itself, have published:
A public repository is usually where you get started, because there you can find an image for almost any application.
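For example, fetching an image from Docker Hub is a single command. As an illustration (using the official Jenkins LTS image mentioned above):

```shell
# Pull the official Jenkins LTS image from Docker Hub
docker pull jenkins/jenkins:lts

# List the images now available on the local machine
docker images
```

If no registry is specified, the Docker CLI pulls from Docker Hub by default.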
Application Development Before/After Container
Now let's see how containers improve the development process by specific examples.
How did we develop applications before containers?
Usually, when you have a team of developers working on an application, each developer has to install most of the services the application needs directly on their operating system - for example:
● PostgreSQL for the database
● Redis for messaging
Every developer on the team would then have to go and install the binaries of those services, configure them, and run them in their local development environment. The installation process looks different depending on which operating system they're using. And another thing about installing services like this: each one involves multiple installation steps:
So you have a couple of commands to execute, and the chance of something going wrong and an error happening is high, because of the number of steps required to install each service. This process of setting up a new environment can be tedious, depending on how complex your application is.
For example, if your application uses 10 services, you would have to repeat this 10 times on each operating system environment. So now let's see how containers solve some of these problems. 👏
With containers, you do not have to install any of the services directly on your operating system, because a container is its own isolated environment with its own operating system layer on top of a Linux base image.
You have everything packaged in one isolated environment: PostgreSQL with a specific version, packaged together with its configuration and start script, inside one container. So as a developer, you don't have to go looking for binaries to download to your machine. Instead, you check the container repository, find that specific container image, and download it to your local machine.
The download step is just one Docker command, which fetches the container image and starts the container at the same time. And regardless of your operating system, the Docker command for starting the container is exactly the same. ✅
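As a sketch, starting the PostgreSQL example from above takes one command (the container name, password, and version tag here are illustrative):

```shell
# docker run pulls the image if it isn't already local, then starts the container.
# -e sets the required password, -p maps the database port to the host, -d runs it in the background.
docker run --name my-postgres \
  -e POSTGRES_PASSWORD=secret \
  -p 5432:5432 \
  -d postgres:13
```

This one command replaces the whole multi-step, OS-specific installation process.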
Also, you can have different versions of the same application running on your local environment without conflict.
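For instance, you could run two PostgreSQL versions side by side simply by mapping them to different host ports (ports and names here are illustrative):

```shell
# Two PostgreSQL versions running at the same time without conflict,
# each isolated in its own container and reachable on its own host port
docker run -d -e POSTGRES_PASSWORD=secret -p 5432:5432 --name pg13 postgres:13
docker run -d -e POSTGRES_PASSWORD=secret -p 5433:5432 --name pg15 postgres:15
```

Installing two versions of the same database directly on one operating system, by contrast, is usually painful or impossible.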
Application Deployment Before/After Container
So now, let's see how containers can improve the deployment process.
A traditional deployment process will look like this:
The developer team will create artifacts, which are basically files, along with instructions on installing and setting them up on the server. For example, you might receive a jar file for your application and instructions on setting up a database or some other service. All of these artifacts and instructions will be provided by the development team:
So the development team would hand those artifacts over to the operations team, and the operations team would set up the environments to deploy those applications:
External dependencies on the server OS: The problem with this approach is that you first need to configure and install everything directly on the operating system of the server, which we saw in the previous example. That can lead to conflicts between dependency versions and between services running on the same host.
Misunderstandings / Miscommunication: Another problem that can arise from this process is miscommunication between the development and operations teams. Because everything is in a textual guide, developers may forget to mention some critical configuration detail. When the deployment then fails, the operations team has to go back to the developers and ask for more details, which can lead to a lot of back-and-forth communication until the application is finally deployed successfully on the server.
With containers, this process is simplified, because developers and operations now work as one team to package the application together with its whole configuration and all its dependencies.
It means that if you use a Docker container, you do not need to configure anything directly on the server, because everything is already encapsulated within the container. Instead, you only need to run a Docker command that pulls the container you have stored in the repository and then runs it.
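On the server, the whole deployment then boils down to two commands (registry address, image name, and ports below are hypothetical placeholders):

```shell
# Pull the packaged application image from the team's repository
docker pull my-registry.example.com/my-app:1.0

# Run it - no service installation or environment configuration on the server itself
docker run -d -p 8080:8080 --name my-app my-registry.example.com/my-app:1.0
```

Everything the application needs is already inside the image, so the textual setup guide largely disappears.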
It is, of course, a simplified version but that solves exactly the problems we saw in the previous scenario.
So it's much more straightforward, and no environment configuration is needed on the server. The only thing, of course, is that you have to install and set up the Docker runtime on the server before you can run containers there. But that's just a one-time effort. 😌
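That one-time setup is itself just a few commands. As a rough sketch for an Ubuntu server (package names vary by distribution; check the official Docker installation docs for yours):

```shell
# One-time Docker runtime setup on Ubuntu (sketch)
sudo apt-get update
sudo apt-get install -y docker.io

# Start the Docker daemon and enable it on boot
sudo systemctl enable --now docker

# Verify the installation
docker --version
```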
Watch this Docker Crash Course to learn everything about Docker to get started and use it in practice 🚀
You can learn more about Docker and other DevOps technologies on my YouTube channel 👏
Like, share and follow me 😍 for more content: