When creating an application, sharing the code has always been an issue. Every developer has experienced the "it works on my machine" moment after sharing some code that then refuses to run elsewhere. The culprit is usually a change in environment, and fixing it means either time-consuming rework of the code or building a virtual machine for it to run on.
And so, Docker containers were invented!
A container bundles all of the dependencies required to run an application outside of the environment it was created in. There is no longer any need to reconfigure the code or to build and use a virtual machine. Because the application carries everything it needs 'self-contained' within it, the whole process becomes far more efficient.
In fact, multiple containers can run on the same hardware, all using the same Host OS kernel while maintaining isolated applications, taking up less space and saving even more time.
The concept of a "Container" has existed in one form or another since the late 1970s with the release of Unix V7. However, it was the release of Docker in 2013 that set the container revolution into motion! Docker took the concept of containers and made them simple, fast and easy to use.
Just download the Docker Engine software, build your application into a Docker Image and send it off! Once the image has been shared, anyone can download it and run the container on their own computer to access the application with all its trimmings.
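As a rough sketch of what "building your application into a Docker Image" looks like in practice (the file names and the `myapp` tag here are hypothetical, assuming a simple Python app), a minimal Dockerfile might read:

```dockerfile
# Start from an official slim Python base image
FROM python:3.12-slim
WORKDIR /app
# Install the app's dependencies first so this layer is cached
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy in the rest of the application code
COPY . .
CMD ["python", "app.py"]
```

Running `docker build -t myapp:1.0 .` in the same directory then produces the image, ready to be shared.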
A fun way of thinking about this is to imagine you want to bake an amazing cake to share with your friends. To do this, you make the mixture and bake it inside of a cake tin. Over time you will improve the recipe for the cake but the tin remains the same. If you want to share the cake you give the tin, filled with the ingredients, to a friend to bake.
In this analogy the cake tin is the Docker Image and the resulting cake once removed from the tin is the Docker container with your application, and all the necessary elements to function, contained inside.
You could call it a little bit of baking magic.
DevOps teams now use Docker to build and share applications quickly and reliably, transforming the way they approach projects. Docker allows for a collaborative working experience; developers and ops people can build the infrastructure of an application together, sharing and editing simultaneously. They can manage software components as isolated, self-sufficient containers, using Docker's push/pull workflow to hand over pieces of the software whenever they are updated.
This would not have been possible before; but because the necessary components to run the code are included within the container, applications can be shared with no issues. How marvellous! This is especially useful when a project must go through many iterations to reach completion and time is of the essence.
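That push/pull hand-over can be sketched with the standard Docker CLI (the `myuser/myapp` repository name is hypothetical and assumes an account on a registry such as Docker Hub):

```shell
# Build and tag a new iteration of the application image
docker build -t myuser/myapp:1.2 .
# Push it to the shared registry for the rest of the team
docker push myuser/myapp:1.2

# A teammate then pulls the exact same image and runs it unchanged
docker pull myuser/myapp:1.2
docker run --rm myuser/myapp:1.2
```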
Let's quickly run through the differences between a Docker container and a Virtual Machine. While a Virtual Machine has downsides in comparison with Docker containers, it has a few notable upsides: it is stateful and fairly easy to set up. A Virtual Machine can also run multiple applications within itself, but this often causes internal conflicts, as two or more applications on the same machine may require different dependencies that cannot be installed in parallel.
In contrast, a Docker container is built for each individual application, removing the possibility of conflict on this level. It should also be noted that a Virtual Machine takes up masses of space, as it is an intricate, independent system with its own operating system. Virtual Machines are also not as portable as a Docker image, because they bundle an entire operating system into a large disk image tied to a particular hypervisor. Finally, not all machines support virtualisation and so could not run a virtual machine at all, making any attempt at a transfer potentially futile.
In comparison, though Docker involves some runtime complexity and regular image maintenance, its positives far outweigh its negatives. Docker takes up significantly less room on a machine because it utilises the host OS kernel and does not require an operating system of its own. As previously stated, Docker can run multiple applications in different containers simultaneously, as represented in the image below (Figure 1). Docker containers are stateless, which makes it easy to test many iterations of an application without interference from previous trials.
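To illustrate several containers sharing one host (the container and image names here are just examples), two unrelated services can run side by side, each isolated from the other:

```shell
# Start two isolated containers from different official images
docker run -d --name web -p 8080:80 nginx:alpine
docker run -d --name cache redis:7-alpine

# Both appear in the container list, each with its own filesystem
# and dependencies, yet neither needed an operating system of its own
docker ps
```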
This feature also makes an application easier to move to the cloud, because it is not storing unnecessary internal data. Once an application has been containerised in a Docker image it can be deployed on almost any infrastructure, including the cloud; the image can even be exported as a tar archive and sent over practically any channel. This ease of transfer is encouraging more people to migrate towards cloud-based systems and microservices. The small size and portability that Docker provides greatly outweigh its downsides, making it an essential tool for today's developers.
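The tar-archive transfer mentioned above can be done with Docker's built-in save/load commands (the image name is hypothetical):

```shell
# Export the image to a single tar archive
docker save -o myapp.tar myuser/myapp:1.2
# Send myapp.tar over any channel you like, then on the other machine:
docker load -i myapp.tar
docker run --rm myuser/myapp:1.2
```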
Docker is a lightweight, fast and straightforward software package that solves the age-old issue of ensuring code will work after it has been shared with others. It provides a system for boxing up and shipping projects with all of the tools required to run the application contained inside. Its many positive qualities greatly outweigh the negatives, so if you are not already using it, what are you waiting for?
Docker (2019), Why Docker, What is a Container, digital image, accessed 28th November 2019, <https://www.docker.com/resources/what-container>.
Docker (2019), Why Docker, Company, Newsroom, Docker Logos, digital image, accessed 17th December 2019, <https://www.docker.com/company/newsroom/media-resources#logos>.