It isn’t feasible to dedicate a separate computer or server to every application we might want to run; that would be a big waste of resources. Much of the time an app is either idle or running at full load, and most apps don’t need all the resources a modern server provides. The two general approaches used to overcome this are virtualization and containerization.
Virtualization is the creation of a virtual rather than physical version of something, such as virtual applications, servers, storage, and networks. The technology got its start on mainframes decades ago, where it allowed administrators to avoid wasting expensive processing power. In a virtualized setup there is a physical server, a host OS, and a hypervisor, with multiple VMs running on top; the applications run inside those VMs. The main goal of virtualization is to manage workloads by radically transforming traditional computing to make it more scalable.
In containerization the structure is quite similar: there is still a server and an operating system, but instead of a hypervisor it uses features provided by the Linux kernel, specifically namespaces, cgroups, and chroots, to break the system into isolated chunks called containers. A container platform is a complete solution that allows organizations to solve multiple problems across a diverse set of requirements.
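As a rough sketch of what those kernel features look like (assuming a Linux host, since namespaces are a Linux kernel concept), every process's namespace membership is visible under `/proc`. Processes inside one container share the same namespace IDs, while the host and other containers see different ones:

```python
import os

# Each entry under /proc/<pid>/ns/ is a kernel namespace the process
# belongs to. The readlink target encodes the namespace type and an
# inode number identifying that particular namespace instance.
for ns in ("pid", "uts", "net", "mnt"):
    print(ns, os.readlink(f"/proc/self/ns/{ns}"))
```

Running this inside a container and on the host shows different inode numbers for the same namespace types; cgroup limits are similarly exposed under `/sys/fs/cgroup`.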
- A VM is created from a full operating system plus the necessary libraries and dependencies; the application you want to run is then placed inside the virtual machine.
- A container consists only of the libraries and dependencies needed to run the app, plus the application logic itself.
- Containers virtualize the OS so that multiple workloads can run on a single OS instance, whereas VMs virtualize the hardware to run multiple OS instances.
- Containers are exceptionally light: they are only megabytes in size and take just seconds to start, whereas virtual machines are typically gigabytes in size and take minutes to boot.
- Containers are increasingly popular because they are flexible, lightweight, interchangeable, portable, scalable, and stackable.
Docker is an open-source software platform for creating, managing, and deploying virtualized application containers on a common operating system. Docker's definitions of these isolated environments are stored as image files, and running instances of images are called containers: the image is the class and the containers are its objects. Containers usually communicate with each other much like real computers on a network. This gives a significant performance boost and reduces the size of applications compared to full VMs.
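To make the image/container distinction concrete, here is a minimal, hypothetical Dockerfile (the base image and `app.py` script name are placeholders for illustration):

```dockerfile
# An image is the "class": a frozen definition of the OS layer,
# the dependencies, and the application logic.
FROM python:3.12-slim
WORKDIR /app
COPY app.py .
CMD ["python", "app.py"]
```

Building this with `docker build -t myapp .` produces the image; each `docker run myapp` then starts a new container, an isolated running instance ("object") of that same image.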
Who is Docker for?
- Developers use Docker to eliminate "works on my machine" problems when collaborating with co-workers; it is a key part of many DevOps (development and operations) toolchains.
- Operators use Docker to run and manage apps side by side in isolated containers to achieve better compute density.
- Enterprises use Docker to build agile software delivery pipelines, shipping new features faster, more securely, and with confidence, for both Linux and Windows software.