The developer's excuse, "It works on my machine," is no longer valid with the arrival of containers and Docker technology.
At the same time, more and more enterprise organizations are adopting DevOps processes and trying to make them faster and more reliable.
In the real world, the owner of a shipping container cares about its contents, not about how it gets shipped to the right destination. The shipping company handles transporting the container from its point of origin to the destination without worrying about what is inside. It is the same with Docker containers. Developers care about the services a container delivers and how it works together with other interdependent containers as an application. The IT team's job is to take care of the infrastructure needed to run Docker containers: managing the production environment, scalability, and monitoring. Like the shipping company, the IT team should not be concerned with the contents of a container; it needs to focus on delivering the application properly to the end user. Using Docker containers, developers can hand their applications, with all dependencies included, to the IT team for production far more easily. Containers eliminate conflicts between different environments, and they give developers and IT teams a path to work more closely and collaborate better than in a traditional DevOps process.
DevOps is a practice that unifies software development and software operations. As shown in the figure, the main stages of a DevOps product-development workflow are code development, integration, build, test, release, deployment, and monitoring. The core of the DevOps process is automation, continuity, and tooling. DevOps supports a shorter development life cycle and frequent deployments to meet the business needs of organizations.
Docker is a tool for building, deploying, and running applications in containers. Containers let developers package an application with all of its dependencies and deliver it as a single unit. Docker containers run natively on Linux and Windows: Linux containers can run on both Linux and Windows host machines, but Windows containers can only run on Windows hosts.
After successful development, the containerized application is packaged as a Docker image that can be used in any Docker environment without hassle. The main changes to the conventional DevOps workflow when adopting Docker are that the build stage delivers a Docker image, rather than a source code package, to the Docker environments (Production, Staging, and QA), and that a Docker registry (Azure Container Registry, Docker Hub, a private Docker Trusted Registry, or any other Docker registry) is integrated into the workflow.
As shown in the figure above, the journey starts with the developer, who runs and tests the application locally as a Docker container. The developer codes, runs, tests, and debugs the containerized application, building a Docker container locally with all the required dependencies. After this initial step, the source code is pushed to the source control repository along with a well-written Dockerfile. Developers can use Linux, Windows, or macOS as their development platform, and they pull base images from a Docker registry to create the applications or services that run in the container.
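For illustration, a Dockerfile for such a service might look like the following minimal sketch. The base image, port, and file names here are assumptions for the example (a Node.js service), not details taken from the workflow itself:

```dockerfile
# Start from a base image pulled from a Docker registry
# (here, the official Node.js image on Docker Hub).
FROM node:18-alpine

WORKDIR /app

# Copy dependency manifests first so this layer is cached between builds.
COPY package*.json ./
RUN npm install --production

# Copy the application source into the image.
COPY . .

# The port the service listens on (hypothetical).
EXPOSE 3000

# The command that runs when the container starts.
CMD ["node", "server.js"]
```

A `docker build` against this file produces the image that travels, unchanged, through the rest of the workflow.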
After development is complete, the code is pushed to the source code repository (Visual Studio Team Services, Git, GitHub). From this point onwards, the CI pipeline (Microsoft Team Foundation Server, Jenkins, CircleCI, Travis CI) takes over. When the developer commits to the source code repository, the CI pipeline triggers the build and test jobs. In these jobs, the Docker image is built and tagged with the relevant build number; a Docker container is then started from that image and the tests are run. On successful completion of the job, the newly created Docker image is pushed to the Docker registry.
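As one concrete example, a declarative Jenkinsfile for this stage could look like the sketch below. The registry address, image name, and test command are hypothetical placeholders; the other CI tools mentioned above express the same build-test-push steps in their own configuration formats:

```groovy
pipeline {
    agent any
    environment {
        // Hypothetical registry and image name.
        IMAGE = 'registry.example.com/myteam/myapp'
    }
    stages {
        stage('Build image') {
            steps {
                // Tag the image with the CI build number.
                sh 'docker build -t $IMAGE:$BUILD_NUMBER .'
            }
        }
        stage('Test in container') {
            steps {
                // Start a container from the freshly built image and run the tests.
                sh 'docker run --rm $IMAGE:$BUILD_NUMBER npm test'
            }
        }
        stage('Push to registry') {
            steps {
                // Only images that passed the tests reach the registry.
                sh 'docker push $IMAGE:$BUILD_NUMBER'
            }
        }
    }
}
```

Note that the image is built once and only tagged and pushed; the exact bytes that passed the tests are what later environments deploy.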
At the end of the CI pipeline, the CD pipeline starts. It takes the tested Docker image published to the Docker registry and uses it to deploy to environments such as QA, Staging, and Production. When the containerized application consists of more than one interdependent service or application, a docker-compose file is needed for each environment. The CD pipeline's deployment approach depends on the architecture of the containerized application. A monolithic application can be deployed by the CD pipeline to a few servers or VMs acting as Docker hosts. But an application such as a microservices-oriented one, which requires high scalability through load balancing across multiple nodes, servers, and VMs, along with high availability and intelligent failover, needs container clusters, orchestrators, and schedulers. Azure Service Fabric, Azure Container Service, DC/OS, Docker Swarm, and Kubernetes can be used for that.
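For a multi-service application, a per-environment docker-compose file might look like this sketch. The service names, image tags, and ports are hypothetical; the key point is that the `image` lines reference the exact tags the CI pipeline built and tested:

```yaml
# docker-compose.staging.yml — hypothetical staging configuration.
version: "3"
services:
  web:
    # The image:tag produced and tested by the CI pipeline.
    image: registry.example.com/myteam/web:42
    ports:
      - "80:3000"
    depends_on:
      - api
  api:
    image: registry.example.com/myteam/api:42
    environment:
      - DB_HOST=db
    depends_on:
      - db
  db:
    image: postgres:15
    volumes:
      - db-data:/var/lib/postgresql/data
volumes:
  db-data:
```

Each environment (QA, Staging, Production) gets its own compose file with environment-specific settings, while the image tags stay identical to what was tested.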
The next steps are managing and monitoring the running environments. Containerized applications can be managed, depending on where they are deployed, with the management tools available in Azure Service Fabric, Azure Container Service, DC/OS, and the Docker tools. Operations Management Suite (OMS) and Application Insights can be used as monitoring tools for Docker hosts and containers.
Docker technology and containers are involved from the first step of the DevOps workflow to the last. The same Docker image travels, unchanged from the moment it is created, all the way to deployment in each environment. This property of the containerized DevOps workflow eliminates faults and drawbacks that exist in the conventional DevOps workflow.
A containerized DevOps workflow is more than a set of steps tied to tools and technologies. It is a mindset transformation for the developers, testers, IT teams, and managers of an organization. Containerizing your DevOps workflow can bring all of these different units together and make product delivery to end customers more efficient. Adopting a containerized DevOps workflow makes the application lifecycle faster, more reliable, more scalable, and more maintainable, while improving the business value of the organization.