Overview

In recent years, containers have come to be used in every aspect of IT software production: from software development, continuous integration, and continuous deployment to test and production environments. In addition to Docker's official Docker Swarm, Docker Machine, and Docker Compose, the open source community has produced a series of container-related tools covering container orchestration, scheduling, monitoring, logging, and more. This article focuses on the software development process and on how containers can solve the problems of continuous software delivery and team collaboration.

Using containers in continuous integration

Unified build environment management

In the traditional model, when an enterprise continuous integration platform is built with tools such as Jenkins, the first problem it runs into is the diversity of build environment requirements. The usual practice is to assign build agents (servers or virtual machines) to teams and let each team manage its build server's environment configuration and install the corresponding build dependencies.

Using Docker in continuous integration
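For example, a Maven build can be run in a disposable container roughly as follows (a minimal sketch; the maven:3-jdk-8 image tag and the mount paths are illustrative assumptions):

```bash
# Run the Maven build inside a throwaway container.
# --rm removes the container as soon as the build finishes.
# -v mounts the source checkout and a shared local repository for dependency caching.
# --workdir points Maven at the mounted workspace.
docker run --rm \
  -v "$(pwd)":/workspace \
  -v "$HOME/.m2":/root/.m2 \
  --workdir /workspace \
  maven:3-jdk-8 \
  mvn clean package
```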
As shown above, we can easily build the software package inside a container. There are several points to note:

- The --rm flag ensures that the container created for the build is removed automatically once the command finishes; you probably don't want to clean up the build server's disk by hand from time to time.
- -v mounts the current source code into the container; it can also be used to cache build dependencies on the host, such as the jar packages downloaded by Maven, which improves compilation efficiency.
- --workdir specifies the working directory in which the build command is executed; it needs to match the mounted workspace path.

As described above, with containers we can quickly set up a CI build environment that adapts to a wide variety of build requirements: all the build server needs is Docker.

Using docker-compose in continuous integration

In some cases we may need real third-party dependencies, such as databases or cache servers, during the build or integration test phase. In traditional continuous integration practice, you usually either use an already deployed database directly (and remember to clean up the test data and deal with concurrent builds), substitute an in-memory database for the real one, or rely on mocks and stubs in tests. Ideally, however, we still want to use a real database, or other middleware services consistent with the production environment. With docker-compose, we can easily satisfy such complex build environment requirements.
Let's take Maven as an example and suppose the build needs MySQL to support integration testing.
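A compose file for such a build environment might look roughly like this (a minimal sketch; the service names, image tags, credentials, and the db.url property passed to the tests are illustrative assumptions):

```yaml
# docker-compose.yml - Maven build with a real MySQL instance for integration tests
version: "2"
services:
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: secret
      MYSQL_DATABASE: app_test
  build:
    image: maven:3-jdk-8
    working_dir: /workspace
    volumes:
      - .:/workspace          # mount the source checkout
      - ~/.m2:/root/.m2       # cache downloaded dependencies
    command: mvn clean verify -Ddb.url=jdbc:mysql://db:3306/app_test
    depends_on:
      - db
```

The build can then be executed with `docker-compose run --rm build`, which starts the MySQL service first, and `docker-compose down -v` cleans everything up afterwards. In practice the build step should also wait for MySQL to finish initializing (for example with a healthcheck or a small wait script) before the tests start.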
Building a Continuous Delivery Solution

Establishing cross-functional R&D teams with common goals is the foundation of the DevOps movement, and automation is the cornerstone of improving efficiency. With that in mind, how do we build a container-based continuous delivery solution?

Infrastructure automation

The reason for using Rancher is simple: Rancher is the only container management platform on the market that works out of the box. It also supports multiple orchestration engines, including Rancher's own Cattle, Google's Kubernetes, and Docker's official Swarm. At the same time, the Catalog application store provided by Rancher lets the R&D team create the service instances they need on their own.

Creating a continuous delivery pipeline

The core issue in establishing a continuous delivery pipeline is how to define the enterprise's software delivery value stream: the typical tools used at each stage of development, continuous integration, and continuous delivery, the activities of the relevant teams at each stage, and the typical DevOps-related activities.

Team collaboration in the continuous delivery pipeline

As mentioned above, the essence of creating a continuous delivery pipeline is to define the value stream of software delivery and to make the formal software delivery process visible. The flow of value requires a high degree of collaboration among team members in different functions. In container-based continuous delivery practice, images serve as the vehicle of value passed between people in different functions.
In a container-based continuous delivery solution, we use images as the unit of value transfer. By continuously testing and verifying images, we move them from the development and testing stages to a releasable state, thereby completing the software delivery process.
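For example, a candidate image can be promoted along the pipeline roughly as follows (a minimal sketch; the registry address, image name, and tag conventions are illustrative assumptions):

```bash
# Build and push a candidate image identified by the CI build number.
docker build -t registry.example.com/myapp:build-42 .
docker push registry.example.com/myapp:build-42

# After the candidate has passed integration and acceptance testing,
# promote the very same image by re-tagging it; nothing is rebuilt.
docker pull registry.example.com/myapp:build-42
docker tag registry.example.com/myapp:build-42 registry.example.com/myapp:release-1.0.0
docker push registry.example.com/myapp:release-1.0.0
```

Re-tagging instead of rebuilding guarantees that the image that was tested is exactly the image that gets released.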