[51CTO.com original article] With the growth of the Internet of Things and 5G networks, "edge computing" has become a focus for many industry leaders. As a new computing paradigm, edge computing offers real-time processing and energy-efficient computation close to the device. So how does the industry define edge computing? What is its current market situation, and what are its prospects? The Global Software and Operation Technology Summit hosted by 51CTO was held in Beijing on May 18-19, 2018. At the "IoT Development Technology Analysis" forum, Shi Yang, chair of technical architecture at Huawei and the Edge Computing Industry Alliance, presented "Practice and Thinking on Edge Computing Reference Architecture 2.0". This article is divided into three parts:
Why do we need edge computing? Digital transformation is a fashionable concept in industry today. Its essence is to generate data through digitalization, let value flow through networking, and create economic and social value through intelligence. The iteration of industry digitalization started with Internet companies, moved into Internet finance (FinTech), and in the past two years has entered scenarios such as smart manufacturing in traditional industry. In any enterprise, operations involve people, money, materials, applications, and the environment, and all of these produce data. We perceive data, understand its meaning, and then make predictions from it. For example, correlation analysis lets us build a "profile" of Internet users and predict their shopping habits. With increasingly cheap cloud computing services and ERP and CRM software, we can project the past into the future, continuously optimize the user experience, and support decision-making. Today, traditional industries need the Internet of Things to understand the operating status and utilization of their core assets (such as generators), and then extend computing from the cloud to the edge. The core of digital transformation is therefore to build a "data + model = service" pattern and achieve the following four transformations:
Today, digital transformation technology is maturing and the barrier to entry keeps dropping. For example, the digital twin popularized at Hannover Messe uses modeling and ICT technology to build a virtual counterpart of a physical system in the digital world. Digital technology can thus release the potential of the physical world. Of course, connecting the physical world with the cloud-based digital world brings its own problems, including: latency constraints on the order of ten milliseconds; the surge in data volume and bandwidth consumption in scenarios such as driverless cars; the data security and privacy of people and enterprises; and the unreliability of the connection between edge devices and the cloud. We therefore need to make things autonomous by distributing intelligence, and further realize cooperation between things, collaboration between things and local systems, collaboration between things and people (human-computer interaction), and interaction between things and the cloud. Throughout, comprehensive collaboration rests on shared data and shared support. However, edge computing, cloud services, and even technologies such as Docker and Kubernetes (K8s) are, in the end, distributed systems. Beyond the familiar advantages of scalability and performance, a distributed architecture brings the challenges shown in the figure above: deployment complexity, a steep learning curve, understanding the overall architecture, and various development, maintenance, and operating costs. At the same time, traditional industrial software such as ERP, MES, and CAD is heavily customized per industry (MES is particularly typical), which makes it both hard to monetize and slow to adapt to changing requirements.
Therefore, we need to transform the original hierarchical industrial service platform into a flat one, breaking its business logic into multiple modules, and solve the diverse problems of industrial sites through internal encapsulation and configuration orchestration. In practice, we use technologies such as Kubernetes to build a "highly cohesive, loosely coupled" distributed service architecture, and use microservices to provide a set of basic services: service discovery, a control bus, business orchestration, and operations and maintenance. Along with microservices, control architecture in the industrial sector has also become distributed; for example, interactive distributed control logic now runs on PLCs at industrial sites. Generally speaking, every time a phone maker switches to a new model, all processes on its production line must be adjusted accordingly, so competition at the production-line level comes down to how quickly equipment can be changed and new products launched. Companies also need to achieve flexible scheduling and process control by quickly loading and iterating system processes. In summary, today's industrial digital transformation faces the following challenges:
We need to combine traditional mechanistic models with data-driven models through AI, big data, and machine learning, so that enterprises can shift toward operational services and collaboration, changing the ecosystem of the entire industrial chain. Next, the basic concept of edge computing: in short, it is an open distributed platform that addresses real-time processing, data optimization, security, and related problems. Looking at the history of software, in the early years major software companies each shipped their own operating systems; later, Linux on x86 became mainstream; in recent years, cloud services pioneered by Amazon have encapsulated the underlying operating system and hardware, so that users consume them like water, electricity, and gas. In industry, the top of the value chain is process technology, followed by core equipment, PLCs, and core industrial software, with generic IT at the bottom. The core of the value chain is therefore the system integrator, who packages hardware and software as a whole and delivers it as a service. Some general-purpose software cannot even be tendered separately; network software, for example, is usually tendered together with the whole project. The business model in industry is clearly different. By analogy, the essence of edge computing versus cloud computing comes down to two aspects:
Based on the above, we proposed the model-driven architecture shown in the figure, which aims to let the physical and digital worlds collaborate and to build cognition of the physical world. Many manufacturers now emphasize that they are MBEs (model-based enterprises), highlighting their modeling and collaboration capabilities. Because industry runs on heterogeneous hardware and operating systems, they hope to turn domain know-how into software through model encapsulation. Just as cloud computing uses DevOps to support the whole life cycle from development to deployment, the operational side of edge computing also needs a tool chain and services for business orchestration. Of course, we should evolve the technology rather than start over: server hardware sized for information systems (large memory) and cloud AI models (power-hungry GPUs) cannot be quickly transplanted to industrial environments with only a few hundred MB of storage and memory and low-power edge chips. This is where unique innovation opportunities lie. So how does Huawei keep making progress in edge computing? As shown in the figure above, this is edge-cloud collaboration. At the top of the digital world is a centralized cloud service, comprising public and private clouds, generally built in an IDC. On top of the cloud's PaaS and IaaS, general-purpose AI and big data services for the edge, as well as Enterprise Intelligence services above them, can be provided. On the edge side, we have the embedded LiteOS (a lightweight OS), IoT gateways, servers, and third-party nodes, on top of which we provide an edge cloud based on an optimized Kubernetes.
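The edge-cloud collaboration described above can be illustrated with a minimal sketch. This is not Huawei's actual scheduler; it only shows the general idea that latency-critical requests are served by a local edge model while the rest are offloaded to a richer cloud service. All names (`edge_infer`, `cloud_infer`, `route`, the 10 ms budget) are hypothetical, chosen to echo the ten-millisecond latency constraints mentioned earlier.

```python
# Illustrative sketch of edge-cloud request routing (hypothetical names).
LATENCY_BUDGET_MS = 10  # the "ten-millisecond" class of constraints


def edge_infer(sample):
    """Cheap local model on the gateway: fast, no network round-trip."""
    return {"label": "ok", "served_by": "edge"}


def cloud_infer(sample):
    """Full model in the cloud: richer, but adds a network round-trip."""
    return {"label": "ok", "served_by": "cloud"}


def route(sample, deadline_ms):
    # If the caller's deadline is tighter than a cloud round-trip allows,
    # stay on the edge; otherwise prefer the richer cloud service.
    if deadline_ms <= LATENCY_BUDGET_MS:
        return edge_infer(sample)
    return cloud_infer(sample)


print(route({"sensor": 1}, deadline_ms=5))    # served_by: edge
print(route({"sensor": 1}, deadline_ms=200))  # served_by: cloud
```

A real system would of course measure actual round-trip times and model accuracy rather than using a fixed constant; the point is only that the same service interface can be satisfied from either side of the edge-cloud boundary.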
In many industrial scenarios, resources are extremely limited, so we cannot copy Kubernetes as-is; it has to be optimized appropriately. Above it sit basic services such as streaming data analysis and data management, and users can push their own applications and trained models into the microservice architecture. Besides being fully distributed, the architecture provides consistent service interfaces, so the edge side serves immediate business needs while remaining compatible with the long-term distributed architecture in the cloud. The picture above shows the Huawei EI big data portal. Besides default services, we provide industry-oriented solutions for water utilities, manufacturing, transportation, finance, retail, and other sectors. The architecture covers deep learning, pre-built models, cloud training, edge integration, and edge deployment, and provides tool chains for typical application scenarios (such as video) to better support application development. For machine learning, the platform provides a complete tool chain for core construction, modeling, and publishing to the model library. As mentioned earlier, industrial projects generally have a very long lifespan: one of our projects to reduce the energy consumption of air compressors lasted more than half a year. The following key factors are involved:
Because resources are limited in many scenarios, we need a lightweight architecture on the edge side. As shown in the figure above, we provide EdgeCore, a serverless architecture for the edge. We abstract components such as gateways into edge computing nodes and use protocols to form local logical groups, achieving device unification, interaction, mutual assistance, and decentralization. Edge scenarios differ across industries. For example, the pumps of a reservoir unit need some internal coordination logic so that the hosts can still interact with each other when the cloud connection fails. As shown in the figure above, SmartMesh provides a service bus: we abstract each node and define logic on top of it. In other scenarios, we only need to modify the upper-level logic while keeping the lower layers unchanged to achieve adaptation easily. Of course, we have run into many problems developing IoT edge applications. For example, when Huawei delivers a gateway, configuring network interfaces and switching and testing scenarios is not only time-consuming but can also hit environment-specific problems. Today we provide an integrated development environment in the cloud that can simulate gateway hardware, device libraries, OS libraries, and even the gateway's memory resources, so users can build and load an operating environment by drag and drop. In the past, we often had to manually inspect safety conditions such as badges and warning signs; now, with machine learning, big data, and deep learning, we build a model library and a compliance library, collect photos on site, analyze the data in the cloud, and get the corresponding safety report. Likewise, manually inspecting each chip circuit board used to take about 5 minutes.
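The "local logical group" idea described above, where reservoir pumps keep coordinating even when the uplink to the cloud is down, can be sketched as follows. The class and method names are illustrative only, not the actual SmartMesh API; the fallback policy (splitting load evenly among peers) is a placeholder for whatever local logic a real deployment would define.

```python
# Hypothetical sketch of a local logical group with a cloud-failure fallback.
class PumpNode:
    def __init__(self, name):
        self.name = name
        self.peers = []   # other nodes in the same local logical group
        self.load = 0.0   # this node's current share of the pumping load

    def join_group(self, group):
        # All members share one group list, so each sees every peer.
        group.append(self)
        self.peers = group

    def rebalance(self, cloud_online):
        """Spread load across peers; works with or without the cloud."""
        if cloud_online:
            # Normally the cloud would push an optimized schedule here.
            pass
        # Local fallback: split load evenly among reachable peers,
        # so the group stays operational when the uplink fails.
        share = 1.0 / len(self.peers)
        for node in self.peers:
            node.load = share


group = []
a, b, c = PumpNode("p1"), PumpNode("p2"), PumpNode("p3")
for n in (a, b, c):
    n.join_group(group)

a.rebalance(cloud_online=False)            # cloud link has failed
print([round(n.load, 2) for n in group])   # each pump takes 1/3 of the load
```

The design point is that the group-internal logic lives below the cloud interface, so the upper-level orchestration can change (or disappear entirely) without breaking local coordination.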
Now, cameras using machine learning and machine recognition have greatly improved accuracy and efficiency. Similarly, based on machine learning, we controlled and optimized various power-related parameters of the air compressor, reducing its energy consumption by 2%-4%. In the 3C field, replacing manual recognition reduced personnel workload by 48%. Looking ahead, we need to turn vertical demands into horizontal ones and ultimately promote industry collaboration and development through unified terminology and architecture. Now to the practice and thinking behind Edge Computing Reference Architecture 2.0. The figure above shows the edge computing reference architecture we proposed. At the top is the intelligent service, spanning the cloud and edge layers; its essence is a full-process service for development and deployment. At the bottom are physical devices such as edge sensors, edge gateways, and edge servers, which digitize the collected information. The IT personnel responsible for the upper part of the architecture and the OT personnel responsible for the lower part must interact to realize the business and map it layer by layer onto specific industries, so everyone needs a unified language to describe requirements modularly. When the digital world above interacts with the physical world below, an intermediate layer is needed to define business rules, map between the two, and shield each layer from the others: the business layer should not need to know much about operations or physical resources. We therefore abstracted the entire edge side into an edge cloud, interacting with the upper layers through interfaces, and decoupled layer by layer down to the physical layer that industrial personnel care about.
In the past, switching costs were so high that everyone feared being held hostage by a particular "camp" and was very cautious about cooperating. Now, through this framework, participants can collaborate and integrate: different users connect each level within the ecosystem and ultimately deliver solutions that cover the whole data life cycle and full-process services. Let's look at some key points of the architecture:
Since the edge cloud is a distributed scheduling system, it can be driven by business policies. The framework supports the whole application from the development layer to the deployment layer and achieves rapid service deployment through business orchestration. It thus defines the core services and overall scheduling of data processing, while users only implement the business logic within it, dividing the work between platform and user. From a deployment perspective, there are decentralized cases, such as Mobike, where a small module is embedded in each bike and the base station acts as a gateway, and centralized ones, such as distributed power grids, which have many computing nodes forming an edge cloud on the edge side. Their traffic models differ slightly: the decentralized type mainly uses a north-south traffic model, with all data interacting directly with the cloud; in the centralized scenario, because local equipment wants more autonomy, most traffic flows east-west, and only a small amount is cleaned and aggregated over time before being uploaded. I would like to end this speech with "think big and start small". Thank you! [51CTO original article; when reprinting on partner sites, please indicate the original author and source as 51CTO.com]