As 2020 begins, a new wave of hype around edge computing and 5G is inevitable. Now is an ideal time to consolidate and update our understanding of these two concepts and explore how they complement and enhance each other. In fields such as financial payments, online ordering, fraud detection, and machine learning, these two technologies will help you stay competitive in the years ahead.
Edge computing means processing information closer to the devices that create it, rather than shuttling it back and forth to a cloud platform. The arrival of 5G paves the way for applications that were impractical before edge computing. In augmented and virtual reality, for example, 5G's ultra-low latency keeps what you see in sync with what you do; self-driving cars, likewise, must process large amounts of data and make decisions in an instant.

IDC predicts that by 2025 there will be 150 billion connected devices worldwide (including RFID), most of which will output data in real time. In 2017, real-time data accounted for only 15% of all information created, captured, or copied; by 2025, that share is expected to reach 30%. As a percentage, the shift may not look earth-shaking, but in raw data volume it is an order-of-magnitude increase (from roughly 5 ZB to roughly 50 ZB). Edge computing builds on this flood of real-time data, enabling intelligent analysis at the source while minimizing bandwidth overhead.

New definition

Even in 2020, the definition of "edge" remains a matter of opinion. From NIST (the National Institute of Standards and Technology) to the IEEE, models are still evolving. The edge can be a Raspberry Pi selectively sending sensor readings to the cloud, or a node processing data on Google's online streaming platform. Despite the big gap between these two models, both bring computing resources closer to the user. A relatively neutral edge computing report offers a clearer definition that parts of the industry have agreed on:
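A quick sanity check on the figures above: the article gives real-time data as 15% of all data in 2017 (~5 ZB) and 30% by 2025 (~50 ZB). Back-solving from those numbers (an inference from the article's own figures, not an IDC quote) shows both the implied totals and the tenfold growth in real-time data itself:

```python
# Back-solving the totals implied by the article's real-time data figures.
# The ZB volumes and percentage shares come from the text above; the
# implied totals are derived, not quoted from IDC.

def implied_total_zb(realtime_zb: float, realtime_share: float) -> float:
    """Total data volume implied by a real-time slice and its share."""
    return realtime_zb / realtime_share

total_2017 = implied_total_zb(5, 0.15)   # implied total for 2017, in ZB
total_2025 = implied_total_zb(50, 0.30)  # implied total for 2025, in ZB

print(round(total_2017, 1))  # ≈ 33.3 ZB overall in 2017
print(round(total_2025, 1))  # ≈ 166.7 ZB implied for 2025
print(50 / 5)                # real-time data itself grows 10x: an order of magnitude
```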
In layman's terms, the device edge includes terminals such as phones, drones, AR headsets, IoT sensors, and connected cars; gateway devices such as switches and routers; and local servers. All of these sit downstream of the network's last mile. The infrastructure edge sits upstream, providing computing resources through network access points and data centers.

Above: the upstream/downstream chain from the device edge to the infrastructure edge. Image credit: Intel

On a large drilling rig, for example, a Raspberry Pi sits at the device edge. Rather than consuming bandwidth by streaming environmental data in real time, it processes that data locally, reporting upstream only in an emergency. Conversely, when a local data center needs to stream 4K video at 60 frames per second, the device edge still offers a clear low-latency advantage, but the heavier processing is better handled by more powerful hardware upstream. Beyond this distributed infrastructure near the core network, the cloud remains centralized and scalable; cloud computing, however, comes with much higher (and much less consistent) latency.

Advantages of low-latency edge computing

It is easy to see low latency as the killer feature of edge computing, especially given the physical limits of cloud computing. Data cannot travel faster than the speed of light, so a request to a server hundreds or thousands of miles away necessarily takes tens or hundreds of milliseconds to complete. The difference is imperceptible when you are scrolling a web page, but for a surgeon operating remotely or a gamer in virtual reality, such delays are unacceptable. Edge computing keeps latency both low and consistent.
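The speed-of-light argument above can be made concrete with a back-of-the-envelope calculation. The sketch below assumes signals propagate through fiber at roughly two-thirds the speed of light (a common rule of thumb; real routes add switching and queuing delay on top of this floor):

```python
# Physical lower bound on round-trip latency, propagation delay only.
# Assumes ~2/3 of the vacuum speed of light in fiber (rule of thumb).

SPEED_OF_LIGHT_KM_S = 300_000  # vacuum, approximate
FIBER_FACTOR = 2 / 3           # typical propagation speed in fiber

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds over the given distance."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# A cloud region ~2,000 km away vs. an edge node ~50 km away:
print(round(round_trip_ms(2000), 1))  # 20.0 ms floor, before any processing
print(round(round_trip_ms(50), 2))    # 0.5 ms floor
```

Even before a server does any work, distance alone puts a cloud round trip tens of milliseconds behind a nearby edge node, which is why physics favors the edge for latency-critical applications.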
Above: changing network bandwidth and latency requirements for future devices and applications. Image source: State of the Edge 2020

Edge computing also avoids shuttling data back and forth between connected devices and the cloud. If you can assess the value of data closer to where it is created, you can optimize how it flows: reserving cloud traffic for the data that actually belongs there (and for applications that are not latency-sensitive) reduces bandwidth and storage costs.

Edge computing also improves reliability. In harsh environments, data transmission between the device edge and a centralized cloud can fail in many ways; on offshore platforms, in refineries, or on solar farms, the device edge and infrastructure edge can instead operate semi-autonomously without a cloud connection.

Distributed architectures are even a boon for security. Moving less information to the cloud means less information to intercept, and analyzing data at the edge spreads risk geographically. Endpoint data is inherently hard to protect, so firewalls at the edge help limit the scope of an attack. Keeping data local can also be useful for compliance, and edge infrastructure offers flexible access controls based on geography or permissions.

5G and edge computing promote each other

Edge computing is not new: as early as 2000, content delivery networks were described as edge networks. But it is widely believed that as 5G coverage expands, edge computing will help meet the high-bandwidth, low-latency needs of modern applications using local (rather than regional) computing resources. 5G brings computing resources closer to where data is generated, improving the speed, reliability, and flexibility of enterprise applications. More information will move efficiently within 5G networks without round trips to a central cloud. As a result, we will see application cases that did not exist before.
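The "assess the value of data where it is created" pattern above, like the drilling rig example earlier, can be sketched in a few lines. This is a minimal illustration, not a real deployment: the readings, threshold, and summary format are hypothetical, and a production system would publish via an IoT protocol such as MQTT.

```python
# Minimal sketch of edge-side filtering: process raw sensor readings
# locally and send upstream only a small summary plus an alert flag.
# Threshold and field names are illustrative assumptions.

from statistics import mean

TEMP_ALERT_C = 90.0  # assumed emergency threshold

def process_batch(readings: list[float]) -> dict:
    """Summarize a batch of readings at the edge.

    The raw stream stays on the device; only this compact summary
    (and any alert) needs to consume upstream bandwidth.
    """
    return {
        "count": len(readings),
        "avg": round(mean(readings), 2),
        "max": max(readings),
        "alert": any(r > TEMP_ALERT_C for r in readings),
    }

batch = [71.2, 70.8, 72.5, 95.3, 71.0]  # raw data, kept local
summary = process_batch(batch)          # only this goes to the cloud
print(summary["alert"])                 # True: one reading crossed the threshold
```

Instead of five raw readings (or thousands, at realistic sampling rates), only a handful of aggregate fields travel upstream, which is exactly where the bandwidth and storage savings come from.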
According to the State of the Edge 2020 report, the greatest demand for edge computing comes from communications network operators, who are continually updating their infrastructure and rolling out 5G networks. Mobile consumer services running on these networks will rely on edge computing to support applications such as online gaming platforms, augmented/virtual reality, and artificial intelligence. Smart homes, smart grids, and smart cities all tend to start on device-edge platforms; as these use cases evolve and grow more complex, however, the need for infrastructure-edge capabilities will follow.

5G's provisions for ultra-reliable low-latency communications (URLLC) and massive machine-type communications (mMTC) mean that devices and edge computing can be brought closer together, making their short-hop connections more efficient. As shown in the figure: networked traffic lights connect to edge gateways, where their data can be collected and analyzed; as part of the edge network, they can feed mapping tools and reroute traffic around congestion.

Self-driving cars are also worth mentioning as a prime example of edge computing enhanced by 5G. The latest cars already use device-edge computing for collision avoidance, lane keeping, and adaptive cruise control. But as assisted and autonomous driving functions become more complex, cars will need to draw on edge resources from the surrounding infrastructure: adjusting a route to the traffic conditions ahead, coordinating with other autonomous vehicles at intersections, or making split-second decisions to avoid unsafe situations.

Edge computing still needs to grow

Edge computing is listed as one of Gartner's top 10 strategic technology trends for 2020, and several other concepts on the list are also rooted in edge computing.
Hyperautomation, which focuses on applying technologies such as artificial intelligence and machine learning, will rely on low latency and a continuously reliable communication foundation. Multiexperience, another major trend, depends on multidimensional sensors and multifunctional interfaces with high bandwidth and real-time processing. And smart things, of course, tie together AI, 5G, and edge computing.

As shown in the figure: global annual capital expenditure on edge technology and data centers is expected to reach US$146 billion in 2028, a compound annual growth rate of 35%. Image source: State of the Edge 2020

Enabling these new applications will require massive investment: Tolaga Research predicts cumulative capital expenditure on edge IT and data center infrastructure of $700 billion between now and 2028. As computing resources spread from centralized clouds to the distributed edge, new applications and opportunities abound, especially where edge infrastructure is mature. Understanding the impact of edge computing and 5G will help you deliver a more seamless experience to your customers, gain insight into new markets, and make responsive decisions quickly.

Original link: https://venturebeat.com/2019/12/20/get-2020-vision-about-edge-computing-and-5g/