Edge computing is changing the way millions of devices around the world handle, process, and deliver data. The explosive growth of internet-connected devices (the IoT), along with new applications that require real-time computing power, continues to drive the development of edge computing systems. Faster networking technologies such as 5G wireless are enabling edge computing systems to accelerate the creation and support of real-time applications such as video processing and analytics, autonomous vehicles, artificial intelligence, and robotics. While the early goal of edge computing was to reduce the cost of transmitting IoT-generated data over long distances, the rise of real-time applications that require processing at the edge will continue to drive the technology's development.
What is edge computing?

Gartner defines edge computing as “part of a distributed computing topology in which information processing is located near the edge — where things and people generate or consume information.” Fundamentally, edge computing brings computation and data storage closer to the devices where the data is generated, rather than relying on a central location that may be thousands of miles away. This is done so that data, especially real-time data, does not suffer from latency issues that can hurt application performance. Companies can also save money by doing the processing locally, reducing the amount of data that has to be sent to a centralized or cloud-based location.

Edge computing grew out of the exponential growth of IoT devices, which connect to the internet to receive information from the cloud or send data back to it. Many IoT devices generate enormous amounts of data in the course of their operation. Think of devices that monitor production equipment on a factory floor, or an internet-connected camera that streams live footage from a remote office. A single device producing data can transmit it across a network easily enough, but problems arise as the number of devices transmitting simultaneously grows. Instead of one camera streaming live footage, multiply that by hundreds or thousands of devices: quality suffers because of latency, and bandwidth costs can become very high.

Edge computing hardware and services help solve this problem by serving as a local source of processing and storage for many of these systems. An edge gateway, for example, can process data from an edge device and send only the relevant data back through the cloud, reducing bandwidth needs; or, when a real-time application requires it, it can send data back to the edge device. Edge devices can include many different things: IoT sensors, an employee's laptop, the latest smartphone, security cameras, even the internet-connected microwave in the office break room. The edge gateway itself is considered an edge device within the edge computing infrastructure.
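To make the gateway's filter-and-forward role concrete, here is a minimal sketch of that pattern. It is illustrative only: the simulated sensor, the temperature threshold, and the stubbed-out cloud upload are assumptions made for this example, not part of any particular edge product or API.

```python
import json
import random
import time

CLOUD_ENDPOINT = "https://example.com/ingest"  # hypothetical endpoint, for illustration only
TEMP_THRESHOLD = 75.0                          # assumed alert threshold
BATCH_WINDOW = 10                              # readings summarized per upload


def read_sensor() -> float:
    """Stand-in for a real sensor driver: returns a simulated temperature."""
    return random.uniform(60.0, 90.0)


def send_to_cloud(payload: dict) -> None:
    """Stub for the cloud upload; a real gateway would POST this over HTTPS."""
    print("uploading to", CLOUD_ENDPOINT, "->", json.dumps(payload))


def gateway_loop() -> None:
    readings = []
    while len(readings) < BATCH_WINDOW:
        readings.append(read_sensor())
        time.sleep(0.1)  # simulated sampling interval

    # Keep the raw data local; forward only a compact summary plus any alerts.
    alerts = [r for r in readings if r > TEMP_THRESHOLD]
    summary = {
        "count": len(readings),
        "avg": round(sum(readings) / len(readings), 2),
        "max": round(max(readings), 2),
        "alerts": [round(a, 2) for a in alerts],
    }
    send_to_cloud(summary)


if __name__ == "__main__":
    gateway_loop()
```

Only the small summary leaves the site; the raw readings stay on the gateway, which is where the bandwidth savings described above come from.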
Why is edge computing important?

For many companies, cost savings alone can be a reason to deploy an edge computing architecture. Companies that embraced the cloud for many of their applications may have discovered that bandwidth costs were higher than expected. Increasingly, though, the biggest benefit of edge computing is the ability to process and store data faster, enabling the real-time applications that are critical to companies.

Before edge computing, a smartphone scanning a person's face for facial recognition would need to run the recognition algorithm through a cloud-based service, which takes significant time to process. With an edge computing model, and given the increased capabilities of smartphones, the algorithm can run locally on an edge server or gateway, or even on the smartphone itself. Applications such as virtual and augmented reality, self-driving cars, smart cities, and even building automation systems all require this kind of fast processing and response.

“Edge computing has evolved significantly from the days of isolated IT at ROBO (remote office/branch office) locations,” said Kuba Stolarski, research director at IDC, in the Worldwide Edge Infrastructure (Compute and Storage) Forecast, 2019-2023 report. “With enhanced interconnectivity, improved edge access to more core applications, and new IoT and industry-specific business use cases, edge infrastructure is expected to become one of the key growth engines for the server and storage markets over the next decade and beyond.”

Privacy and security

Like many new technologies, however, solving one problem can create others. Data at the edge can be troublesome from a security standpoint, especially when it is handled by devices that are not as well secured as centralized or cloud-based systems. As the number of IoT devices grows, IT teams must understand the potential security issues these devices introduce and ensure the systems can be protected. That includes encrypting data, using the correct access-control methods, and, where appropriate, VPN tunnels.

In addition, devices differ in their requirements for processing power, electricity, and network connectivity, which can affect the reliability of edge devices. That makes redundancy and failover management critical for devices that process data at the edge, so that data is still delivered and processed correctly when a single node goes down.
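As a rough illustration of that failover requirement, the sketch below shows a client that tries a primary edge node and falls back to a secondary one. The node names, the health check, and the processing stub are all hypothetical placeholders; a real deployment would more likely rely on a load balancer, message broker cluster, or orchestration layer.

```python
import random
import time

# Hypothetical edge nodes; the names and health-check logic are illustrative assumptions.
EDGE_NODES = ["edge-node-a.local", "edge-node-b.local"]
RETRIES_PER_NODE = 2


def node_is_healthy(node: str) -> bool:
    """Stand-in for a real health check (for example an HTTP ping); randomly fails here."""
    return random.random() > 0.3


def process_on_node(node: str, payload: dict) -> str:
    """Stand-in for handing the payload to an edge node for processing."""
    return f"processed {payload} on {node}"


def process_with_failover(payload: dict) -> str:
    """Try each node in order, retrying briefly, so a single node outage is tolerated."""
    for node in EDGE_NODES:
        for _attempt in range(RETRIES_PER_NODE):
            if node_is_healthy(node):
                return process_on_node(node, payload)
            time.sleep(0.1)  # brief back-off before retrying the same node
    raise RuntimeError("no healthy edge node available; queue locally and retry later")


if __name__ == "__main__":
    print(process_with_failover({"sensor": "cam-17", "event": "motion"}))
```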
What about 5G?

Operators around the world are deploying 5G wireless technology, which promises to bring high bandwidth and low latency to applications, dramatically expanding the data bandwidth companies can work with. Beyond offering faster speeds and letting companies continue to process data in the cloud, many operators are also building edge computing strategies into their 5G deployments to deliver faster real-time processing, especially for mobile devices, connected cars, and autonomous vehicles.

In “5G, IoT, and Edge Computing Trends,” Futuriom writes that 5G will be a catalyst for edge computing technology. “Applications using 5G technology will change traffic demand patterns, providing a large impetus for edge computing on mobile cellular networks,” the company wrote. It cites low-latency applications including IoT analytics, machine learning, virtual reality, and autonomous vehicles, which “have new bandwidth and latency characteristics that require support from edge computing infrastructure.”

In its predictions for 2020, Forrester likewise pointed to on-demand compute and real-time application engagement as forces driving the growth of edge computing in 2020.

It is clear that while the initial goal of edge computing was to reduce the bandwidth costs of far-flung IoT devices, the growth of real-time applications that require local processing and storage will drive the technology's development in the coming years.