The growing number of "connected" devices is generating an enormous amount of data, and this trend will continue as Internet of Things (IoT) technologies and use cases grow in the coming years. According to research firm Gartner, there will be as many as 20 billion connected devices by 2020, generating billions of bytes of data per user. These devices are not just smartphones and laptops, but also connected cars, vending machines, smart wearables, surgical robots, and more.
The large amount of data generated by countless such devices must be pushed to a centralized cloud for retention (data management), analysis, and decision-making, after which the results are transmitted back to the device. This round trip consumes substantial network and cloud infrastructure resources, adding latency and bandwidth pressure that can affect mission-critical IoT use cases. For example, a self-driving connected car generates a large amount of data every hour; that data must be uploaded to the cloud, analyzed, and instructions sent back to the car. High latency or resource congestion may delay the response to the car, which in serious cases could cause traffic accidents.

IoT Edge Computing

This is where edge computing comes in. Edge computing architecture optimizes cloud computing systems by performing data processing and analysis at the edge of the network, closer to the data source. With this approach, data can be collected and processed near the device itself rather than being sent to the cloud or a data center. Benefits of edge computing:

- Lower latency, since decisions are made close to where the data is generated
- Reduced bandwidth consumption and cloud infrastructure load, because raw data need not make the round trip to a central cloud
- Faster, more reliable responses for mission-critical IoT use cases
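The idea of processing data near the device can be sketched in a few lines. The following is a minimal illustration (not from the original article; the function name and data are hypothetical): an edge node aggregates raw sensor readings locally and forwards only a compact summary upstream, instead of streaming every sample to the cloud.

```python
from statistics import mean

def summarize_readings(readings, threshold):
    """Aggregate raw sensor readings at the edge so that only a
    compact summary (not every individual sample) is sent upstream."""
    return {
        "count": len(readings),                       # samples seen locally
        "mean": mean(readings),                       # average value
        "max": max(readings),                         # peak value
        "alerts": [r for r in readings if r > threshold],  # anomalies worth reporting
    }

# A device sampling temperature once per second would otherwise push
# 3600 raw values per hour; the edge node forwards one summary instead.
hourly = [21.0, 21.4, 22.1, 35.2, 21.8]
print(summarize_readings(hourly, threshold=30.0))
```

The bandwidth saving comes from the ratio of raw samples to summaries: the cloud still receives enough information to track trends and anomalies, without carrying every reading.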
The advent of edge computing does not replace the need for traditional data centers or cloud computing infrastructure. Instead, it coexists with the cloud: the computing power of the cloud is distributed out to the endpoints.

Machine Learning at the Network Edge

Machine learning (ML) is a complementary technology to edge computing. In machine learning, the generated data is fed to the ML system to produce an analytical decision model. In IoT and edge computing scenarios, machine learning can be implemented in two ways.
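One common pattern for ML at the edge is to train a model in the cloud and ship only its parameters to the device, which then makes decisions locally with no network round trip. A minimal sketch of that pattern follows; the coefficients and feature values are hypothetical stand-ins for a real cloud-trained model.

```python
# Hypothetical parameters exported from cloud-side training and
# deployed to the edge device as plain numbers.
WEIGHTS = [0.8, -0.3]
BIAS = 0.1

def predict(features):
    """Edge-side inference: compute a linear score locally and turn it
    into a binary decision, without contacting the cloud."""
    score = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return score, score > 0.5

# The device scores fresh sensor features the moment they arrive.
score, act = predict([1.2, 0.4])
print(round(score, 3), act)
```

Because only the decision (or an occasional summary) needs to travel upstream, this keeps latency low for time-critical actions while the cloud remains responsible for retraining and redistributing the model.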
Edge Computing and the Internet of Things

Edge computing, together with machine learning, lays the foundation for agile future communications in IoT. The upcoming 5G telecommunication networks will provide a more advanced network for IoT use cases. In addition to high-speed, low-latency data transmission, 5G will also provide a telecommunication network based on mobile edge computing (MEC), enabling automatic implementation and deployment of edge services and resources. In this revolution, IoT device manufacturers and software application developers will be increasingly eager to take advantage of edge computing and analytics, and we will see more intelligent IoT use cases and a rise in intelligent edge devices.

Original link: http://www.futuriom.com/articles/news/what-is-edge-computing-for-iot/2018/08