The growing number of “connected” devices is generating an enormous amount of data, and this will only continue as Internet of Things (IoT) technologies and use cases multiply in the coming years. According to research firm Gartner, there will be as many as 20 billion connected devices by 2020, generating billions of bytes of data per user. These devices are not just smartphones or laptops, but also connected cars, vending machines, smart wearables, surgical robots, and more.
The large volume of data generated by these many types of devices must be pushed to a centralized cloud for retention (data management), analysis, and decision-making, and the analyzed results are then transmitted back to the device. This round trip consumes substantial network and cloud infrastructure resources, adding latency and bandwidth pressure that can undermine mission-critical IoT use cases. For example, a self-driving connected car generates a large amount of data every hour; if that data must be uploaded to the cloud, analyzed, and instructions sent back to the car, network latency or resource congestion may delay the response, which in serious cases could cause a traffic accident.

IoT Edge Computing

This is where edge computing comes in. Edge computing optimizes a cloud computing system by performing data processing and analysis at the edge of the network, closer to the data source. With this approach, data can be collected and processed near the device itself rather than being sent to a distant cloud or data center, the chief benefits being reduced latency and lower bandwidth consumption.
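To make the idea concrete, here is a minimal, illustrative sketch (all names are hypothetical, not from any specific edge platform) of what "processing near the data source" can mean in practice: an edge node aggregates a window of raw sensor readings locally and forwards only a compact summary to the cloud, instead of shipping every reading.

```python
from statistics import mean

def summarize_window(readings, alert_threshold=100.0):
    """Reduce a window of raw sensor readings to a small summary dict.

    Runs on the edge node: only this summary (not the raw readings)
    is sent upstream, cutting bandwidth and allowing local alerting.
    """
    peak = max(readings)
    return {
        "count": len(readings),
        "mean": mean(readings),
        "max": peak,
        "alert": peak > alert_threshold,  # anomaly flagged locally, no round trip
    }

# A window of raw readings shrinks to one small message for the cloud.
window = [98.2, 99.1, 97.8, 101.5, 98.9]
print(summarize_window(window))
```

The design choice here is the essence of edge computing: the time-sensitive decision (the `alert` flag) is made on the device, while the cloud still receives enough summarized data for long-term analytics.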
The advent of edge computing does not eliminate the need for traditional data centers or cloud computing infrastructure. Instead, it coexists with the cloud, distributing the cloud's computing power out to endpoints.

Machine Learning at the Network Edge

Machine learning (ML) is a complementary technology to edge computing. In machine learning, generated data is fed to an ML system to produce an analytical decision model. In IoT and edge computing scenarios, machine learning can be implemented in two ways.
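One common pattern for ML at the edge, sketched below with hypothetical names and a deliberately tiny linear model, is to train in the cloud and push only the resulting model parameters to the device, which then makes decisions locally with no network round trip. This is an illustrative assumption about the division of labor, not a description of any specific product.

```python
# Parameters as they might arrive from a cloud-side training job.
MODEL = {"weights": [0.8, 0.2], "bias": -50.0}

def predict(features, model=MODEL):
    """Linear score computed entirely on the edge device."""
    return model["bias"] + sum(w * x for w, x in zip(model["weights"], features))

def should_brake(features, threshold=0.0):
    """Local decision (e.g., emergency braking) made without contacting the cloud."""
    return predict(features) > threshold

# 0.8*70 + 0.2*10 - 50 = 8.0 > 0, so the device decides to act immediately.
print(should_brake([70.0, 10.0]))
```

Because inference runs on the device, the latency-critical path (sensor reading to decision) never touches the network; the cloud's role is reduced to periodically retraining and redistributing `MODEL`.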
Edge Computing and the Internet of Things

Edge computing, together with machine learning, lays the foundation for agile future communications in IoT. The upcoming 5G telecommunication networks will provide a more advanced substrate for IoT use cases: in addition to high-speed, low-latency data transmission, 5G will support mobile edge computing (MEC), enabling automatic provisioning and deployment of edge services and resources. In this shift, IoT device manufacturers and software developers will be increasingly eager to take advantage of edge computing and analytics, and we will see more intelligent IoT use cases and a growing number of intelligent edge devices.

Original link: http://www.futuriom.com/articles/news/what-is-edge-computing-for-iot/2018/08