There are still many unanswered questions about 5G, but it is clear that it will leave a lasting mark on the world. With transmission speeds of up to 10 gigabits per second, it is roughly 100 times faster than 4G networks, and it also brings greater bandwidth and latency of less than 1 millisecond. The impact of these improvements on mobile Internet, driverless cars, drones, smart homes, smart cities, smart grids, and many other technologies is impressive. But what about data centers, which have been around for years? How will these critical facilities change once 5G technology is widely adopted? First, we need to understand the limitations of current infrastructure and what they mean for the rollout of 5G.

What's holding back the rollout of 5G?

Fault tolerance is one reason 5G seems to be at a standstill at the moment. Systems such as smart P2P grids, telemedicine, and remote robotic surgery have extremely strict reliability requirements. Surgeons have already performed operations from 30 miles away using robotic arms over a 5G network. Needless to say, such systems cannot be allowed to fail during use; the same is true of the communication links between autonomous vehicles and smart-city infrastructure.
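To make the headline numbers above concrete, here is a rough back-of-envelope sketch comparing the text's figures: a peak 5G rate of 10 Gbit/s versus an assumed typical 4G peak of around 100 Mbit/s (both are theoretical peaks; real-world throughput is far lower, and the 100 Mbit/s figure is an illustrative assumption consistent with the "100 times faster" claim).

```python
# Ideal transfer time at a given link rate, ignoring protocol
# overhead, congestion, and signal conditions.

def transfer_seconds(size_gigabytes: float, rate_gbps: float) -> float:
    """Ideal time to move size_gigabytes at rate_gbps (gigabits per second)."""
    return size_gigabytes * 8 / rate_gbps  # 8 bits per byte

movie_gb = 4.0  # e.g. a feature-length HD movie (assumed size)
print(f"4G (~0.1 Gbit/s): {transfer_seconds(movie_gb, 0.1):.0f} s")   # 320 s
print(f"5G (10 Gbit/s):   {transfer_seconds(movie_gb, 10):.1f} s")    # 3.2 s
```

The hundredfold rate difference translates directly into a hundredfold difference in ideal transfer time, which is where the "100 times faster" comparison comes from.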
4G technology is relatively "primitive" in principle: a device connects to one piece of infrastructure at a time, such as a cell tower, then hands off to the next tower, and so on. With 5G, devices must communicate with multiple base stations and other infrastructure simultaneously if they are to approach zero tolerance for faults, a model also called "spherical coverage".

Many of the splashy 5G demonstrations to date have involved connections between handheld or IoT devices and local routers or base stations. But much of the rest of the internet's backend, our servers and data centers, is not yet fast enough, or low-latency enough, to handle 5G connections. Simply put, the core of the problem is that data processing and server facilities need to move closer to the edge of the network.

How data centers must change to accommodate 5G

4G networks are slow enough that most people don't notice the delays caused by data packets traveling hundreds or thousands of miles. To solve this problem in the 5G era, data centers around the world will need to be far more geographically dispersed than they are today; only then can the high speed and low latency promised by 5G actually be realized. There is no simple shortcut.

"Micro data centers", also called "containerized" data centers, are one way forward. To picture what this might look like in practice, imagine more cell towers than exist today, with a micro data center attached to each site. Building out this infrastructure will get us part of the way there, allowing 5G-powered IoT devices to operate across a fairly large geographic area without noticeable latency. But what about larger, industrial-scale data processing tasks? That's a slightly different story.
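The claim that data centers must sit near the edge follows directly from the speed of light. A minimal sketch, assuming signals travel through optical fiber at roughly two-thirds of the vacuum speed of light (about 200,000 km/s) and ignoring queuing, switching, and processing delays, so real latencies are strictly worse:

```python
# Minimum round-trip propagation delay to a server at a given distance.
# Even with a perfect network, a server ~100 km away already consumes
# the entire 1 ms latency budget that 5G applications target.

SPEED_IN_FIBER_KM_PER_S = 200_000  # approx. 2/3 of c in optical fiber

def round_trip_ms(distance_km: float) -> float:
    """Lower bound on round-trip time due to propagation alone."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1000

for km in (10, 100, 1000, 3000):
    print(f"{km:>5} km -> {round_trip_ms(km):6.2f} ms round trip")
```

At 10 km the round trip costs 0.1 ms; at 100 km it is already 1 ms; at 1,000 km (well under the "hundreds or thousands of miles" the text mentions) it is 10 ms, an order of magnitude over budget. No amount of radio-side improvement fixes this, which is why the processing itself has to move closer to the user.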
Businesses that rely on large-scale data transfer and processing can build new data centers with relative ease, but smaller companies may be left behind, or may turn to managed services to carry their traffic while new infrastructure is built out. For companies large and small that depend on accumulating, analyzing, and distributing data, the goal is the same: move processing equipment closer to where the data is generated, near the end user. Under the existing computing model, services and devices send data to the cloud, then on to "core" data processing infrastructure, and back again. That round trip is not fast enough for 5G, nor for the capabilities and emerging technologies it is meant to unlock.

What does 5G really mean?

"General-purpose technology" (GPT) refers to a technology capable of impacting, transforming, and improving an entire economy. Once we accept what 5G actually requires and build shared and proprietary infrastructure accordingly, it will join the ranks of GPTs such as the steam engine, interchangeable parts, the automobile, and the Internet itself. Despite the obstacles to its development, we are seeing a sea change in the way people interact with data services. By 2025, humanity is projected to have 75 billion connected devices, and with 5G these devices will be able to transmit more wireless data, faster, than at any time in history. But getting there will require groundwork from both the public and private sectors, which means rethinking the size and placement of data centers, as well as developing new business models for sharing data transmission and processing capacity.