The industry sees 5G as a revolutionary wireless technology, but the high-frequency spectrum at the foundation of the next-generation standard requires operators to take a completely different approach to building networks and upgrading earlier cellular systems. The first generation of mobile networks in the 1980s brought consumers analog voice channels. The arrival of integrated circuits and digital signal processing in the 1990s produced 2G, which greatly increased network capacity. 3G, born at the turn of the century, effectively combined mobile data with voice, letting users make calls while replying to email. After 2010, the explosion of 4G raised wireless Internet speeds and put a variety of desktop-class applications on mobile phones. Despite this progress, the communications industry and its customers remain siloed. The industry spans cable and Internet service providers, wireless operators, and over-the-top (OTT) application providers. Consumers and businesses get connectivity from a variety of operators on different platforms that often cannot talk to each other. This fragmentation creates enormous overhead in the network, and operators must dedicate substantial resources to handling it, which is why signaling, billing, and device management systems came into being.
For end users, 5G is an ecosystem of connected applications, each of which adaptively manages data rate, latency, and reliability according to the task at hand. For example, a self-driving car needs extremely reliable, instantaneous responses, so a 5G network will provide it with dense coverage and a low-latency, encrypted communication link rather than blindly allocating it a 100 MHz channel, because high throughput does not equal low-latency, reliable coverage. For service providers, 5G will integrate communication systems to meet end-user application needs such as data, voice, video, IoT, and critical communications. 5G will deliver higher throughput and ultra-low latency while greatly improving network capacity, reliability, and security. What are the criteria for judging a 5G network? Generally speaking, a 5G network architecture should provide exactly these qualities: higher throughput, ultra-low latency, and greater capacity, reliability, and security.
To achieve these goals, network and user equipment manufacturers must innovate to make networks more efficient and must deploy new wireless spectrum to support high-bandwidth demands.

Millimeter wave for 5G deployment

The channel bandwidth required for 5G is between 800 MHz and 2 GHz, and the spectrum that can accommodate such channels is millimeter wave. When satellite communications began deploying the Ka band (26.5 GHz to 40 GHz) together with spot-beam frequency reuse, channel bandwidth grew from a typical 54 MHz to between 500 MHz and 2 GHz. That technology enabled gigabit IP connections, which 5G also requires. In October 2015, the FCC identified three millimeter wave bands for 5G services. These bands are known as the Spectrum Frontiers for 5G, and spectrum above 24 GHz is being actively investigated. The 28 GHz band supports 850 MHz of bandwidth; the 37-40 GHz band supports 3 GHz; and the unlicensed 64-71 GHz band supports up to 7 GHz. These spectrum and bandwidth allocations make 5G services possible.

mmWave link propagation and link budget

Commercial wireless frequencies (including Wi-Fi) generally sit below 6 GHz, and many design tools can model the characteristics of those bands, but deploying millimeter wave links between user equipment (UE) and base stations (BS) brings many technical challenges. The first is to understand millimeter wave path-loss behavior and build a predictable mathematical model. Path loss and link budget are the two essential elements for investigating 5G link behavior. A 5G wireless propagation environment includes both line-of-sight (LOS) and non-line-of-sight (NLOS) components. Measured LOS path loss closely tracks free-space path loss, apart from extra atmospheric absorption around 60 GHz, while NLOS path loss deviates significantly from the free-space model.
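The free-space reference just mentioned is straightforward to compute. As a minimal sketch (the function name and the 28 GHz / 200 m figures are my own illustration, not values from the article), the standard formula FSPL(dB) = 20·log10(d) + 20·log10(f) + 92.45, with d in km and f in GHz, looks like this:

```python
import math

def fspl_db(d_km: float, f_ghz: float) -> float:
    """Free-space path loss in dB for distance d_km (km) and frequency f_ghz (GHz)."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_ghz) + 92.45

# A 28 GHz link over 200 m loses roughly 107 dB in free space alone.
print(round(fspl_db(0.2, 28.0), 1))
```

Measured LOS loss at mmWave frequencies should sit close to this curve; large deviations usually indicate an NLOS component.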
The typical process is to measure propagation loss at a specific frequency over specific terrain, then curve-fit the measurements to extract the path-loss exponent n. Path loss is proportional to d^n, where d is the distance between the transmit and receive antennas and n typically falls between 2 (free space) and 4. End-to-end microwave links also require clearance between the propagation path and the nearest obstacle on the ground, governed by Fresnel zone theory: if at least 60% of the first Fresnel zone is unobstructed, the path is generally treated as LOS. However, antenna heights in 5G networks are relatively low, which can introduce significant obstruction. A 5G mmWave link budget differs markedly from a traditional sub-6 GHz budget, adding losses from rainfall, atmospheric absorption, humidity, and Fresnel blockage. Below is an example 5G link budget, which will vary with frequency band and cell type:

Received power (dBm) = Tx power + Tx antenna gain + Rx antenna gain - path loss - rainfall loss (estimated 2 dB per 200 m) - shadow loss (20 to 30 dB) - foliage loss (10 to 50 dB) - atmospheric absorption - terrain/humidity loss - Fresnel blockage - system margin

First Fresnel zone radius (mid-path): R = 17.32 × √(d / (4f)), with R in meters, d in km, and f in GHz.

Examining this calculation makes clear how many factors can impair mmWave transmission; the link budget is an area every 5G deployment team needs to study.

Propagation loss measurement

A millimeter wave propagation measurement requires a signal generator, a spectrum analyzer, and two phased-array or horn antennas. The signal generator, installed at the selected site, simulates a base station and sweeps the band from 27.5 to 28.35 GHz; the spectrum analyzer measures the received signal at a set distance.
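The received-power equation and Fresnel-zone formula from the link-budget discussion above can be put into executable form. This is a minimal sketch; every numeric value in the example call is an illustrative assumption, not a figure from the article:

```python
import math

def fresnel_radius_m(d_km: float, f_ghz: float) -> float:
    """Mid-path first Fresnel zone radius: R = 17.32 * sqrt(d / (4f)), R in meters."""
    return 17.32 * math.sqrt(d_km / (4 * f_ghz))

def rx_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, path_loss_db,
                 rain_db=0.0, shadow_db=0.0, foliage_db=0.0,
                 atmos_db=0.0, humidity_db=0.0, fresnel_db=0.0,
                 margin_db=0.0):
    """Received power per the example budget: antenna gains minus every loss term."""
    losses = (path_loss_db + rain_db + shadow_db + foliage_db
              + atmos_db + humidity_db + fresnel_db + margin_db)
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - losses

# Hypothetical 28 GHz small cell: 30 dBm Tx, 25 dBi arrays, 135 dB path loss,
# 2 dB rain and 20 dB shadow loss -> 30 + 25 + 25 - 135 - 2 - 20 = -77 dBm.
print(rx_power_dbm(30, 25, 25, 135, rain_db=2, shadow_db=20))
print(round(fresnel_radius_m(1.0, 28.0), 2))  # ~1.64 m at 1 km, 28 GHz
```

Even with generous antenna gains, the stacked loss terms push the received power toward the receiver's sensitivity floor, which is why each term deserves its own line in the budget.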
Since the signal generator and spectrum analyzer are not synchronized, the spectrum analyzer must capture enough samples at a given frequency point before the signal generator tunes to the next frequency. There are several ways to synchronize the two instruments, such as timer-based triggers, hardware triggers, or simply free-running with peak hold on the spectrum analyzer. Free-running is not the best solution because it introduces errors that degrade the accuracy of the propagation model. To address these measurement challenges, Keysight's FieldFox analyzers offer Extended Range Transmission Analysis (ERTA), which connects two FieldFox analyzers together and uses triggers on each instrument to synchronize the measurements. Deploying 5G at mmWave frequencies is a challenge for RF engineers, since those frequencies demand reliable channel models. Massive MIMO and beamforming are important components of 5G and require early, extensive testing before they can be deployed.
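Once swept measurements like these are collected, the curve-fit step described earlier, extracting the path-loss exponent n, can be sketched as a simple least-squares fit of the log-distance model PL(d) = PL(d0) + 10·n·log10(d/d0). The data below are synthetic (generated with n = 3), purely for illustration:

```python
import math

def fit_path_loss_exponent(dists_m, pl_db, d0_m, pl0_db):
    """Least-squares estimate of n in PL(d) = PL(d0) + 10*n*log10(d/d0)."""
    x = [10 * math.log10(d / d0_m) for d in dists_m]
    y = [pl - pl0_db for pl in pl_db]
    # Slope through the origin minimizing sum((y_i - n*x_i)^2).
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

# Synthetic sweep generated with n = 3 and a 60 dB reference loss at d0 = 1 m.
d = [2, 5, 10, 20, 50]
pl = [60 + 10 * 3.0 * math.log10(di) for di in d]
print(round(fit_path_loss_exponent(d, pl, 1.0, 60.0), 2))
```

With real field data, separate fits over the LOS and NLOS measurement sets typically yield n near 2 for LOS and noticeably larger values for NLOS, matching the behavior described above.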