5G NR is full of contradictions: it is difficult to achieve both capacity and coverage. 5G increases system capacity by expanding spectrum bandwidth. The frequency range extends from below 3GHz in the 4G era up to the millimeter-wave bands, and single-carrier bandwidth grows from 20MHz to 100MHz or more. However, the higher the frequency band, the smaller each base station's coverage, so operators have to build more base stations.
Today, mainstream 5G deployments use mid-band spectrum, which strikes a compromise between capacity and coverage, serving both outdoor and indoor users, and further improves cell capacity and coverage through Massive MIMO, allowing operators to build a wide-coverage 5G network on existing 4G sites. However, facing exponential traffic growth, limited mid-band resources alone will certainly not be enough, so operators must expand into the millimeter-wave bands. But millimeter-wave signals reach only one or two hundred meters and cannot penetrate from outdoors to indoors, which puts unprecedented pressure on network construction investment. What to do? Only through technological innovation can we keep improving spectrum efficiency, let each Hz carry more bits, and make 5G deployment as efficient and economical as possible. This article introduces several wireless technologies worth watching in the post-5G era and even the 6G era.

NOMA

Multiple access is the core technology of mobile communications. From 1G to 5G we have gone through FDMA, TDMA, CDMA and OFDMA. These multiple access schemes all use orthogonal designs to avoid interference between users. The mobile communications field has long improved spectrum efficiency through the orthogonality of radio waves, adopting frequency division, time division, space division, and code division. But what happens when the orthogonal dimensions are exhausted? It is time for NOMA to take the stage.

NOMA, non-orthogonal multiple access, is a multiple access technology studied for 5G Release 16 that can significantly improve the spectrum efficiency of mobile networks. As we know, 4G and today's 5G use OFDMA (Orthogonal Frequency Division Multiple Access), in which the time-frequency resources occupied by each user are separate and mutually orthogonal.
Due to the orthogonality constraint, each UE is allocated certain subcarriers and occupies only part of the frequency resources. NOMA is different: it is based on a non-orthogonal design, and every UE can use all of the resources.

So, the question is: how does NOMA avoid mutual interference between users? The basic idea is to superimpose multiple UEs' signals at the transmitter, occupying all time and frequency resources, and send them over the air interface. At the receiver, the signals are decoded one by one using MUD (multi-user detection) and SIC (successive interference cancellation) to extract the useful signal.

There are two main NOMA approaches: code-domain and power-domain. Code-domain NOMA assigns each user a non-orthogonal spreading code (similar to WCDMA codes, except that WCDMA codes are orthogonal). Power-domain NOMA superimposes each user's signal at a different power level at the transmitter.

Taking power-domain NOMA as an example, it works as follows. The three UEs' signals are assigned different power levels: UE1, which is closest to the base station and has the best channel conditions, gets the lowest power; UE3, which is farthest from the base station, gets the highest power; and UE2, in the middle, gets moderate power. At the base station transmitter, UE1, UE2 and UE3 occupy the same time-frequency resources, and their signals are superimposed in the power domain and sent over the air interface. At each UE's receiver, SIC decodes the strongest received signal first. At UE1, since UE1's own signal was allocated much less power than UE3's, the first signal decoded is UE3's, and UE1 uses the MA signature to determine whether it is its own useful signal.
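This superposition-and-cancellation procedure can be illustrated with a toy numerical sketch: three BPSK users over a noise-free channel, with purely illustrative amplitudes (all names and values here are hypothetical, not from any standard):

```python
# Toy power-domain NOMA: superpose three BPSK users at different
# amplitudes, then recover all of them by successive interference
# cancellation (SIC). Noise-free channel; not a real PHY model.

# Amplitudes: UE3 (far, worst channel) gets the most power,
# UE1 (near, best channel) the least.
amps = {"UE3": 4.0, "UE2": 2.0, "UE1": 1.0}
bits = {"UE3": [1, 0, 1, 1], "UE2": [0, 0, 1, 0], "UE1": [1, 1, 0, 0]}
bpsk = lambda b: 1.0 if b else -1.0

# Transmitter: superimpose all users on the same time-frequency resource.
n = len(bits["UE1"])
rx = [sum(amps[u] * bpsk(bits[u][i]) for u in amps) for i in range(n)]

# Receiver-side SIC: decode the strongest user first, subtract its
# reconstructed signal from the residual, and repeat.
def sic_decode(rx, order=("UE3", "UE2", "UE1")):
    residual = list(rx)
    decoded = {}
    for u in order:
        decoded[u] = [1 if r > 0 else 0 for r in residual]
        residual = [r - amps[u] * bpsk(b) for r, b in zip(residual, decoded[u])]
    return decoded

recovered = sic_decode(rx)
assert recovered == bits  # all three users recovered exactly
```

Note that the hard-decision SIC here only works because each power level dominates the sum of the weaker ones (4 > 2 + 1); choosing the power split is exactly the grouping and allocation problem the base station must solve.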
If the decoded signal is not its own, the UE cancels it from the received signal and repeats the process until it finds its own useful signal. UE3, on the other hand, was allocated more power than UE1 and UE2, so the first signal it decodes is likely its own, and it can decode directly.

Because NOMA allocates all air-interface resources to all users, spectrum efficiency improves. This matters especially at the cell edge: in a poor radio environment, a 5G network using orthogonal multiple access must fall back to low-order modulation and heavy coding to overcome channel impairments, which "wastes" PRB resources. Under NOMA, every user, whether at the center or the edge of the cell, uses all PRB resources, improving spectrum efficiency.

It is worth mentioning that NOMA can also be combined with Massive MIMO. Under Massive MIMO, a physical sector can be split into multiple virtual sectors within the coverage of the broadcast beam, and the users served within each virtual sector use NOMA. Since the virtual sectors are mutually orthogonal, system capacity can be multiplied further.

However, NOMA has its own challenges. First, MUD/SIC requires extra computation, stronger hardware, and more power. This is not a problem on the base station side, but on the terminal side it raises cost and power consumption. Second, under NOMA the base station must group UEs and allocate power among them, which requires accurate knowledge of each UE's channel state.

Full Duplex

Today, 5G adopts the TDD duplex mode. The 4G era had both TDD and FDD, but strictly speaking both are only "half-duplex": TDD sends uplink and downlink in different time slots on the same band, while FDD sends uplink and downlink separately on two paired bands.
Full-duplex technology transmits uplink and downlink simultaneously in the same band (sending and receiving at the same time), which can greatly improve spectrum efficiency. And because full duplex sends and receives at the same time, feedback can be received right after data is sent, shortening transmission delay.

The biggest challenge for full duplex is that the transmitted signal causes strong self-interference at the receiver. In a cellular network, for example, the transmit power can reach tens of watts while the received power is only a few picowatts. The interference produced by the transmitter can therefore be billions of times stronger than the useful received signal, and the transmitter would quickly saturate the receiver. Due to factors such as duplexer leakage, antenna reflection and multipath reflection, the transmitted signal mixes into the received signal and produces strong self-interference.

How can this interference be eliminated? Fortunately, since the transmitted signal is known, it can be used as a reference to cancel the self-interference. The reference is easy to obtain in the digital domain, but once the digital signal is converted to analog, linear and nonlinear distortion make a clean reference hard to obtain, so the RF domain is the biggest challenge for full-duplex self-interference cancellation. Cancellation techniques keep improving, but implementation complexity and cost remain high. One way to ease the problem is to separate the transmit and receive antennas, installing them some distance apart, and achieve further decoupling through methods such as antenna sidelobe suppression.
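The digital-domain side of this cancellation can be sketched as follows: the known transmitted samples serve as a reference, a coupling coefficient is estimated by least squares, and the reconstructed leakage is subtracted. This is a single-tap, linear toy model under assumed numbers; a real canceller must track a multi-tap, nonlinear RF channel:

```python
import math
import random

# Toy digital self-interference cancellation for full duplex.
# rx = (strong leakage of our own tx) + (weak desired signal).
random.seed(0)
n = 2000
tx = [random.gauss(0.0, 1.0) for _ in range(n)]              # our own transmission (known)
desired = [0.01 * random.gauss(0.0, 1.0) for _ in range(n)]  # far weaker signal of interest
h_si = 1000.0                                                # leakage path gain (illustrative)
rx = [h_si * t + d for t, d in zip(tx, desired)]             # receiver input, dominated by leakage

# Least-squares estimate of the coupling coefficient from the known reference.
h_est = sum(r * t for r, t in zip(rx, tx)) / sum(t * t for t in tx)

# Subtract the reconstructed self-interference.
residual = [r - h_est * t for r, t in zip(rx, tx)]

power = lambda x: sum(v * v for v in x) / len(x)
suppression_db = 10 * math.log10(power(rx) / power(residual))
```

With these numbers the residual is essentially the weak desired signal. In practice, analog/RF-stage cancellation must first bring the interference within the receiver's dynamic range before digital cancellation can finish the job.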
Combined with spatial path loss, antenna separation can greatly reduce self-interference. This approach is feasible at the base station, but not at the terminal, where space is constrained. Full-duplex technology may therefore end up deployed on the base station side, while terminals continue to use TDD.

OAM

Is there a new orthogonal dimension of radio waves that could be exploited, beyond time, frequency and polarization? There is: the orbital angular momentum (OAM) of electromagnetic radiation. Owing to a helical phase factor, electromagnetic waves carrying OAM are called "vortex electromagnetic waves"; their phase fronts spiral along the propagation direction. The helical phase structure of such a wave defines its OAM mode. Radio waves in different OAM modes are mutually orthogonal and do not interfere with each other, so multiple signals modulated onto different OAM modes can be transmitted on the same frequency, improving spectrum efficiency. In theory, dozens of distinct OAM modes can carry independent signals, multiplying spectrum efficiency by dozens of times.
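The orthogonality comes from the helical phase factor exp(i·l·φ): integrating one mode against the conjugate of another over a full turn of azimuth gives zero unless the mode numbers match. A quick numerical check on a discretized ring of azimuth samples (as a circular antenna array would see):

```python
import cmath
import math

# Sample the helical phase factor exp(i*l*phi) of OAM mode l at N points
# around a ring of azimuth angles.
N = 64

def oam_mode(l):
    return [cmath.exp(1j * l * 2 * math.pi * k / N) for k in range(N)]

def inner(a, b):
    # Discrete inner product <a, b> over the ring.
    return sum(x * y.conjugate() for x, y in zip(a, b)) / N

# A mode has unit norm; modes with different l are orthogonal.
m1, m2, m3 = oam_mode(1), oam_mode(2), oam_mode(-1)
assert abs(inner(m1, m1) - 1) < 1e-9
assert abs(inner(m1, m2)) < 1e-9
assert abs(inner(m1, m3)) < 1e-9
```

This orthogonality is what lets a receiver separate co-frequency streams by mode number, provided the helical phase structure survives the channel.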
(Figure: OAM multiplexing principle)

However, practical demonstrations of OAM have so far been limited to near-field applications. Atmospheric turbulence can distort the OAM of radio waves and cause crosstalk between modes, so much work remains before OAM can be applied to cellular networks.

Machine Learning

Machine learning can be used to optimize the 5G air interface and improve spectrum efficiency. Every layer of 5G NR can benefit: at the physical layer, machine learning can optimize modulation, FEC, MIMO, signal detection, power control and beamforming; at Layer 2, scheduling, HARQ and flow control; and at Layer 3, mobility management, load management and connection management. Machine learning, especially deep reinforcement learning, can make optimization decisions dynamically based on traffic conditions and the radio environment, keeping the network in its best state at all times.

Take modulation as an example. Higher-order modulation raises the transmission rate; in the 4G era we would like every UE to use 256QAM as much as possible for better spectrum efficiency. In reality this is impossible, because as SINR decreases (for example, when the UE is at the cell edge), higher-order QAM constellations become distorted and harder for the receiver to demodulate. With machine learning, the receiver can learn complex distortion patterns and demodulate higher-order modulation at lower SINR, improving the system's spectrum efficiency.
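As a minimal illustration of learning-based link adaptation, the sketch below uses an epsilon-greedy bandit to learn which modulation order maximizes expected throughput at a fixed, hypothetical SINR: higher orders carry more bits per symbol but fail more often. The success probabilities are invented for illustration; real systems use far richer state (CQI, HARQ feedback) and algorithms, but the feedback loop is the same:

```python
import random

# Epsilon-greedy bandit for modulation selection. Each "arm" is a
# modulation order with bits-per-symbol and a hypothetical success
# probability at the current SINR. Reward = bits delivered per symbol.
random.seed(1)
arms = [
    ("QPSK",  2, 0.99),   # robust but slow   -> ~1.98 bits expected
    ("16QAM", 4, 0.75),   # middle ground     -> ~3.00 bits expected
    ("64QAM", 6, 0.20),   # fails too often   -> ~1.20 bits expected
]
q = [0.0] * len(arms)     # running throughput estimates
count = [0] * len(arms)
eps = 0.1                 # exploration rate

for _ in range(20000):
    if random.random() < eps:
        i = random.randrange(len(arms))                    # explore
    else:
        i = max(range(len(arms)), key=q.__getitem__)       # exploit
    name, bits, p_ok = arms[i]
    reward = bits if random.random() < p_ok else 0         # ACK -> bits, NACK -> 0
    count[i] += 1
    q[i] += (reward - q[i]) / count[i]                     # incremental mean update

best = max(range(len(arms)), key=q.__getitem__)            # learned choice: 16QAM
```

The learner converges on 16QAM, the order with the best rate-reliability trade-off at this assumed SINR, purely from ACK/NACK-style feedback rather than from an explicit channel model.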