How high is the spectrum efficiency of 5G?

Wireless spectrum is an operator's most valuable resource. If the wireless network is a rice field, then the spectrum is the land the rice grows on. When land is scarce and you still want high yields, your only option is to cultivate better varieties.

Each generation of mobile communications is, in effect, the breeding of a higher-yielding rice variety. Combined with reclaiming land, that is, finding ways to use previously barren spectrum, this can multiply the yield several times over.

For communications, increasing the yield means achieving faster data rates (in Mbit/s) over the same bandwidth (usually in MHz). Since 4G and 5G each support a range of system bandwidths, comparing their capabilities requires calculating the transmission rate per unit of bandwidth, known as spectrum efficiency:

Spectral efficiency (bit/s/Hz) = Rate (Mbit/s) / Bandwidth (MHz)

This calculation gives the spectrum efficiency: how many bits of data can be transmitted per hertz of spectrum per second.
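The formula can be sketched in a few lines of Python. Note that the rate and bandwidth numbers in the example are illustrative placeholders, not measurements from any specific network:

```python
def spectral_efficiency(rate_mbps: float, bandwidth_mhz: float) -> float:
    """Spectral efficiency in bit/s/Hz = rate (Mbit/s) / bandwidth (MHz).

    The mega- prefixes on both sides cancel, so no unit conversion is needed.
    """
    return rate_mbps / bandwidth_mhz

# Illustrative example: a 150 Mbit/s link over 20 MHz of spectrum
print(spectral_efficiency(150, 20))  # 7.5 bit/s/Hz
```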

We can do the math. As the comparison table above shows, the theoretical spectrum efficiency of a 5G cell works out to 3.68 times that of 4G.

LTE mostly uses mainstream 4-antenna transmission, so each cell and each user tops out at the same 4 streams. 5G transmits with 64 antennas; although each user still supports at most 4 streams, Massive MIMO lets multiple users reuse the same spectrum within one cell, for 16 streams in total, crushing 4G on peak rate.

In other words, the multi-user, multi-stream transmission brought by Massive MIMO is the key to 5G's higher theoretical spectrum efficiency. For a single user, 5G's spectrum efficiency is comparable to 4G's; the rate increase comes mainly from the larger system bandwidth.
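The stream-count argument can be written as a simplified model. The per-stream efficiency value below is an assumed placeholder, and the model ignores per-system overhead, which is why it yields a clean 4x rather than the 3.68x quoted above:

```python
# Simplified peak-capacity model: total streams x per-stream spectral efficiency.
# PER_STREAM_SE is a placeholder; real figures depend on modulation order,
# coding rate, and control-channel overhead, which differ between 4G and 5G.
PER_STREAM_SE = 7.5  # bit/s/Hz per spatial stream (assumed)

lte_streams = 4   # 4-antenna LTE: up to 4 streams per cell and per user
nr_streams = 16   # 64-antenna Massive MIMO: 4 streams/user x 4 co-scheduled users

lte_cell_se = lte_streams * PER_STREAM_SE  # 30.0 bit/s/Hz
nr_cell_se = nr_streams * PER_STREAM_SE    # 120.0 bit/s/Hz
print(nr_cell_se / lte_cell_se)  # 4.0; overhead differences shrink this in practice
```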

That is the theoretical maximum spectrum efficiency, and it is a far cry from what real users in real cells experience. In practice the peak rate is never reached; too many factors drag the rate down.

This requires a more practical indicator: average spectral efficiency.

Consider dense urban areas, suburbs, and the countryside: the antenna counts differ, the antenna heights differ, the frequency bands may differ, the spacing between sites differs, the number of users differs, and buildings reflect, diffract, and absorb wireless signals differently. How could the cell rates possibly be the same?

Even under the same base station, users sit at different distances, use different phones, move at different speeds, and run different services. How much throughput can all of them achieve together?

This system is too complex. To know the average spectrum efficiency, we must use a computer to input all the above variables into the system, add many assumptions, and calculate according to a certain model. This process is called simulation.

With the usual 4-antenna transmission, 4G's average spectrum efficiency in urban areas is around 2.9 bit/s/Hz. In other words, a 4G cell with 20 MHz of bandwidth averages only about 58 Mbit/s downlink.

5G base stations transmit with 64 antennas, and the average spectrum efficiency in dense urban areas is around 10 bit/s/Hz. So a 5G cell with 100 MHz of bandwidth averages roughly 1 Gbit/s downlink.
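Both averages follow directly from rate = spectral efficiency x bandwidth, as a quick check shows:

```python
def avg_cell_rate_mbps(se_bit_per_s_per_hz: float, bandwidth_mhz: float) -> float:
    """Average downlink cell rate in Mbit/s: SE (bit/s/Hz) x bandwidth (MHz)."""
    return se_bit_per_s_per_hz * bandwidth_mhz

print(avg_cell_rate_mbps(2.9, 20))   # ~58 Mbit/s for a 20 MHz 4G cell
print(avg_cell_rate_mbps(10, 100))   # 1000 Mbit/s, i.e. ~1 Gbit/s, for 100 MHz of 5G
```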

As with the theoretical maximum, 5G's average spectrum efficiency is more than three times 4G's; combined with 5G's much larger bandwidth, the resulting average downlink cell rate is remarkable.

So the question is: does this fulfill the 5G vision? Look at the figure below. For eMBB services, the 5G user-experienced rate must reach 100 Mbit/s.

With spectrum efficiency around 10 bit/s/Hz, a 100 MHz 5G cell averages about 1 Gbit/s downlink, which at first glance looks like ten times the user-experience target.

In reality, although the average rate is high, users at the cell edge have poor signal and may suffer interference from neighboring cells, so reaching 100 Mbit/s there is difficult.

Therefore, in actual network planning, 50Mbps is generally used as the standard for edge users, and 100Mbps is only used as a challenge target for high-value areas.

Ultimately, to achieve good coverage and high speed, you have to spend money to build new stations, and network construction needs to consider the balance between investment and benefits.
