Last year, the 128-antenna Massive MIMO testbed at the University of Bristol was used to set world records in per-cell spectral efficiency. Those measurements were conducted in a controlled indoor environment, but they demonstrated that the theoretical gains of the technology are practically achievable, at least in simple propagation scenarios.
The Bristol team has now worked with British Telecom and conducted trials at their site in Adastral Park, Suffolk, in more demanding user scenarios. In the indoor exhibition hall trial, 24 user streams were multiplexed over a 20 MHz bandwidth, resulting in a sum rate of 2 Gbit/s or a spectral efficiency of 100 bit/s/Hz/cell.
Several outdoor experiments were also conducted, including scenarios with user mobility. We are looking forward to seeing more details on these experiments, but in the meantime one can have a look at the following video:
Update: We have corrected the bandwidth number in this post.
The Mobile World Congress (MWC) was held in Barcelona last week. Several major telecom companies took the opportunity to showcase and describe their pre-5G solutions based on Massive MIMO technology.
Huawei and Optus carried out a field trial on February 26, where a sum rate of 655 Mbit/s was obtained over a 20 MHz channel by spatially multiplexing 16 users. This corresponds to 33 bit/s/Hz, or roughly 2 bit/s/Hz per user, which are typical spectral efficiencies to expect from Massive MIMO. The base station was equipped with 128 antenna ports, but the press release provides no details on whether uplink or downlink transmission was considered.
Nokia and Sprint demonstrated TDD-based Massive MIMO technology for LTE networks, using 64 antenna ports at the base station. Spatial multiplexing of eight commercial LTE terminals was considered. Communication theory predicts that the sum rate should grow proportionally to the number of terminals, which is consistent with the 8x improvement in uplink rates and 5x improvement in downlink rates that were reported. Further details are found in their press release or in the following video:
The cellular network that my smartphone connects to normally delivers 10-40 Mbit/s. That is sufficient for video-streaming and other applications that I might use. Unfortunately, I sometimes have poor coverage and then I can barely download emails or make a phone call. That is why I think that providing ubiquitous data coverage is the most important goal for 5G cellular networks. It might also be the most challenging 5G goal, because the area coverage has been an open problem since the first generation of cellular technology.
It is physics that makes it difficult to provide good coverage. The transmitted signals spread out, and only a tiny fraction of the transmitted power reaches the receive antenna (e.g., one part in a billion). In cellular networks, the received signal power decays roughly as the fourth power of the propagation distance. This results in the following data rate coverage behavior:
This figure considers an area covered by nine base stations, located at the centers of the nine peaks. Users that are close to one of the base stations receive the maximum downlink data rate, which in this case is 60 Mbit/s (i.e., a spectral efficiency of 6 bit/s/Hz over a 10 MHz channel). As a user moves away from a base station, the data rate drops rapidly. At the cell edge, where the user is roughly equally distant from multiple base stations, the rate is nearly zero in this simulation, because the received signal power is low compared to the receiver noise.
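For readers who want to experiment with this behavior, here is a minimal sketch of how such a rate map can be generated. The bandwidth, rate cap, and path-loss exponent follow the description above, while the inter-site distance and the reference SNR are illustrative assumptions rather than the exact parameters behind the figure.

```python
import numpy as np

# Minimal rate-map sketch: 3x3 grid of base stations, power-4 path loss,
# noise-limited reception, 10 MHz bandwidth, 6 bit/s/Hz cap (60 Mbit/s).
# The reference SNR and base station spacing are illustrative assumptions.
B = 10e6                 # bandwidth [Hz]
max_se = 6.0             # spectral-efficiency cap [bit/s/Hz]
snr_ref_db = 30.0        # assumed SNR at 100 m from the serving base station

bs = np.array([(x, y) for x in (250.0, 750.0, 1250.0)
                      for y in (250.0, 750.0, 1250.0)])

grid = np.linspace(0.0, 1500.0, 300)
x, y = np.meshgrid(grid, grid)
users = np.stack([x.ravel(), y.ravel()], axis=1)

# Distance from every user position to every base station (at least 10 m)
d = np.linalg.norm(users[:, None, :] - bs[None, :, :], axis=2).clip(min=10.0)

# Received SNR decays as d^(-4); the strongest base station serves the user
snr = (10 ** (snr_ref_db / 10) * (d / 100.0) ** (-4)).max(axis=1)

rate = B * np.minimum(np.log2(1 + snr), max_se)   # downlink rate [bit/s]
print(f"Rate range over the area: {rate.min()/1e6:.1f} - {rate.max()/1e6:.1f} Mbit/s")
```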
What can be done to improve the coverage?
One possibility is to increase the transmit power. This is mathematically equivalent to densifying the network, so that the area covered by each base station is smaller. The figure below shows what happens if we use 100 times more transmit power:
There are some visible differences compared with Figure 1. First, the region around each base station where 60 Mbit/s is delivered is larger. Second, the data rates at the cell edge are slightly improved, but there are still large variations within the area. However, it is no longer the noise that limits the cell-edge rates, but the interference from the other base stations.
Ideally, we would like to increase only the power of the desired signals, while keeping the interference power fixed. This is what transmit precoding from a multi-antenna array can achieve; the transmitted signals from the multiple antennas at the base station add constructively only at the spatial location of the desired user. More precisely, the signal power is proportional to M (the number of antennas), while the interference power caused to other users is independent of M. The following figure shows the data rates when we go from 1 to 100 antennas:
Figure 3 shows that the data rates are increased for all users, but particularly for those at the cell edge. In this simulation, everyone is now guaranteed a minimum data rate of 30 Mbit/s, while 60 Mbit/s is delivered in a large fraction of the coverage area.
In practice, the propagation losses are not only distance-dependent, but are also affected by other large-scale effects, such as shadowing. The properties described above nevertheless remain. Coherent precoding from a base station with many antennas can greatly improve the data rates for cell-edge users, since only the desired signal power (and not the interference power) is increased. Higher transmit power or smaller cells will only lead to an interference-limited regime where the cell-edge performance remains poor. A practical challenge with coherent precoding is that the base station needs to learn the user channels, but reciprocity-based Massive MIMO provides a scalable solution to that. That is why Massive MIMO is the key technology for delivering ubiquitous connectivity in 5G.
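As a rough numerical illustration of the difference between the two approaches, the sketch below compares a cell-edge user's rate when every base station raises its transmit power by a factor of 100, and when the serving base station instead uses coherent precoding with M = 100 antennas. The power levels are illustrative assumptions, chosen so that the desired signal and the interference are equally strong at the cell edge.

```python
import numpy as np

# Rough cell-edge comparison, assuming the desired signal and the dominant
# interference arrive with equal average power and the noise is 10 dB weaker.
B = 10e6            # bandwidth [Hz]
signal = 1.0        # desired signal power (normalized)
interference = 1.0  # aggregate interference from neighboring cells
noise = 0.1

def rate(sig, interf, noise):
    return B * np.log2(1 + sig / (interf + noise))

baseline   = rate(signal, interference, noise)
more_power = rate(100 * signal, 100 * interference, noise)  # everyone transmits 100x more
precoding  = rate(100 * signal, interference, noise)        # array gain with M = 100 antennas

print(f"Baseline:          {baseline / 1e6:5.1f} Mbit/s")
print(f"100x more power:   {more_power / 1e6:5.1f} Mbit/s")
print(f"M = 100 antennas:  {precoding / 1e6:5.1f} Mbit/s")
```

The extra transmit power barely helps, since the interference grows with it, while the array gain from precoding lifts only the desired signal and therefore raises the cell-edge rate substantially.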
Frequency-division duplex (FDD) operation of Massive MIMO in LTE is the topic of two press releases from January 2017. The first press release describes a joint field test carried out by ZTE and China Telecom. It claims three-fold improvements in per-cell spectral efficiency using standard LTE devices, but no further details are given. The second press release describes a field verification carried out by Huawei and China Unicom. The average data rate was 87 Mbit/s per user over a 20 MHz channel and was achieved using commercial LTE devices. This corresponds to a spectral efficiency of 4.36 bit/s/Hz per user. A sum rate of 697 Mbit/s is also mentioned, from which one could guess that eight users were multiplexed (87 × 8 = 696).
There are no specific details of the experimental setup or implementation in any of these press releases, so we cannot tell how well the systems perform compared to a baseline TDD Massive MIMO setup. Maybe this is just a rebranding of the FDD multiuser MIMO functionality in LTE, evolved with a few extra antenna ports. It is nonetheless exciting to see that several major telecom companies want to associate themselves with the Massive MIMO technology and hopefully it will result in something revolutionary in the years to come.
Efficient FDD implementation of multiuser MIMO is a longstanding challenge. The reason is the difficulty in estimating channels and feeding back accurate channel state information (CSI) in a resource-efficient manner. Many researchers have proposed methods to exploit channel parameterizations, such as angles and spatial correlation, to simplify the CSI acquisition. This might be sufficient to achieve an array gain, but the ability to also mitigate interuser interference is less certain and remains to be demonstrated experimentally. Since 85% of the LTE networks use FDD, we have previously claimed that making Massive MIMO work well in FDD is critical for the practical success and adoption of the technology.
We hope to see more field trials of Massive MIMO in FDD, along with details of the measurement setups and evaluations of which channel acquisition schemes are suitable in practice. Will FDD Massive MIMO be exclusive to static users, whose channels are easily estimated, or can anyone benefit from it in 5G?
Update: Blue Danube Systems has issued a press release that also describes trials of FDD Massive MIMO. Many companies apparently want to be “first” with this technology for LTE.
The main selling point of millimeter-wave communications is the abundant bandwidth available in such frequency bands; for example, 2 GHz of bandwidth instead of 20 MHz as in conventional cellular networks. The underlying argument is that the use of much wider bandwidths immediately leads to much higher capacities, in terms of bit/s, but the reality is not that simple.
To look into this, consider a communication system operating over a bandwidth of B Hz. By assuming an additive white Gaussian noise channel, the capacity becomes

C = B·log₂(1 + Pβ/(B·N₀)) bit/s,

where P W is the transmit power, β is the channel gain, and N₀ W/Hz is the power spectral density of the noise. The term inside the logarithm is referred to as the signal-to-noise ratio (SNR).
Since the bandwidth B appears in front of the logarithm, it might seem that the capacity grows linearly with the bandwidth. This is not the case, since the noise term B·N₀ in the SNR also grows linearly with the bandwidth. This fact is illustrated by Figure 1 below, where we consider a system that achieves an SNR of 0 dB at a reference bandwidth of 20 MHz. As we increase the bandwidth towards 2 GHz, the capacity grows only modestly. Despite having 100 times more bandwidth, the capacity only improves by around 44%, which is far from the 100-fold improvement that a linear increase would give.
The reason for this modest capacity growth is that the SNR reduces inversely proportionally to the bandwidth. One can show that

B·log₂(1 + Pβ/(B·N₀)) → (Pβ/N₀)·log₂(e)  as B → ∞.

The convergence to this limit is seen in Figure 1 and is relatively fast, since log₂(1 + x) ≈ x·log₂(e) for x ≈ 0.
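The numbers above can be reproduced with a few lines of code. The snippet below assumes, as in Figure 1, an SNR of 0 dB at the 20 MHz reference bandwidth and a fixed transmit power.

```python
import numpy as np

# Capacity at fixed transmit power, assuming an SNR of 0 dB at the
# 20 MHz reference bandwidth (the setting of Figure 1).
B_ref = 20e6      # reference bandwidth [Hz]
snr_ref = 1.0     # 0 dB at the reference bandwidth

for B in (20e6, 200e6, 2e9):
    snr = snr_ref * B_ref / B          # SNR shrinks as the bandwidth grows
    C = B * np.log2(1 + snr)
    print(f"B = {B/1e6:6.0f} MHz: C = {C/1e6:5.1f} Mbit/s")

# The limit (P*beta/N0)*log2(e), expressed via the reference point
limit = snr_ref * B_ref * np.log2(np.e)
print(f"Limit as B grows to infinity: {limit/1e6:.1f} Mbit/s")
```

Running this gives roughly 20, 27.5, and 28.7 Mbit/s at 20 MHz, 200 MHz, and 2 GHz, with a limit of about 28.9 Mbit/s, which is the modest growth described above.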
To achieve a linear capacity growth, we need to keep the SNR fixed as the bandwidth increases. This can be achieved by increasing the transmit power proportionally to the bandwidth, but using substantially more power might not be desirable in practice, at least not for battery-powered devices.
An alternative is to use beamforming to improve the channel gain. In a Massive MIMO system, the effective channel gain is Mβ, where M is the number of antennas and β is the gain of a single-antenna channel. Hence, we can increase the number of antennas proportionally to the bandwidth to keep the SNR fixed.
Figure 2 considers the same setup as in Figure 1, but now we also let either the transmit power or the number of antennas grow proportionally to the bandwidth. In both cases, we achieve a capacity that grows proportionally to the bandwidth, as we initially hoped for.
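A small extension of the previous snippet illustrates this: scaling either the transmit power or the number of antennas by the factor B/B_ref keeps the SNR at 0 dB, so the capacity grows from 20 Mbit/s to 2 Gbit/s as the bandwidth grows from 20 MHz to 2 GHz.

```python
import numpy as np

# Same setup as before, but the transmit power (or the number of antennas)
# is scaled by B/B_ref, so the SNR stays at 0 dB for every bandwidth.
B_ref, snr_ref = 20e6, 1.0

for B in (20e6, 200e6, 2e9):
    scale = B / B_ref                  # factor by which P (or M) is increased
    snr = snr_ref * scale * B_ref / B  # the scaling cancels the bandwidth growth
    C = B * np.log2(1 + snr)
    print(f"B = {B/1e6:6.0f} MHz: C = {C/1e6:6.0f} Mbit/s")  # 20, 200, 2000
```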
In conclusion, to make efficient use of more bandwidth we require more transmit power or more antennas at the transmitter and/or receiver. It is worth noting that these requirements are purely due to the increase in bandwidth. In addition, for any given bandwidth, the operation at millimeter-wave frequencies requires much more transmit power and/or more antennas (e.g., additional constant-gain antennas or one constant-aperture antenna) just to achieve the same SNR as in a system operating at conventional frequencies below 5 GHz.
Massive MIMO is often mentioned as a key 5G technology, but could it also be exploited in currently standardized LTE networks? The ZTE-Telefónica trials that were initiated in October 2016 show that this is indeed possible. The press release from late last year describes the first results. For example, the trial showed improvements in network capacity and cell-edge data rates of up to six times, compared with traditional LTE.
The Massive MIMO blog has talked with Javier Lorca Hernando at Telefónica to get further details. The trials were carried out at the Telefónica headquarters in Madrid. A base station with 128 antenna ports was deployed on the rooftop of one of their buildings, and the users were located on one floor of the central building, approximately 100 m from the base station. The users basically had cell-edge conditions, due to the metallized glass and multiple metallic structures surrounding them.
The uplink and downlink data transmissions were carried out in the 2.6 GHz band. Typical Massive MIMO time-division duplex (TDD) operation was considered, where the uplink detection and downlink precoding are based on uplink pilots and channel reciprocity. The existing LTE sounding reference signals (SRSs) were used as uplink pilots. The reciprocity-based precoding was implemented by using LTE’s transmission mode 8 (TM8), which supports any type of precoding. Downlink pilots were used for link adaptation and demodulation purposes.
It is great to see that Massive MIMO can also be implemented in LTE systems. In this trial, the users were static and relatively few, but it will be exciting to see if the existing LTE reference signals will also enable Massive MIMO communications for a multitude of mobile users!
Update: ZTE has carried out similar experiments in cooperation with Smartfren in Indonesia. Additional field trials are mentioned in the comments to this post.
One of the main impairments in wireless communications is small-scale channel fading. This refers to random fluctuations in the channel gain, which are caused by microscopic changes in the propagation environment. The fluctuations make the channel unreliable, since the channel gain is occasionally very small and the transmitted data is then received in error.
The diversity achieved by sending a signal over multiple channels with independent realizations is key to combating small-scale fading. Spatial diversity is particularly attractive, since it can be obtained by simply having multiple antennas at the transmitter or the receiver. Suppose the probability of a bad channel gain realization is p. If we have M antennas with independent channel gains, then the risk that all of them are bad is p^M. For example, with p=0.1, there is a 10% risk of getting a bad channel in a single-antenna system and a 0.000001% risk in an 8-antenna system. This shows that just a few antennas can be sufficient to greatly improve reliability.
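As a quick sanity check of the p^M rule, here is the computation behind the numbers quoted above, under the assumption of independent fading across antennas.

```python
# Risk that all M independently fading antenna branches are bad at once,
# assuming each branch is bad with probability p = 0.1 (as in the example).
p = 0.1
for M in (1, 2, 4, 8):
    print(f"M = {M}: risk = {p**M:.0e} ({100 * p**M:.6f}%)")
```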
In Massive MIMO systems, with a “massive” number of antennas at the base station, the spatial diversity also leads to something called “channel hardening”. This terminology was used already in a paper from 2004:
In short, channel hardening means that a fading channel behaves as if it were a non-fading channel. The randomness is still there, but its impact on the communication is negligible. In the 2004 paper, the hardening is measured by dividing the instantaneous supported data rate by the fading-averaged data rate. If the relative fluctuations are small, then the channel has hardened.
Since Massive MIMO systems contain random interference, it is usually the hardening of the channel that the desired signal propagates over that is studied. If the channel is described by a random M-dimensional vector h, then the ratio ||h||^2/E{||h||^2} between the instantaneous channel gain and its average is considered. If the fluctuations of the ratio are small, then there is channel hardening. With an independent Rayleigh fading channel, the variance of the ratio reduces with the number of antennas as 1/M. The intuition is that the channel fluctuations average out over the antennas. A detailed analysis is available in a recent paper.
The figure above shows how the variance of ||h||^2/E{||h||^2} decays with the number of antennas. The convergence towards zero is gradual, and so is the channel hardening effect. I personally think that you need at least M=50 to truly benefit from channel hardening.
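A simple Monte Carlo experiment, assuming i.i.d. Rayleigh fading as above, shows the same behavior: the variance of ||h||^2/E{||h||^2} is close to 1/M, so it is still around 0.02 at M = 50.

```python
import numpy as np

# Monte Carlo check of channel hardening for i.i.d. Rayleigh fading:
# the variance of ||h||^2 / E{||h||^2} should decay roughly as 1/M.
rng = np.random.default_rng(0)
trials = 100_000

for M in (1, 10, 50, 100):
    # CN(0,1) entries: real and imaginary parts with variance 1/2 each
    h = (rng.standard_normal((trials, M))
         + 1j * rng.standard_normal((trials, M))) / np.sqrt(2)
    gain = np.sum(np.abs(h) ** 2, axis=1)   # ||h||^2, with mean M
    ratio = gain / M                        # instantaneous gain over its average
    print(f"M = {M:3d}: variance = {ratio.var():.4f} (1/M = {1/M:.4f})")
```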