Conference Paper

Improving 5G-Advanced Sub-Band Full Duplex Performance with gNB Interference Mitigation

Article
As an essential technology in the evolution towards full duplex, sub-band full duplex (SBFD) operation has attracted significant attention from academia, industry and global standardization bodies for its capability to increase user perceived throughput (UPT) and uplink coverage, reduce transmission latency and support traffic types with different requirements in the same cell in time division duplexing (TDD) bands. In this paper, the model of an SBFD-based communication system is presented, along with the associated interference types. To facilitate the early deployment of SBFD in 5G-Advanced and 6G networks, comprehensive performance evaluations of UPT, latency and coverage under different scenarios, SBFD patterns and traffic types are conducted to explore the potential of SBFD operation. Furthermore, a field test is carried out to verify the practical performance of SBFD in a typical outdoor scenario. Results from both the simulations and the field tests demonstrate significant performance gains for SBFD compared with traditional time division duplexing, especially for uplink transmissions.
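The UL-opportunity argument above can be illustrated with a toy slot count (a sketch; the DDDSU frame pattern and the assumption that every SBFD slot exposes a UL sub-band are illustrative choices, not taken from the paper):

```python
# Compare uplink transmission opportunities per TDD period for legacy TDD
# versus SBFD, where DL slots also carry a UL sub-band in frequency.
def ul_opportunities(pattern: str, sbfd: bool) -> int:
    """Count slots offering UL resources in one TDD period.
    'D' = downlink, 'U' = uplink, 'S' = special (no UL data assumed)."""
    if sbfd:
        # Under SBFD, every slot exposes UL resources on a sub-band.
        return len(pattern)
    return pattern.count("U")

pattern = "DDDSU"  # a common 5G NR TDD frame pattern (illustrative)
print(ul_opportunities(pattern, sbfd=False))  # legacy TDD -> 1
print(ul_opportunities(pattern, sbfd=True))   # SBFD -> 5
```

With this pattern, SBFD multiplies the number of slots carrying UL resources by five, which is the mechanism behind the latency and coverage gains reported above.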
Article
This article presents sub-band full duplex (SBFD) as a duplexing scheme to improve uplink (UL) throughput in 5G-Advanced networks, as an alternative to traditional time-division duplexing (TDD). SBFD provides opportunities to transmit and receive simultaneously on non-overlapping frequency resources. To accomplish this, SBFD time slots include both UL and downlink (DL) transmission. UL transmission is therefore spread across the time domain rather than the frequency domain, which increases the number of UL transmission opportunities compared with TDD, where the majority of time slots are used for DL. Concurrent UL and DL transmission creates several types of interference, which makes cancellation approaches essential for acceptable performance. The SBFD interference types, including self-interference as the main challenge of SBFD deployment, are outlined, and corresponding analytical models are proposed to provide a realistic evaluation of SBFD performance. System-level simulations under different load conditions in a high-power urban macro environment are used to evaluate SBFD performance against TDD as the baseline. The results indicate a four-fold increase in UL throughput for cell-edge users, as well as 32% and 6% increases in average UL throughput at low and medium loads, respectively. Furthermore, the simulation results show that at least 149 dB of self-interference mitigation is required for acceptable SBFD performance. Results also show that SBFD benefits are limited by inter-site gNB-to-gNB interference.
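The 149 dB self-interference figure is consistent with a simple link-budget check (a sketch under assumed numbers: 53 dBm gNB transmit power, a 20 MHz UL sub-band and a 5 dB receiver noise figure, none of which are stated here):

```python
import math

def required_si_mitigation_db(tx_power_dbm: float,
                              bandwidth_hz: float,
                              noise_figure_db: float) -> float:
    """Self-interference mitigation (dB) needed to push the residual
    SI down to the receiver's thermal noise floor."""
    # Thermal noise floor: -174 dBm/Hz + 10*log10(BW) + noise figure.
    noise_floor_dbm = -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db
    return tx_power_dbm - noise_floor_dbm

# Illustrative macro-cell assumptions (not from the article):
# 53 dBm gNB output power, 20 MHz UL sub-band, 5 dB noise figure.
print(round(required_si_mitigation_db(53, 20e6, 5)))  # -> 149
```

Under these assumptions the budget lands exactly on the article's 149 dB requirement, which is why such figures are usually quoted "to the noise floor".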
Article
With new services emerging in fifth generation (5G)-Advanced, the evolution of duplex modes plays an important role in meeting more stringent requirements for both downlink (DL) and uplink (UL) transmissions. In this paper, sub-band full duplex (SBFD) at the base station (BS) is studied as an attainable evolution of traditional time division duplex (TDD). Both UE-transparent SBFD and UE-perceptive SBFD are proposed and studied to serve different types of user equipment (UE). To tackle the interference introduced by SBFD, a model covering both BS self-interference and cross-link interference (CLI) is first presented, and new interference management schemes are proposed. Three approaches to mitigate BS self-interference, namely passive suppression, analog interference cancellation and digital interference cancellation, are analyzed. A new framework for CLI management is illustrated, along with enhancements for interference identification, spatial-domain interference coordination and power-domain adjustment. To validate the feasibility and performance of the proposed SBFD methods under indoor and dense urban scenarios, system-level simulations (SLSs) are carried out and a proof-of-concept (PoC) is developed to obtain experimental results.
Conference Paper
UTRAN long term evolution is currently under standardization within 3GPP with the aim of providing a spectral efficiency 2 to 4 times higher than its predecessor, HSUPA/HSDPA release 6. Single-carrier FDMA has been selected as the multiple access scheme for the uplink. This technology requires the subcarriers allocated to a single user to be adjacent. The consequence is reduced allocation flexibility, which makes it challenging to design effective packet scheduling algorithms. This paper proposes a channel-aware packet scheduling algorithm which exploits the bandwidth flexibility offered by the system to perform an allocation that closely resembles the frequency-domain envelope of the metric to be optimized. Compared with a fixed-bandwidth approach, the proposed algorithm provides greater flexibility, given its inbuilt adaptation to different scenarios and loads, as well as an improvement in terms of performance for the Macro 3 case. In this case the uplink capacity is increased by approximately 20% in average cell throughput and 10% in UE outage compared with a fixed-bandwidth channel-aware approach.
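The adjacency constraint the paper works around can be sketched as a sliding-window search for the best contiguous sub-carrier block (a minimal illustration, not the authors' algorithm):

```python
def best_contiguous_block(metric: list[float], size: int) -> tuple[int, float]:
    """Return (start index, mean metric) of the contiguous block of
    `size` sub-carriers with the best average channel metric, honouring
    the SC-FDMA adjacency constraint."""
    best_start = 0
    best_sum = window = sum(metric[:size])
    for start in range(1, len(metric) - size + 1):
        # Slide the window one sub-carrier to the right in O(1).
        window += metric[start + size - 1] - metric[start - 1]
        if window > best_sum:
            best_start, best_sum = start, window
    return best_start, best_sum / size

snr = [3.1, 9.0, 9.5, 8.8, 2.0, 4.0]   # per-sub-carrier metric (illustrative)
start, avg = best_contiguous_block(snr, 3)
print(start)  # best block starts at index 1 (covers the 9.0/9.5/8.8 peak)
```

A channel-aware scheduler would run such a search per user and then resolve conflicts between overlapping best blocks, which is where the real algorithmic difficulty lies.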
Chapter
Sub-band non-overlapping full duplex (SBFD) simultaneously receives uplink (UL) and transmits downlink (DL) signals on the same time division duplex (TDD) carrier but on different frequency resources, which improves UL coverage and throughput. However, these benefits are greatly degraded by severe inter-SB cell-to-cell cross-link interference (CLI), which can also block UL communication with high probability. To cope with this challenge, the performance of coordinated beamforming (CBF) and minimum mean square error (MMSE) interference rejection combining (IRC) for inter-SB CLI suppression is studied. First, the characteristics and model of inter-SB CLI in the SBFD scenario are analyzed. Then, MMSE-IRC and detailed CBF schemes are proposed to reduce inter-SB CLI, and system-level simulations are provided to evaluate them. In the simulations, CBF suppresses inter-SB CLI by 20 dB and avoids the UL blocking issue, while the MMSE-IRC receiver improves UL coverage by 4.85 dB compared with legacy TDD.
Keywords: Sub-band non-overlapping full duplex; Coordinated beamforming; Interference rejection combining; Cross-link interference
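The MMSE-IRC combining studied above can be sketched for a single strong gNB-to-gNB interferer (illustrative channel values and noise level; not the chapter's simulation setup):

```python
import numpy as np

rng = np.random.default_rng(0)

def mmse_irc_weights(h, interferers, noise_var):
    """MMSE-IRC combining weights: invert the measured
    interference-plus-noise covariance, then match the desired channel."""
    n = h.shape[0]
    R = noise_var * np.eye(n, dtype=complex)
    for g in interferers:
        R += np.outer(g, g.conj())   # CLI contribution to the covariance
    return np.linalg.solve(R, h)     # w proportional to R^-1 h

def post_sinr(w, h, interferers, noise_var):
    """Post-combining SINR for weight vector w."""
    sig = abs(w.conj() @ h) ** 2
    inr = sum(abs(w.conj() @ g) ** 2 for g in interferers)
    return sig / (inr + noise_var * np.linalg.norm(w) ** 2)

n_rx = 4
h = rng.standard_normal(n_rx) + 1j * rng.standard_normal(n_rx)        # desired UL channel
g = 3 * (rng.standard_normal(n_rx) + 1j * rng.standard_normal(n_rx))  # strong inter-SB CLI
w_irc = mmse_irc_weights(h, [g], noise_var=0.1)
w_mrc = h  # maximum-ratio combining ignores the interferer
print(post_sinr(w_irc, h, [g], 0.1) > post_sinr(w_mrc, h, [g], 0.1))  # True
```

Because MMSE-IRC maximizes post-combining SINR, it never does worse than interference-blind MRC, and the gap grows with interferer power, which is the mechanism behind the coverage gain quoted above.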
Article
The 3rd generation partnership project (3GPP) radio access network (RAN) plenary recently approved a work package for its Release 18, which represents a major evolution and is branded as the first release of 5G Advanced. The work package includes diverse study and work items which will further boost 5G performance significantly and address a wide variety of new use cases. In this article, we provide an overview of the 5G Advanced evolution in 3GPP Release 18.
Chapter
Mobile wireless networks have evolved from the first generation of analog communications to the fourth-generation evolved universal terrestrial radio access network (E-UTRAN). E-UTRAN access technology is also referred to as long-term evolution (LTE), and its spectral efficiency is roughly 150 times that of first-generation analog access technology.
Article
Radio resource management algorithms ranging from bearer admission control to semi-persistent and dynamic packet scheduling, fast link adaptation, and transmission control of multi-antenna transmission modes are addressed in this article for UTRAN long-term evolution. First, a high-level system overview of LTE is given, with special emphasis on the important components related to RRM. The quality of service parameter framework is outlined, as one of the main objectives for the families of RRM algorithms is to maximize system capacity while serving all users according to their minimum QoS constraints. It is demonstrated how the collocation of the RRM algorithms at the base station with easy access to air interface measurements offers opportunities for efficient cross-functional optimization between layers 1, 2, and 3. Examples of performance results for different traffic mixes and antenna transmission schemes are also presented, and the article is concluded with recommendations on how to operate the various RRM options under different load and traffic conditions.
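One common flavour of the dynamic packet scheduling discussed above is the proportional-fair rule, sketched here with hypothetical per-user rates (an illustration, not the article's exact metric):

```python
def proportional_fair_pick(inst_rate: dict, avg_rate: dict) -> str:
    """Pick the user maximising the instantaneous-to-average rate ratio,
    trading cell throughput against fairness between users."""
    return max(inst_rate, key=lambda u: inst_rate[u] / avg_rate[u])

inst = {"ue1": 10.0, "ue2": 6.0}   # achievable rates this TTI (Mbps, hypothetical)
avg  = {"ue1": 8.0,  "ue2": 2.0}   # long-term served rates (Mbps, hypothetical)
print(proportional_fair_pick(inst, avg))  # -> ue2 (10/8 = 1.25 < 6/2 = 3.0)
```

Even though ue1 has the higher instantaneous rate, ue2 wins the slot because it has been served far below its potential, which is exactly the capacity-versus-QoS trade-off the article discusses.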
Conference Paper
In this study we analyze the downlink OFDMA system-level performance for three different channel quality indicator (CQI) reporting schemes. The effects of terminal measurement and estimation errors, quantization from formatting and compression, and uplink reporting delays and detection errors are included. We find that a simple threshold-based CQI scheme provides an attractive trade-off between downlink system-level performance and uplink CQI signaling overhead, as compared with a best-M scheme. When applied to the UTRAN LTE system in a 10 MHz bandwidth, we find that a frequency-domain packet scheduling gain of 40% is achievable with a CQI word size of only 30 bits. Finally, the effect of applying a so-called outer-loop link adaptation algorithm is reported.
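The threshold-based and best-M reporting schemes compared above can be sketched as follows (illustrative CQI values; the study's exact quantization and formatting are not reproduced):

```python
def threshold_cqi_report(cqi: list[int], threshold: int) -> list[int]:
    """Report only sub-bands whose CQI clears the threshold,
    trading some scheduling accuracy for lower UL overhead."""
    return [i for i, q in enumerate(cqi) if q >= threshold]

def best_m_cqi_report(cqi: list[int], m: int) -> list[int]:
    """Report the indices of the M strongest sub-bands."""
    return sorted(range(len(cqi)), key=lambda i: cqi[i], reverse=True)[:m]

cqi = [7, 12, 15, 9, 14, 5]           # per-sub-band CQI (illustrative)
print(threshold_cqi_report(cqi, 12))  # -> [1, 2, 4]
print(best_m_cqi_report(cqi, 2))      # -> [2, 4]
```

The threshold scheme's report length varies with channel conditions, while best-M is fixed-size; the study's finding is that the former's variable overhead averages out favourably.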
Guidelines for evaluation of radio interface technologies for IMT-2020
5G New Radio: A beam-based air interface