The development of cellular wireless systems has entered the phase in which 5G networks are being deployed and the foundations of 6G solutions are being identified. In parallel, however, another technological breakthrough is taking shape as the concept of open radio access networks comes into play. Together with advancing network virtualization and programmability, this may reshape the way radio access functionalities and services are designed, leading to modular and flexible implementations. This paper provides an overview of the idea of open radio access networks and presents ongoing O-RAN Alliance standardization activities in this context. The analysis is supported by a study of the traffic steering use case implemented in a modular way, following the open networking approach.
The next generation of wireless communication networks, or 6G, will fulfill the requirements of a fully connected world and provide ubiquitous wireless connectivity for all. Transformative solutions are expected to accommodate a rapidly growing number of intelligent devices and services. Major technological breakthroughs to achieve connectivity goals within 6G include: (i) a network operating at the THz band with much wider spectrum resources, (ii) intelligent communication environments that enable active signal transmission and reception within the wireless propagation environment, (iii) pervasive artificial intelligence, (iv) large-scale network automation, (v) an all-spectrum reconfigurable front-end for dynamic spectrum access, (vi) ambient backscatter communications for energy savings, (vii) the Internet of Space Things enabled by CubeSats and UAVs, and (viii) cell-free massive MIMO communication networks. In this roadmap paper, use cases for these enabling techniques as well as recent advances on related topics are highlighted, and open problems with possible solutions are discussed, followed by a development timeline outlining worldwide efforts toward the realization of 6G. Going beyond 6G, promising early-stage technologies such as the Internet of NanoThings, the Internet of BioNanoThings, and quantum communications, which are expected to have a far-reaching impact on wireless communications, are also discussed at length.
Handover failures and ping-pongs are two common thorny issues in modern mobile networks. While handover failures caused by radio link failure (RLF) significantly reduce the reliability of network operation, ping-pongs drastically waste signaling resources. In the upcoming fifth-generation (5G) networks especially, a complex deployment of small cells can exacerbate both problems, even when the network is integrated with a self-organizing network (SON), an automation-based solution. Because RLFs and ping-pongs are coupled, as explained in the literature, it is difficult to analyze handovers and minimize both simultaneously. In this paper, we model the handover procedure with geometric elements (Apollonian circles and straight lines) and analyze handover performance. The analysis provides an optimal handover setting that minimizes RLFs and ping-pongs together, whereas previous works only considered trade-offs between them. We show that our analysis accurately estimates the optimal setting by comparing it with an NS-3 simulation. The analysis also shows that different environments can require different optimal values: fading (as well as interference) limits the optimal values, user speed has a scaling impact, and time-to-trigger has a shifting effect.
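To make the geometric connection concrete, the following sketch (our own notation, assuming equal transmit powers and a log-distance path-loss model with exponent alpha) shows why a hysteresis-based trigger boundary is an Apollonian circle.

```latex
% Handover from serving BS $s$ to target BS $t$ is triggered when the received
% power from $t$ exceeds that from $s$ by a hysteresis margin $m$ (dB). With
% equal transmit powers and path loss $P_i = P_0 - 10\alpha\log_{10} d_i$,
% the trigger condition $P_t - P_s \ge m$ becomes
\[
  10\alpha\,\log_{10}\frac{d_s}{d_t} \;\ge\; m
  \quad\Longleftrightarrow\quad
  \frac{d_s}{d_t} \;\ge\; 10^{\,m/(10\alpha)} \;=:\; k .
\]
% For $k \neq 1$, the boundary $d_s/d_t = k$ is the locus of points whose
% distances to the two base stations have a fixed ratio: an Apollonian circle.
```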
Ultra-dense networks represent the trend for future 5G wireless networks, which can provide high transmission rates in dense urban environments. However, such networks require a massive number of small cells to be deployed, which increases interference and the number of handovers (HOs) in heterogeneous networks (HetNets). In such a scenario, mobility management becomes an important issue for guaranteeing seamless communication while the user moves among cells. In this paper, we propose an auto-tuning optimization (ATO) algorithm that utilizes user speed and reference signal received power to adapt the HO margin and time-to-trigger. The proposed algorithm aims to reduce the number of frequent HOs and the HO failure (HOF) ratio. Its performance is evaluated through simulation with a two-tier model consisting of 4G and 5G networks. Simulation results show that the proposed algorithm significantly reduces the average rates of ping-pong HOs and HOF compared with other algorithms from the literature. In addition, the ATO algorithm achieves a low call drop rate and reduces HO delay and interruption time during user mobility in HetNets.
A large number of small cells is expected to be deployed in next-generation mobile networks to satisfy 5G requirements. Mobility management is one of the important issues requiring considerable attention in heterogeneous networks, where 5G ultra-dense small cells coexist with current 4G networks. An efficient handover (HO) mechanism is introduced to address this issue and improve mobility management by adjusting the HO control parameters (HCPs), namely the time-to-trigger and the HO margin. This paper proposes dynamic HCPs (D-HCPs), which exploit user experience to adjust the HCPs and make HO decisions in a self-optimizing manner. D-HCPs classifies HO failures (HOFs) into three categories, namely too-late, too-early, and wrong-cell HO, and adjusts the HCPs according to the dominant HOF. The algorithm is evaluated using different performance metrics, such as HO ping-pong, radio link failure, and interruption time, under different mobile speed scenarios. Simulation results show that the proposed D-HCPs algorithm adaptively optimizes the HCPs and outperforms other algorithms from the literature.
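As a hedged illustration of the failure-driven idea (not the paper's exact rules), the sketch below follows the usual mobility-robustness reasoning: too-late HOs call for earlier triggering, while too-early and wrong-cell HOs call for more conservative settings. All step sizes, bounds, and TTT levels are assumed example values.

```python
# Illustrative sketch of failure-driven HCP adaptation (assumed step sizes and
# bounds, not the paper's exact rules). HOM in dB, TTT in ms.
def adjust_hcps(hom, ttt, n_too_late, n_too_early, n_wrong_cell,
                hom_step=0.5, ttt_steps=(40, 64, 80, 100, 128, 160, 256, 320)):
    """Shift HOM/TTT toward fixing the dominant handover-failure category."""
    dominant = max(
        [("too_late", n_too_late), ("too_early", n_too_early),
         ("wrong_cell", n_wrong_cell)],
        key=lambda kv: kv[1],
    )[0]
    i = min(range(len(ttt_steps)), key=lambda j: abs(ttt_steps[j] - ttt))
    if dominant == "too_late":           # trigger earlier
        hom = max(0.0, hom - hom_step)
        i = max(0, i - 1)
    else:                                # too-early or wrong-cell: be conservative
        hom = min(10.0, hom + hom_step)
        i = min(len(ttt_steps) - 1, i + 1)
    return hom, ttt_steps[i]

# Example: too-late failures dominate, so both HOM and TTT decrease.
print(adjust_hcps(hom=3.0, ttt=160, n_too_late=12, n_too_early=3, n_wrong_cell=1))
```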
In this paper, an Adaptive Handover Margin algorithm based on a Novel Weight Function (AHOM-NWF) is proposed for Carrier Aggregation operation in the Long Term Evolution-Advanced system. The AHOM-NWF algorithm automatically adjusts the Handover Margin level based on three functions, f(SINR), f(TL), and f(v), which are evaluated as functions of the Signal-to-Interference-plus-Noise Ratio (SINR), Traffic Load (TL), and user velocity (v), respectively. The weight of each function is taken into account in order to estimate an accurate margin level, and a simple mathematical model is formulated to estimate these weights. The AHOM-NWF algorithm contributes to SINR improvement, cell-edge spectral efficiency enhancement, and outage probability reduction. Simulation results show that the AHOM-NWF algorithm enhances system performance more than the other considered algorithms from the literature, with average gains of 24.4%, 14.6%, and 17.9% over all the considered algorithms in terms of SINR, cell-edge spectral efficiency, and outage probability reduction, respectively.
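A minimal sketch of the weighted-function idea follows. The functional forms, normalization ranges, and weights below are illustrative assumptions, since the abstract only specifies that the margin is estimated from weighted f(SINR), f(TL), and f(v).

```python
# Minimal sketch of a weighted-function HOM estimator in the spirit of AHOM-NWF.
# Functional forms, ranges, and weights are assumptions for illustration only.
def hom_level(sinr_db, load, speed_kmh,
              w=(0.4, 0.3, 0.3), hom_min=0.0, hom_max=10.0):
    f_sinr = min(max(sinr_db / 30.0, 0.0), 1.0)    # better SINR -> larger margin
    f_load = min(max(load, 0.0), 1.0)              # higher load -> larger margin
    f_speed = 1.0 - min(speed_kmh / 120.0, 1.0)    # faster user -> smaller margin
    score = w[0] * f_sinr + w[1] * f_load + w[2] * f_speed
    return hom_min + score * (hom_max - hom_min)

print(f"HOM = {hom_level(sinr_db=15, load=0.6, speed_kmh=60):.2f} dB")
```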
To satisfy the requirements of future mobile networks, a large number of small cells should be deployed. In such a scenario, mobility management becomes a critical issue in order to ensure seamless connectivity with reasonable overhead. In this paper, we propose a fuzzy logic-based scheme exploiting user velocity and radio channel quality to adapt the hysteresis margin for the handover decision in a self-optimizing manner. The objective of the proposed algorithm is to reduce the number of redundant handovers and the handover failure ratio while allowing users to exploit the benefits of dense small cell deployment. Simulation results show that our proposed algorithm efficiently suppresses the ping-pong effect and keeps it at a negligible level (below 1%) in all investigated scenarios. Moreover, the handover failure ratio and the total number of handovers are notably reduced with respect to existing algorithms, especially in scenarios with a large number of small cells. In addition, the proposed scheme keeps the time users spend connected to small cells at a similar level as the competing algorithms, so the benefits of dense small cell deployment are preserved for the users.
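To illustrate the flavor of such a controller, here is a minimal zero-order Sugeno-style fuzzy sketch; the membership functions, rule consequents, and input universes are assumptions, not the paper's rule base.

```python
# Minimal zero-order Sugeno fuzzy sketch for hysteresis adaptation.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_hysteresis(speed_kmh, rsrq_db):
    slow, fast = tri(speed_kmh, -1, 0, 60), tri(speed_kmh, 30, 120, 200)
    bad, good = tri(rsrq_db, -25, -19, -12), tri(rsrq_db, -15, -8, 0)
    # Rules: (firing strength, crisp hysteresis consequent in dB)
    rules = [
        (min(slow, good), 6.0),   # slow user, good channel -> large hysteresis
        (min(slow, bad),  4.0),
        (min(fast, good), 3.0),
        (min(fast, bad),  1.0),   # fast user, bad channel -> hand over quickly
    ]
    num = sum(w * y for w, y in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 3.0  # fallback default

print(f"hysteresis = {fuzzy_hysteresis(speed_kmh=90, rsrq_db=-16):.2f} dB")
```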
Network densification is regarded as one of the important ingredients to increase capacity in next generation mobile communication networks. However, it also leads to mobility problems, since users are more likely to hand over to another cell in dense or even ultra-dense mobile communication networks. Supporting seamless and robust connectivity through such networks therefore becomes a very important issue. In this paper, we investigate handover (HO) optimization in next generation mobile communication networks. We propose a data-driven handover optimization (DHO) approach, which aims to mitigate mobility problems including too-late HO, too-early HO, HO to wrong cell, ping-pong HO, and unnecessary HO. The key performance indicator (KPI) is defined as the weighted average of the ratios of these mobility problems. The DHO approach collects data from mobile communication measurement results and provides a model to estimate the relationship between the KPI and features of the collected dataset. Based on the model, the handover parameters, including the handover margin and time-to-trigger, are optimized to minimize the KPI. Simulation results show that the proposed DHO approach can effectively mitigate mobility problems.
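The KPI-and-search structure can be sketched as follows; the grid values, weights, and the toy stand-in for the fitted model are assumptions, as the paper learns its model from measurement data.

```python
# Sketch of KPI-driven parameter search in the spirit of DHO: the KPI is a
# weighted average of mobility-problem ratios, minimized over a (HOM, TTT)
# grid using a model fitted to measurements (here a toy placeholder).
from itertools import product

def kpi(ratios, weights):
    """Weighted average of the mobility-problem ratios
    (too-late, too-early, wrong-cell, ping-pong, unnecessary HO)."""
    return sum(w * r for w, r in zip(weights, ratios)) / sum(weights)

def optimize_hcps(predict_ratios, weights,
                  hom_grid=(1, 2, 3, 4, 5, 6),
                  ttt_grid=(40, 80, 160, 320, 640)):
    """Grid-search (HOM, TTT) minimizing the predicted KPI;
    predict_ratios(hom, ttt) stands in for the fitted data-driven model."""
    return min(product(hom_grid, ttt_grid),
               key=lambda p: kpi(predict_ratios(*p), weights))

# Toy stand-in model: small HOM/TTT inflate early/ping-pong problems,
# large HOM/TTT inflate too-late failures (purely illustrative).
def toy_model(hom, ttt):
    early = 0.02 * max(0, 3 - hom) + 0.0001 * max(0, 160 - ttt)
    late = 0.02 * max(0, hom - 3) + 0.0001 * max(0, ttt - 160)
    return (late, early, 0.01, early, 0.005)

print(optimize_hcps(toy_model, weights=(3, 2, 2, 1, 1)))
```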
What will 5G be? What it will not be is an incremental advance on 4G. The previous four generations of cellular technology have each been a major paradigm shift that has broken backwards compatibility. And indeed, 5G will need to be a paradigm shift that includes very high carrier frequencies with massive bandwidths, extreme base station and device densities and unprecedented numbers of antennas. But unlike the previous four generations, it will also be highly integrative: tying any new 5G air interface and spectrum together with LTE and WiFi to provide universal high-rate coverage and a seamless user experience. To support this, the core network will also have to reach unprecedented levels of flexibility and intelligence, spectrum regulation will need to be rethought and improved, and energy and cost efficiencies will become even more critical considerations. This paper discusses all of these topics, identifying key challenges for future research and preliminary 5G standardization activities, while providing a comprehensive overview of the current literature, and in particular of the papers appearing in this special issue.
High-speed data applications over wireless networks have been growing rapidly in recent years. With this increased use of wireless data, services in wireless networks require performance guarantees. This is driving the need for regular innovations in wireless technologies to provide more and more capacity and higher quality of service (QoS). These higher performance requirements have motivated the 3rd Generation Partnership Project (3GPP) to work on LTE-Advanced, a technology enhancement to Long Term Evolution (LTE) that is being evaluated against the requirements of IMT-Advanced. LTE-Advanced includes a few mobility enhancements to assure good performance at the time of handover. The generic handover procedure of LTE-Advanced builds upon the one developed for LTE and minimizes the handover interruption time. This tutorial article gives an overview of the handover procedure of LTE-Advanced and analyzes the handover interruption time in Time Division Duplex (TDD) and Frequency Division Duplex (FDD) modes. The analysis shows that the handover interruption time of LTE-Advanced complies with the IMT-Advanced requirement.
Self-organizing networks (SONs) aim to raise the level of automated operation in next-generation networks. One of the use cases defined in this field is the optimization of the handover (HO) process, which involves a tradeoff between the amount of signaling load due to HOs and the quality of the active connections in the network. In this paper, first, a sensitivity analysis of the two main HO parameters, i.e., the HO margin (HOM) and the time-to-trigger (TTT), is carried out for different system load levels and user speeds in a Long-Term Evolution (LTE) network. Second, a fuzzy logic controller (FLC) that adaptively modifies HOMs is designed for HO optimization. In this case, different parameter optimization levels (network-wide, cell-wide, and cell-pair-wide) and the impact of measurement errors have been considered. Results of the sensitivity analysis show that tuning HOMs is an effective solution for HO optimization in LTE networks. In addition, the FLC is shown as an effective technique to adapt HOM to different network conditions so that the signaling load in the network is decreased while an admissible level of call dropping is achieved.
6G is expected to support unprecedented Internet of Everything scenarios with extremely diverse and challenging requirements. To fulfill such diverse requirements efficiently, 6G is envisioned as a space-aerial-terrestrial-ocean integrated three-dimensional network with different types of slices enabled by new technologies and paradigms that make the system more intelligent and flexible. As 6G networks become increasingly complex, heterogeneous, and dynamic, it is very challenging to achieve efficient resource utilization, seamless user experience, and automatic management and orchestration. With the advancement of big data processing technology, computing power, and the availability of rich data, it is natural to tackle complex 6G network issues by leveraging artificial intelligence (AI). In this paper, we present a comprehensive survey of AI-empowered networks evolving towards 6G. We first present the vision of an AI-enabled 6G system, the driving forces for introducing AI into 6G, and the state of the art in machine learning. We then extensively discuss the application of machine learning techniques to major 6G network issues, including the advanced radio interface, intelligent traffic control, security protection, management and orchestration, and network optimization. Moreover, we review the latest progress of major standardization initiatives and industry research programs on applying machine learning to mobile networks evolving towards 6G. Finally, we identify important open issues to inspire further studies towards an intelligent, efficient, and secure 6G system.
Wireless communication technologies such as fifth generation mobile networks (5G) will not only support a 1000-fold increase in Internet traffic in the next decade but will also provide the underlying technologies for entire industries to support Internet of Things (IoT) applications. Compared to existing mobile communication techniques, 5G has more varied applications, and its corresponding system design is more complicated. The resurgence of artificial intelligence (AI) techniques offers an alternative that may surpass traditional approaches in performance. Typical and potential research directions in which AI can make promising contributions need to be identified, evaluated, and investigated. To this end, this study first provides an overview of several promising research directions in AI for 5G, based on an understanding of the key 5G technologies. In addition, the study focuses on design paradigms including 5G network optimization, optimal resource allocation, 5G physical layer unified acceleration, end-to-end physical layer joint optimization, and so on.
Channel state information (CSI) estimation is one of the most fundamental problems in wireless communication systems. Various methods have been developed to conduct CSI estimation, usually requiring high computational complexity, which makes them unsuitable for 5G wireless communications employing techniques such as massive MIMO, OFDM, and millimeter-wave (mmWave) transmission. In this paper, we propose an efficient online CSI prediction scheme, called OCEAN, for predicting CSI from historical data in 5G wireless communication systems. Specifically, we first identify several important features affecting the CSI of a radio link, so that a data sample consists of the information of these features together with the CSI. We then design a learning framework that combines a convolutional neural network (CNN) with a long short-term memory (LSTM) network. We further develop an offline-online two-step training mechanism, enabling the prediction results to be more stable when applied in practical 5G wireless communication systems. To validate OCEAN's efficacy, we conduct extensive experiments on four typical case studies, i.e., two outdoor and two indoor scenarios. The experimental results show that OCEAN not only obtains the predicted CSI values very quickly but also achieves highly accurate CSI prediction, with an average difference ratio (ADR) between the predicted and measured CSI of only 2.650%-3.457%.
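To illustrate the kind of CNN-plus-LSTM architecture such a scheme combines, here is a minimal PyTorch sketch; the layer sizes, feature layout, and input shape are assumptions, not the paper's OCEAN configuration.

```python
# Minimal CNN+LSTM sketch for next-step CSI prediction (illustrative only).
import torch
import torch.nn as nn

class CnnLstmCsiPredictor(nn.Module):
    def __init__(self, n_subcarriers=64, hidden=128):
        super().__init__()
        # Per-time-step 1-D CNN over the subcarrier axis of historical CSI.
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(16),
        )
        # LSTM over the sequence of CNN feature vectors.
        self.lstm = nn.LSTM(input_size=32 * 16, hidden_size=hidden,
                            batch_first=True)
        self.head = nn.Linear(hidden, n_subcarriers)  # next-step CSI magnitudes

    def forward(self, x):                  # x: (batch, time, subcarriers)
        b, t, s = x.shape
        feats = self.cnn(x.reshape(b * t, 1, s)).reshape(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])       # predict CSI at the next step

model = CnnLstmCsiPredictor()
history = torch.randn(8, 10, 64)           # 8 samples, 10 past steps, 64 subcarriers
print(model(history).shape)                # torch.Size([8, 64])
```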
Mathematical models of real processes cannot contemplate every aspect of reality. Simplifying assumptions have to be made, especially when the models are going to be used for control purposes, where models with simple structures (linear in most cases) and sufficiently small size have to be used due to the available control techniques and real-time considerations. Thus, mathematical models, especially control models, can only describe the dynamics of a process in an approximate way.
The control problem was formulated in the previous chapters considering all signals to possess an unlimited range. This is not very realistic because in practice all processes are subject to constraints. Actuators have a limited range of action and a limited slew rate, as is the case of control valves limited by a fully closed and fully open position and a maximum slew rate. Constructive or safety reasons, as well as sensor range, cause bounds in process variables, as in the case of levels in tanks, flows in pipes, and pressures in deposits. Furthermore, in practice, the operating points of plants are determined to satisfy economic goals and lie at the intersection of certain constraints. The control system normally operates close to the limits and constraint violations are likely to occur. The control system, especially for long-range predictive control, has to anticipate constraint violations and correct them in an appropriate way. Although input and output constraints are basically treated in the same way, as is shown in this chapter, the implications of the constraints differ. Output constraints are mainly due to safety reasons and must be controlled in advance, because output variables are affected by process dynamics. Input (or manipulated) variables can always be kept within bounds by the controller by clipping the control action to a value satisfying amplitude and slew rate constraints.
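As a minimal sketch of the clipping mentioned above for input constraints (output constraints, by contrast, must be anticipated through the process dynamics), the following uses assumed example limits for a valve-like actuator.

```python
# Minimal sketch: keep a manipulated variable within amplitude and slew-rate
# bounds by clipping. All limits are assumed example values.
def clip_control(u_desired, u_prev, u_min=0.0, u_max=100.0, du_max=5.0):
    """Clip the control action to the amplitude range [u_min, u_max] and to
    a maximum change of du_max per sampling period."""
    u = min(max(u_desired, u_prev - du_max), u_prev + du_max)  # slew-rate limit
    return min(max(u, u_min), u_max)                           # amplitude limit

# Example: a large requested jump is rate-limited to 5 units per step.
u = 50.0
for u_des in (90.0, 90.0, 90.0):
    u = clip_control(u_des, u)
    print(u)   # 55.0, 60.0, 65.0
```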
From power plants to sugar refining, model predictive control (MPC) schemes have established themselves as the preferred control strategies for a wide variety of processes.
The second edition of Model Predictive Control provides a thorough introduction to theoretical and practical aspects of the most commonly used MPC strategies. It bridges the gap between the powerful but often abstract techniques of control researchers and the more empirical approach of practitioners. Model Predictive Control demonstrates that a powerful technique does not always require complex control algorithms.
The text features material on the following subjects:
general MPC elements and algorithms;
commercial MPC schemes;
generalized predictive control;
multivariable, robust, constrained nonlinear and hybrid MPC;
fast methods for MPC implementation;
applications.
All of the material is thoroughly updated for the second edition, with the chapters on nonlinear MPC, MPC and hybrid systems, and MPC implementation being entirely new. Many new exercises and examples have also been added throughout, and Matlab® programs to aid in their solution can be downloaded from the authors' website at http://www.esi.us.es/MPCBOOK. The text is an excellent aid for graduate and advanced undergraduate students and will also be of use to researchers and industrial practitioners wishing to keep abreast of a fast-moving field.
In this paper, we address the problem of reconstructing coverage maps from path-loss measurements in cellular networks. We propose and evaluate two kernel-based adaptive online algorithms as an alternative to typical offline methods. The proposed algorithms are application-tailored extensions of powerful iterative methods such as the adaptive projected subgradient method and a state-of-the-art adaptive multikernel method. Assuming that the moving trajectories of users are available, it is shown by example how side information can be incorporated in the algorithms to improve their convergence performance and the quality of estimation. The complexity is significantly reduced by imposing sparsity-awareness, in the sense that the algorithms exploit the compressibility of the measurement data to reduce the amount of data that is saved and processed. Finally, we present extensive simulations based on realistic data to show that our algorithms provide fast, robust estimates of coverage maps in real-world scenarios. Envisioned applications include path-loss prediction along trajectories of mobile users as a building block for anticipatory buffering or traffic offloading.
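To give a feel for online kernel-based map estimation, here is a simplified sketch using a normalized kernel least-mean-squares update with coherence-based sparsification; it stands in for, but does not reproduce, the paper's APSM and multikernel algorithms, and the kernel width, step size, and threshold are assumed.

```python
# Simplified online kernel estimate of a path-loss map (illustrative only).
import numpy as np

class OnlinePathLossMap:
    def __init__(self, step=0.5, width=50.0, coherence_max=0.95):
        self.step, self.width, self.cmax = step, width, coherence_max
        self.centers, self.coefs = [], []

    def _k(self, x, c):
        """Gaussian kernel between position x and dictionary center c."""
        return np.exp(-np.sum((np.asarray(x) - np.asarray(c)) ** 2)
                      / (2 * self.width ** 2))

    def predict(self, x):
        return sum(a * self._k(x, c) for a, c in zip(self.coefs, self.centers))

    def update(self, x, y):
        """One measurement (position x, path loss y): normalized LMS step on
        the kernel-expansion coefficients; x becomes a new center only if it
        is not already well represented (coherence-based sparsification)."""
        err = y - self.predict(x)
        if not self.centers or max(self._k(x, c) for c in self.centers) < self.cmax:
            self.centers.append(np.asarray(x, dtype=float))
            self.coefs.append(0.0)
        kvals = [self._k(x, c) for c in self.centers]
        norm = sum(k * k for k in kvals) + 1e-12
        for i, k in enumerate(kvals):
            self.coefs[i] += self.step * err * k / norm

# Example: learn from measurements along a synthetic user trajectory.
rng = np.random.default_rng(0)
est = OnlinePathLossMap()
for t in range(200):
    pos = np.array([5.0 * t, 30.0 * np.sin(t / 20)])    # trajectory
    loss = 120 + 0.05 * pos[0] + rng.normal(0, 2)       # toy ground truth (dB)
    est.update(pos, loss)
print(round(est.predict(np.array([500.0, 10.0])), 1))
```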