Conference Paper

5G New Radio User Equipment Power Modeling and Potential Energy Savings

... Energy efficiency is one of the key performance indicators in 5G New Radio (NR) systems and is important for both the UE and the BS sides. 5G NR technology defines several techniques to conserve energy, such as bandwidth adaptation [13], the use of radio resource control (RRC), cross-slot scheduling [14], the DRX mechanism [11], [15], [16], and others. Among these options, DRX is known to have the greatest effect on the energy performance of UEs. ...
... Accordingly, when parameterizing this mode, the QoS requirements must be taken into account. The power consumption values, measured in power units (PU), are provided in Table I [12], [15], [18]. ...
... A critical conclusion is that long UE sleep periods may drastically improve energy efficiency at the expense of user performance metrics such as time in outage and latency. In [15], a detailed DRX power consumption model has been developed. The authors then showed that utilizing the short DRX sleep state can reduce power consumption by 26% and latency by 74% compared to the basic long DRX timer option. ...
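As a rough illustration of how per-state power units of this kind translate into an average-power figure, the following Python sketch (not taken from the cited papers; all state powers and timer durations are hypothetical placeholders) averages power over one DRX cycle, once with the receiver kept on for the whole inactivity window and once napping in light sleep, which is the kind of saving the short DRX state provides.

# Minimal sketch of averaging per-state power units (PU) over one DRX cycle.
# State powers and durations are illustrative placeholders, not values from the cited papers.
def avg_power(schedule):
    """schedule: list of (duration_ms, power_pu) pairs covering one DRX cycle."""
    total_time = sum(d for d, _ in schedule)
    total_energy = sum(d * p for d, p in schedule)
    return total_energy / total_time

P_ON, P_LIGHT_SLEEP, P_DEEP_SLEEP = 100, 20, 1   # hypothetical PU values

# Baseline: receiver stays ON while the inactivity window (100 ms) runs out.
baseline = [(8, P_ON), (100, P_ON), (52, P_DEEP_SLEEP)]
# With short DRX: the UE naps in light sleep during that window instead.
with_short_drx = [(8, P_ON), (100, P_LIGHT_SLEEP), (52, P_DEEP_SLEEP)]

print(f"baseline       : {avg_power(baseline):.1f} PU")
print(f"with short DRX : {avg_power(with_short_drx):.1f} PU")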
Article
Full-text available
Energy efficiency and service reliability are two critical requirements for 5G New Radio cellular access. To address the latter, 3GPP has proposed multiconnectivity operation, allowing user equipment (UE) to maintain active links to more than a single base station. However, the use of this technique compromises UE energy efficiency. In this paper, we develop a mathematical model capturing key energy and performance indicators as a function of system and environmental conditions. We then apply it to investigate the trade-offs between user performance and energy efficiency, as well as the effect of scaling the discontinuous reception (DRX) timers. For the considered set of system parameters, our results reveal that for low micromobility speed (≤ 0.1°/s) and low blocker density (≤ 0.1 bl./m²), two simultaneously supported links with minimal DRX timers lead to optimal performance. For higher blocker densities, more than two links are needed to optimize energy efficiency, while for high micromobility speeds multiconnectivity does not improve energy efficiency at all. Thus, the optimal degree of multiconnectivity and the DRX timer scaling coefficients depend on environmental characteristics, including both micromobility speed and blocker density, and need to be dynamically updated during UE operation.
... There are also other power saving mechanisms besides WUR. In [13], [14], a basic WUS is designed and compared with microsleep and GTS, respectively. The difference between PDCCH-based WUS and a secondary radio for WUS is also discussed in [14]. ...
... In [13], [14], a basic WUS is designed and compared with microsleep and GTS respectively. The difference between PDCCH-based WUS and secondary radio for WUS is also discussed in [14]. Oller et al. [15] further simulated a secondary WUR on a real hardware platform. ...
Article
Full-text available
The 3rd generation partnership project (3GPP) introduced the Downlink Control information for Power saving (DCP) signal as a Wake Up Signal (WUS) integrated into the discontinuous reception (DRX) mechanism. This signal plays a key role in allowing the gNB to intelligently optimize the UE's power efficiency. In our preliminary research, we discovered that the DCP mechanism can enhance power savings for UEs running a latency-aware service. In this study, we present a novel DCP DRX scheduling strategy tailored to the latency requirements of a multi-service scenario. Our approach involves an analytical model based on a discrete Markov chain, complemented by simulations that validate its efficacy. The simulations assess the improvement in DRX power savings achieved by transitioning from standard DCP to a latency-aware DCP configuration. Results demonstrate that applying the proposed DCP scheduling algorithm allows the UE to increase its sleep ratio by over 10%, without introducing unacceptable latency.
... The authors in [24] presented an implementation of a WUS-based Internet of Things (IoT) device. In [25], the power saving signal and the go-to-sleep signal are compared. A hybrid method is adopted in [26]-[28]; that is, a secondary receiver wakes up the device, while the secondary receiver only monitors the WUS at specific monitoring occasions instead of always being active. ...
... The case only holds when the packet arrives after the pivot DCP, and the latency is the same no matter whether the next cycle is D or F. As a result, by combining the expressions in (4) and (5), we get (24) and the latency distribution in (25). ...
Article
Full-text available
The Discontinuous Reception (DRX) mechanism is designed by the Third Generation Partnership Project (3GPP) for power saving in the Long-Term Evolution and the New Radio technologies. In Release 16, 3GPP designed an advanced signal called Downlink Control Information of Power Saving (DCP). Energy-critical devices can conserve more energy by turning off the receiving module for a long period according to the DCP indication from the next-generation base station (gNB). However, a possible failure of DCP potentially causes extra power consumption or packet loss. To handle the problem, 3GPP allows the gNB to configure the default behavior when the DCP is not detected. In this paper, we accurately model the two DCP mechanisms with different default settings using a three-state discrete-time Markov model. Moreover, we also propose a method that efficiently finds a proper parameter configuration under latency and reliability constraints. Through simulation, we validate the accuracy of the model. In addition, we conduct another simulation to show that the proposed method finds a parameter set yielding near-optimal performance within a short execution time. The derived model, along with the proposed parameter setting method, serves as a solid basis for future research on power saving and sustainability.
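The following Python sketch illustrates the general modeling style used above, not the authors' exact chain: a three-state discrete-time Markov chain over monitoring, light-sleep, and deep-sleep states whose stationary distribution yields the sleep ratio. The transition probabilities are arbitrary placeholders.

# Generic three-state discrete-time Markov chain sketch (illustrative transition
# probabilities, not the paper's model): states are monitor / light sleep / deep sleep.
import numpy as np

P = np.array([
    [0.6, 0.3, 0.1],   # monitor     -> monitor / light sleep / deep sleep
    [0.2, 0.5, 0.3],   # light sleep -> ...
    [0.1, 0.0, 0.9],   # deep sleep  -> ...
])

# Solve pi = pi P together with sum(pi) = 1 for the stationary distribution.
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

sleep_ratio = pi[1] + pi[2]
print("stationary distribution:", np.round(pi, 3))
print("sleep ratio            :", round(float(sleep_ratio), 3))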
... Thus, we provide some related work, summarized in Table VII, in the following. The fundamental WUR mechanism for DRX is presented in [162], [163]. The papers showed that the per-state UE power consumption models from 3GPP technical report TR 38.840 can be used. ...
... Although GTS is less discussed in standardization meetings, it is still worth further study. We found, based on simulation estimates, that fundamental GTS designs can reduce the on duration to ease power consumption [163]. Power Saving Indication, embedded in the beam training procedure, is a comparable concept to GTS and works similarly to improve power efficiency [149]. ...
Article
Full-text available
The Discontinuous Reception (DRX) mechanism is the most effective timer-based mechanism for User Equipment (UE) power saving. In Long Term Evolution (LTE) systems, the development of the DRX mechanism enormously extended UE battery life. With the DRX mechanism, a UE is allowed to enter a dormant state. Given a DRX cycle, the UE needs to wake up periodically during the dormancy to check whether it has received new downlink packets. The UE can achieve a high sleeping ratio by skipping most channel monitoring occasions. As the mobile network evolved to 5G, the battery life requirement increased to support various new services. The 3rd Generation Partnership Project (3GPP) also enhanced the DRX mechanism and added new DRX-related features in the New Radio (NR) Release 16 standard. In addition to the timer-based design, 3GPP proposed two signaling-based mechanisms: the power saving signal and UE assistance information. This survey paper introduces the latest DRX mechanism in the 3GPP NR standard and summarizes the state-of-the-art research. Researchers have investigated the DRX mechanism in various use cases, such as web browsing services and heterogeneous networks. They focus on the UE sleep ratio and packet delay and propose corresponding analytical models. New DRX architectures are also discussed to address the power-saving problem in specific schemes, especially in 5G NR networks. This paper categorizes and presents the papers according to the target services and the network scenarios in detail. We also survey the work focusing on new challenges (such as beamforming and thermal issues) in the NR network and introduce future research directions in the 6G era.
... These methods can be simultaneously implemented based on various use cases. For example, using DRX combined with PDCCH optimization schemes (i.e., WUS or cross-slot scheduling) can reduce power consumption by up to 20% [162]. While those UE-level power-saving techniques in 3GPP Rel-16 offer significant benefits, they also involve trade-offs, as indicated in Table 10. ...
Article
In the context of the Internet of Things (IoT), aerial computing platforms (ACPs) such as unmanned aerial vehicles and high-altitude platforms with edge computing capabilities have the potential to significantly expand coverage, enhance performance, and handle complex computational tasks for IoT devices (IoTDs). Non-orthogonal multiple access (NOMA) has also emerged as a promising multiple access technology for advanced 5G networks. This paper presents a multi-ACP-enabled NOMA edge network, which enables heterogeneous ACPs to provide computational assistance to IoTDs. To minimize delay and energy consumption, we formulate a joint task offloading and resource allocation problem that considers IoTD association, offloading ratio, transmit power, and computational resource allocation variables. To address the complexity of the optimization problem, it is modeled as a multi-agent Markov decision process and solved using a multi-agent deep deterministic policy gradient (MADDPG)-based solution. Extensive simulation results demonstrate that the proposed MADDPG-based framework can remarkably adapt to the dynamic nature of multi-ACP-enabled NOMA edge networks. It consistently outperforms various benchmark schemes regarding energy efficiency and task processing delay across different simulated scenarios.
... This feature is the Discontinuous Reception (DRX) mechanism [5], although other EE schemes, such as bandwidth adaptation and dynamic carrier activation or deactivation, are also supported as additional power-saving mechanisms in 5G [3], [6]. ...
Conference Paper
Full-text available
Improving energy efficiency and extending the lifetime of User Equipment (UE) batteries are among the key performance requirements of Fifth-Generation (5G) networks. To realize these goals, ongoing studies are developing schemes that improve both. One of the mechanisms at the forefront of these studies is Discontinuous Reception (DRX). This mechanism, inherited from the Fourth-Generation (4G) wireless network, is crucial, as its performance against the 5G requirements has a great impact on user satisfaction. Although the DRX mechanism has been adapted to meet the power-saving needs of 5G wireless networks, the energy cost of operating at millimetre-wave frequencies remains a concern. In this paper, recent research solutions that consider the application of the DRX mechanism in association with key enabling features and factors such as paging, RRC_INACTIVE, mmWave, and UE mobility are discussed for 5G use cases. The effect of these features and factors is considered, as well as recent solutions to address the challenges identified. Simulation results of this approach showed an average of 399 mW of power consumed over a 24-hour period, which represents a 6% mean reduction in UE power consumption compared to the existing scheme.
... For mechanism enhancements, DRX is modified for different transmission scenarios like dual connectivity and carrier aggregation (DC/CA) [1], license-assisted access (LAA) [2], and device-to-device (D2D) transmission [3]. WUS applies to general scenarios, and the authors in [4], [5] designed and analyzed the performance of WUS-based DRX. To capture the features of DCP, we modeled the basic DCP operation and gave insights into how the DCP configuration affects the system performance in our previous work [6]. ...
Conference Paper
Full-text available
Pursuing a sustainable wireless communication technology, the third generation partnership project (3GPP) designed the discontinuous reception (DRX) mechanism for power saving. It further enhanced the mechanism with the downlink control signal of power saving (DCP). With DCP, a user equipment (UE) saves more energy by turning off its transceiver module for a longer period. To exploit the advantage of the DCP, in this paper we propose a deferring scheduling policy called LADCP and its enhancement, eLADCP. We derive the analytical models and verify them through simulations, and two important conclusions follow from the simulation results. First, the model accurately matches the simulation, which provides a stable basis for future research. Second, the proposed LADCP and eLADCP methods effectively improve energy conservation compared with the previous ones.
... In [23], a heuristic algorithm is proposed to concurrently maximize spectrum utilization and minimize UE wake-up time. In [18], the power saving potential in the radio resource control connected mode is studied in detail, considering only one UE in a network without CA. In [19], the effect of CC activation on UE power consumption is studied for different traffic models (FTP and web browsing). ...
Article
Full-text available
As one of the key technologies in 5G networks, Carrier Aggregation (CA) is studied in this paper. In CA, Component Carriers (CCs) can be activated and deactivated depending on multiple factors, e.g., energy consumption and Quality of Service (QoS) demand of users. We propose CC management strategies where each User Equipment (UE) minimizes its average delay and at the same time minimizes its power consumption while considering that CCs can be activated and deactivated only at certain times, as in real-world CA implementations. We first model the problem as a centralized multi-objective optimum CC management problem. Since centralized approaches would impose a large overhead on the system, we then develop a semi-distributed solution by modeling the problem as a stochastic game and propose a multi-agent Double Deep Q-Network (DDQN) based CC management algorithm to solve the stochastic game. We finally compare the proposed approaches with single CC activation and all-CC activation baseline schemes. Simulation results show that our proposed algorithms outperform the all-CC algorithm in terms of UE power consumption and are capable of transmitting a comparable number of bits with a delay close to that of the all-CC scheme. Meanwhile, our DDQN-based algorithm decreases the UE power consumption by about 20% with respect to the all-CC scheme.
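For readers unfamiliar with the Double DQN building block such CC-management agents rely on, the sketch below shows the standard DDQN target computation: the online network selects the next action and the target network evaluates it. It is a minimal illustration with toy numbers, not the paper's implementation; the Q-values, rewards, and discount factor are placeholders.

# Standard Double DQN target: action chosen by the online net, value taken from the target net.
import numpy as np

def ddqn_targets(q_online_next, q_target_next, rewards, dones, gamma=0.99):
    """q_online_next, q_target_next: (batch, n_actions) Q-values for the next states."""
    best_actions = np.argmax(q_online_next, axis=1)                     # online net picks actions
    next_values = q_target_next[np.arange(len(rewards)), best_actions]  # target net evaluates them
    return rewards + gamma * (1.0 - dones) * next_values

# Toy batch of two transitions; a negative reward could penalize extra power draw.
q_on = np.array([[1.0, 2.0, 0.5], [0.2, 0.1, 0.9]])
q_tg = np.array([[0.8, 1.5, 0.4], [0.3, 0.2, 1.1]])
print(ddqn_targets(q_on, q_tg, rewards=np.array([0.1, -0.4]), dones=np.array([0.0, 1.0])))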
... [7] presented a basic WUS mechanism with a secondary receiver and compared it with microsleep and legacy DRX. [8] further compared WUS with the go-to-sleep signal. The authors also evaluated the difference between a single radio for WUS and a secondary radio for WUS. ...
... To analyse the UE power consumption in the LTE-Advanced network with CA, [7] employs a Markov power state model in which the power consumed in each state is considered to be fixed and independent of the achievable rate. This holds for [8]-[10] as well. Since activating more CCs for the UEs on the one hand employs more RF chains and on the other hand augments the achievable throughput (or, equivalently, terminates a large file transfer faster), [5] and [7] study the impact of increasing the number of CCs on UE power consumption. ...
Article
Full-text available
In this paper, we consider 5G networks with Carrier Aggregation (CA). Our aim is to jointly select Component Carriers (CCs) and allocate Resource Blocks (RBs) such that total user throughput is maximized while user power consumption is minimized and Quality of Service (QoS) requirements are met. We formulate the User Equipment (UE) throughput and power consumption in terms of CC and RB indicators and propose a multi-objective optimization problem. Simulation results show that the proposed scheme outperforms the compared techniques by providing approximately 200 mW reduction in power consumption while increasing the throughput by 2.7 times for users under a short delay constraint.
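A minimal sketch of the throughput/power trade-off behind such CC selection problems is given below: it enumerates CC subsets and scalarizes the two objectives with weights, subject to a minimum-rate QoS constraint. The per-CC rates, power costs, and weights are hypothetical, and this brute-force search is only an illustration, not the paper's optimization formulation.

# Brute-force CC subset selection with a weighted throughput/power objective.
# Per-CC rates, power costs, weights, and the QoS threshold are illustrative placeholders.
from itertools import combinations

CC_RATE_MBPS = {"cc1": 80, "cc2": 60, "cc3": 40}
CC_POWER_MW  = {"cc1": 120, "cc2": 100, "cc3": 90}

def best_cc_set(min_rate_mbps=100, w_rate=1.0, w_power=0.5):
    best, best_score = None, float("-inf")
    ccs = list(CC_RATE_MBPS)
    for k in range(1, len(ccs) + 1):
        for subset in combinations(ccs, k):
            rate = sum(CC_RATE_MBPS[c] for c in subset)
            power = sum(CC_POWER_MW[c] for c in subset)
            if rate < min_rate_mbps:
                continue                       # QoS constraint not met
            score = w_rate * rate - w_power * power
            if score > best_score:
                best, best_score = subset, score
    return best, best_score

print(best_cc_set())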
... 5G NR DRX: For 5G NR network systems, the DRX mechanisms with the novel power-saving features are evaluated under various services [30]-[32]. Many researchers pay attention to the novel wake-up signal to save more power under DRX operation [33]-[37]. ...
Article
Full-text available
The evolution of communication and computation systems enables the user equipment (UE) to handle tremendous amounts of transmission data. However, high-speed data processing also makes UEs release heat and might burn the chips inside the devices. The thermal issue is even more critical in millimeter-wave communications. The massive antenna arrays and the radio frequency modules not only drain the UE battery but also heat the devices. 3GPP also identified the thermal issue and suppressed heat generation by temporarily reducing UE capability in Release 15. In this work, instead of reducing the UE capability, we propose to apply the beam-aware Discontinuous Reception (DRX) mechanism to manage the power consumption and temperature of UEs simultaneously. We are the first to analyze the temperature of UEs with DRX configured. A semi-Markov model is provided, and we employ it to estimate the sleep ratio, packet delay, and steady temperature. We use a simulation program to verify the proposed analytical model. When comparing the beam-aware DRX with the baseline 5G NR DRX operation, we find that the beam-aware scheme reduces the steady temperature from 38.2°C to 26.7°C. The results show that beam-aware DRX could solve the thermal issue without sacrificing much packet delivery latency performance.
... Multi-antenna technologies such as beamforming and Multiple-Input, Multiple-Output (MIMO) with hybrid beamforming structures in the mm-Wave band are anticipated to play a key role in 5G systems [33], [34]. Since 5G NR brings significantly higher data rates and a reduced symbol time duration, 5G NR user equipment may consume more power monitoring the PDCCH [35]. Non-Standalone (NSA) is the early phase of 5G deployment and uses the 4G core. ...
Article
Traffic demand and forecasts for the next decade are mostly associated with the Internet of Things (IoT). The mobile communication industry will face various challenges, including the demand for high capacity, the many mobile devices (users) connected to the network, uplink power consumption at the User Equipment (UE), and its effect on the life span of mobile phones. The major features of 5G from a user-experience perspective are Ultra-Reliable Low Latency Communication (URLLC), the Internet of Things (IoT), high-rate Enhanced Mobile Broadband (eMBB), and high-connection-density Massive Machine Type Communication (mMTC). This research work focuses on Non-Standalone (NSA) 5G New Radio (NR) early deployment on eMBB for achieving the required throughput. The 5G performance requirements are higher than those of 4G, including a user-experienced downlink throughput target of 1 Gbps, millisecond-level end-to-end latency, and a high connection density of 1 million devices per square kilometer. Optimization is a vast topic; this paper discusses the problems faced by users latching onto the 5G NSA network on the downlink and the 4G network on the uplink, and suggests solutions.
... In order to evaluate the power saving gain of various techniques, a power consumption model was adopted by 3GPP [25], [28], [29]. The key power states and the relative power per slot are provided in Table A1. ...
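To make such slot-level modeling concrete, the sketch below maps a toy slot trace to relative per-slot power values and sums them. The state names and numbers mirror the commonly quoted TR 38.840-style relative power units (deep sleep 1, light sleep 20, micro sleep 45, PDCCH-only 100, PDCCH+PDSCH 300), but they are given here only for illustration and should be checked against the report and the paper's Table A1.

# Slot-level relative power bookkeeping in the spirit of the 3GPP model referenced above.
# The numeric values are the commonly quoted reference numbers and should be verified.
SLOT_POWER = {
    "deep_sleep": 1,
    "light_sleep": 20,
    "micro_sleep": 45,
    "pdcch_only": 100,
    "pdcch_pdsch": 300,
}

def relative_energy(slot_states):
    """slot_states: iterable of state names, one entry per slot."""
    return sum(SLOT_POWER[s] for s in slot_states)

# 20-slot toy trace: one scheduled slot, a few monitoring slots, the rest asleep.
trace = ["pdcch_pdsch"] + ["pdcch_only"] * 3 + ["micro_sleep"] * 4 + ["deep_sleep"] * 12
print("relative energy over trace:", relative_energy(trace))
print("average per-slot power    :", relative_energy(trace) / len(trace))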
Article
Full-text available
Energy efficiency is one of the key performance indicators in 5G New Radio (NR) networks, which are targeted to support diversified use cases including enhanced mobile broadband (eMBB), massive machine type communications (mMTC) and ultra-reliable and low latency communications (URLLC). Trade-offs have to be carefully considered between energy efficiency and other performance aspects such as latency, throughput, connection density and reliability. Energy efficiency is important for both the user equipment (UE) side and the base station side. On the UE side, battery life has a great impact on user experience. It is challenging to improve UE experience in other performance aspects without affecting the battery life of 5G handsets. On the base station side, an efficient network implementation is critical from both environmental and operating cost standpoints. To adapt to different requirements and trade-offs, the 5G NR standard is designed with great flexibility in network operation modes. This paper provides an overview of the power saving techniques supported by 5G NR standards according to the current 5G standardization progress. It presents the 5G evolution path of the power saving techniques from the first release of the 5G standard to future beyond-5G releases. In addition to the existing standardized techniques, some major development trends in green communication and the future potential enhancements expected in the beyond-5G standards are discussed.
... A similar concept is available in [17], where the state transition chain is developed using a semi-Markov model and the results deal with the same parameters. A five-step Markov chain energy-saving model for 5G user equipment (UE) is found in [18]. This paper mainly deals with analytical results for a ten-state transition chain for the DRX scheme. ...
Article
Full-text available
Power saving has recently become a vital issue for wireless devices in 4G and 5G networks. A device enters a sleeping mode (short and long sleep cycles) when no traffic arrives and wakes up once traffic arrives. Before waking up, the user equipment (UE) spends the rest of the sleeping cycle idle, which incurs a service delay. There is a trade-off between the length of a sleep cycle (the power saving factor is higher for a longer sleep cycle) and the mean service delay. In this paper, a Markov chain is designed that includes the inactivity timer, short sleep, long sleep, and active service states. A closed-form solution of the chain is obtained using node equations, and its performance is compared with previous work in terms of the power-saving factor and mean delay. Both the power-saving factor and the mean delay of this paper are found to be marginally better than previous work at lower packet arrival rates, while at higher arrival rates the performance is almost the same, for which some explanations are provided.
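For intuition about the two metrics discussed above, the sketch below treats the power-saving factor as the fraction of time spent asleep and approximates the mean added delay as the mean residual sleep time seen by a packet arriving uniformly within the sleep period. This is a deliberately simplified abstraction, not the paper's closed-form chain solution, and the cycle lengths are placeholders.

# Simplified power-saving factor and mean added delay for a single sleep-cycle abstraction.
def power_saving_factor(t_active_ms, t_sleep_ms):
    return t_sleep_ms / (t_active_ms + t_sleep_ms)

def mean_residual_delay(t_sleep_ms):
    # A packet arriving uniformly within the sleep period waits half of it on average.
    return t_sleep_ms / 2.0

for label, (t_on, t_sleep) in {"short sleep cycle": (10, 40), "long sleep cycle": (10, 320)}.items():
    print(f"{label:17s}: saving factor = {power_saving_factor(t_on, t_sleep):.2f}, "
          f"mean added delay ~ {mean_residual_delay(t_sleep):.0f} ms")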
Article
In this tutorial we present recipes for dynamic system-level simulations (SLSs) of 5G and beyond cellular radio systems. A key ingredient for such SLSs is the selection of proper models to make sure that the performance-determining effects are properly reflected, ensuring realistic radio performance results. We therefore present a significant number of SLS models and related methodologies for a variety of use cases. Our focus is on generally accepted models that are largely supported by academia and industrial players and adopted by 3GPP as being realistic. Among others, we touch on deployment models, traffic models, non-terrestrial cellular networks with satellites, SLS methodologies for Machine Learning (ML) enabled air-interface solutions, and many more. We also present several recommendations for best practices related to preparing and running detailed SLS campaigns, and agile software engineering considerations. Throughout the article we use the 3GPP defined 5G and 5G-Advanced systems to illustrate our points, extending also into the 6G era, which is predicted to build on similar SLS methodologies and best practices.
Article
The 5G millimeter wave (mmWave) New Radio (NR) systems are prone to blockage and micromobility effects. To improve service reliability, 3GPP proposed dual-connectivity, allowing a UE to maintain links to two base stations (BSs). However, this functionality is power-hungry, resulting in a trade-off between performance and power efficiency. In this paper, we compare user equipment (UE) power efficiency, consumption, and battery lifetime for different intra- and inter-radio-access-technology single- and dual-connectivity options under different UE usage scenarios, micromobility, and blockage impairments. We evaluate five schemes: (i) BSs in FR1 and FR2 bands (NR-DC FR1/FR2), (ii) BSs in the FR2 band (NR-DC FR2/FR2), (iii) an LTE BS and an NR BS in the FR2 band (EN-DC LTE/FR2), (iv) NR FR1, and (v) LTE. Our results show that the LTE-only scheme provides three times longer battery lifetime compared to its nearest rival, the NR FR1 single-connectivity option. For the dual-connectivity options, the best lifetime is observed for EN-DC FR2/LTE and the worst for NR-DC FR2/FR1. The difference between the NR-DC FR2/FR1 and NR-DC FR2/FR2 schemes is negligible. Densification negatively affects all the dual-connectivity schemes as it forces UEs to switch between master and backup technologies more often. We recommend EN-DC FR2/LTE for low-traffic, outage-sensitive applications, and NR-DC FR2/FR2 for heavy-traffic, outage-insensitive applications.
Article
One of the significant properties of narrowband Internet of Things (NB-IoT) devices is supporting a long battery life. In the NB-IoT, devices have different search-space periods based on channel qualities. However, because a subframe of 180 kHz is the basic unit in the downlink, a base station can generally only simultaneously schedule devices with the same search-space period and allocate radio resources at the start of the period. If scheduling does not consider search-space periods, a base station will miss the start of the search-space period of some devices or sacrifice some devices. Massive numbers of devices will attempt blind decoding (BD) over multiple periods, resulting in huge energy consumption. This article investigates a new direction for reducing the energy consumption of devices by reducing the BD and idle time for the NB-IoT. The challenge of the target energy-efficient scheduling problem is how to schedule search-space periods to reduce each device's BD and idle time subject to the data requirements. We propose an algorithm to determine each device's search-space period and an energy-efficient scheduling algorithm to schedule the search-space periods. The simulation results show that, compared with two baselines, the proposed algorithms reduce the energy consumption by 77%.
Preprint
5G mmWave, as a revolutionary cellular technology, holds monumental potential for innovations in many academic and industrial areas. However, widespread adoption of this technology is hindered by the severe overheating issues experienced by current Commercial Off-The-Shelf (COTS) mmWave smartphones. This study aims to identify the root causes of device skin-temperature-related throttling during 5G transmission and to quantify the power reduction required to prevent such throttling at a given ambient temperature. The key insight of our paper is to leverage the power model and thermal model of a mmWave smartphone to obtain the quantitative relationship among power consumption, ambient temperature, and device skin temperature. This approach allows us to determine the extent of power reduction required to prevent throttling under specific ambient temperature conditions.
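The quantitative relationship described above can be pictured with a simple lumped-parameter abstraction: in steady state, skin temperature is roughly the ambient temperature plus an effective thermal resistance times the average power. The sketch below uses that abstraction to back out a power budget for a given ambient temperature; the thermal resistance and throttling threshold are hypothetical placeholders, not values from the study.

# Lumped-parameter steady-state thermal sketch: T_skin ~= T_ambient + R_th * P_avg.
R_TH_C_PER_W = 3.0    # hypothetical effective thermal resistance (deg C per W)
T_THROTTLE_C = 43.0   # hypothetical skin-temperature throttling threshold (deg C)

def steady_skin_temp(ambient_c, avg_power_w):
    return ambient_c + R_TH_C_PER_W * avg_power_w

def max_power_without_throttling(ambient_c):
    return max(0.0, (T_THROTTLE_C - ambient_c) / R_TH_C_PER_W)

for ambient in (20.0, 30.0, 38.0):
    print(f"ambient {ambient:.0f} C -> power budget ~ {max_power_without_throttling(ambient):.2f} W")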
Article
Discontinuous reception (DRX) is a way for user equipment (UE) to save energy. DRX forces a UE to turn off its transceivers for a DRX cycle when it does not have a packet to receive from a base station, called an eNB. However, if a packet arrives at an eNB when the UE is performing a DRX cycle, the transmission of the packet is delayed until the UE finishes the DRX cycle. Therefore, as the length of the DRX cycle increases, not only the amount of UE energy saved by the DRX but also the transmission delay of a packet increase. Different applications have different traffic arrival patterns and require different optimal balances between energy efficiency and transmission delay. Thus, understanding the tradeoff between these two performance metrics is important for achieving the optimal use of DRX in a wide range of use cases. In this paper, we mathematically analyze DRX to understand this tradeoff. We note that previous studies were limited in that their analysis models only partially reflect the DRX operation, and they make assumptions to simplify the analysis, which creates a gap between the analysis results and the actual performance of the DRX. To fill this gap, in this paper, we present an analysis model that fully reflects the DRX operation. To quantify the energy efficiency of the DRX, we also propose a new metric called a real power-saving (RPS) factor by considering all the states and state transitions in the DRX specification. In addition, we improve the accuracy of the analysis result for the average packet transmission delay by removing unrealistic assumptions. Through extensive simulation studies, we validate our analysis results. We also show that compared with the other analysis results, our analysis model improves the accuracy of the performance metrics.