Conference Paper

Machine Learning based Resource Allocation Strategy for Network Slicing in Vehicular Networks


... Li et al. proposed LA-ResNet, which combines residual networks with RNNs and incorporates attention mechanisms to assist in traffic prediction [31]. Cui et al. [32] employed Convolutional Long Short-Term Memory (ConvLSTM), which integrates convolutional layers into the LSTM cells, enabling the combination of spatial and temporal features. However, these works mainly focus on utilizing spatial or topological information to assist wireless traffic prediction without considering the correlation between services, which could further improve prediction accuracy. ...
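For readers unfamiliar with the ConvLSTM idea referenced above, the sketch below shows a minimal spatio-temporal traffic predictor in Keras; the grid size, history length, and layer widths are illustrative assumptions, not the configuration used in [32].

```python
# Minimal ConvLSTM sketch: predict the next frame of a city traffic grid
# from the previous 12 frames. All shapes are hypothetical.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.ConvLSTM2D(32, (3, 3), padding="same",
                               return_sequences=False,
                               input_shape=(12, 32, 32, 1)),  # (time, rows, cols, channels)
    tf.keras.layers.Conv2D(1, (1, 1), activation="relu"),     # next-frame traffic volume
])
model.compile(optimizer="adam", loss="mse")
```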
Article
With the continuous emergence of the 6G concept, the rapid development of industrial application scenarios, and rising demand for dedicated line services, there has been strong and diverse demand for network capacity. The introduction of fine-granularity FlexE technology provides a new approach to addressing the diversification of Quality of Service (QoS) and the resource waste caused by large granularity (5 Gbps). However, the reduction in granularity leads to a significant increase in the number of slots, posing a serious challenge to deterministic latency assurance, especially in scenarios with dynamic fluctuations in service bandwidth. Sudden bandwidth variations can cause service interruptions and fluctuations, resulting in uncontrolled network blocking rates. To address this issue, this paper proposes a traffic-driven proactive FlexE slot orchestration algorithm called PMFAN-GDSO. With this algorithm, bandwidth variations are computed in advance, using the changes forecast by a traffic prediction algorithm, before service fluctuations occur; this avoids time-consuming slot orchestration calculations and abrupt service changes. Experiments on real-world network datasets indicate that our proposed algorithm achieves time savings of up to 46.8% compared with non-proactive approaches.
... Recent works use extensive ML and DL techniques for resource-aware NS. In [20], the authors implemented a machine learning-based solution for intelligent resource allocation. Their approach models traffic characteristics with a Convolutional Long Short-Term Memory (ConvLSTM) network. ...
Article
The Network Slicing (NS) technique is comprehensively reshaping next-generation communication networks (e.g., 5G and 6G). Software-Defined Networking (SDN) and Network Functions Virtualization (NFV) predominantly control the flow of service functions on NS to incorporate versatile applications as per user demands. In the virtualized Software-Defined Networking (vSDN) environment, chains of well-defined virtual network functions (VNFs) are installed on Service Function Chains (SFCs) by multiple Internet Service Providers (ISPs) concurrently. Generating, allocating, re-allocating, releasing, and destroying the associated VNFs on an SFC while maintaining high selection accuracy is an extremely difficult task. To address this fundamental issue, we propose a multi-layered SFC formation for adaptive VNF allocation on dynamic slices. We formulate an ILP to address the VNF-EAP (VNF Embedding and Allocation Problem) over a real network topology (the AT&T topology). Leveraging machine learning techniques, we present an intelligent VNF selection mechanism to optimize resource utilization. The performance evaluation shows that ML-driven dynamic VNF selection on SFCs halves resource usage relative to static allocation. Further, we also study a VNF typecasting technique for service backup on outage slices for disaster management activities.
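As a rough illustration of what a VNF embedding ILP can look like, here is a toy model in PuLP; the VNF set, capacities, costs, and objective are invented for the example and are far simpler than the paper's VNF-EAP formulation over the AT&T topology.

```python
# Toy VNF placement ILP in PuLP (illustrative model only).
import pulp

vnfs = ["fw", "nat", "ids"]                  # hypothetical SFC of three VNFs
nodes = ["n1", "n2", "n3"]
cpu = {"fw": 2, "nat": 1, "ids": 4}          # CPU demand per VNF
cap = {"n1": 4, "n2": 4, "n3": 6}            # CPU capacity per node
cost = {"n1": 1.0, "n2": 2.0, "n3": 3.0}     # hypothetical per-CPU price per node

prob = pulp.LpProblem("vnf_embedding", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (vnfs, nodes), cat="Binary")  # x[v][n] = 1 if v on n

# Objective: minimize total placement cost
prob += pulp.lpSum(cost[n] * cpu[v] * x[v][n] for v in vnfs for n in nodes)
# Each VNF is embedded on exactly one node
for v in vnfs:
    prob += pulp.lpSum(x[v][n] for n in nodes) == 1
# Node capacity constraints
for n in nodes:
    prob += pulp.lpSum(cpu[v] * x[v][n] for v in vnfs) <= cap[n]

prob.solve()
```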
... TD3 adds noise to actions during training to encourage broader exploration of policies. TD3 also updates the target networks and policy less frequently than the Q-function [12]. ...
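The two TD3 mechanics mentioned in the excerpt, clipped target-policy noise and delayed actor/target updates, can be sketched as follows (PyTorch; the noise scale and clip values are the common defaults from the TD3 paper, and `actor_target` is a stand-in for any deterministic policy network).

```python
# Sketch of TD3's target-policy smoothing and delayed updates.
import torch

def td3_target_action(actor_target, next_state,
                      noise_std=0.2, noise_clip=0.5, max_a=1.0):
    a = actor_target(next_state)
    noise = (torch.randn_like(a) * noise_std).clamp(-noise_clip, noise_clip)
    return (a + noise).clamp(-max_a, max_a)   # smoothed, bounded target action

def soft_update(target, source, tau=0.005):
    # Called only every few critic steps, so the policy and target
    # networks change less often than the twin Q-functions.
    for tp, sp in zip(target.parameters(), source.parameters()):
        tp.data.mul_(1 - tau).add_(tau * sp.data)
```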
Chapter
This paper proposes an automated trading strategy using reinforcement learning. The stock market has become one of the largest financial institutions, and these institutions embrace machine learning solutions based on artificial intelligence for market monitoring, credit quality, fraud detection, and many other areas. We aim to provide an efficient and effective solution that overcomes the drawbacks of manual trading by building a trading bot. In this paper, we propose a stock trading strategy that uses reinforcement learning algorithms to maximize profit. The strategy employs three actor-critic models: Advantage Actor-Critic (A2C), Twin Delayed DDPG (TD3), and Soft Actor-Critic (SAC), and picks the most suitable model based on the current market situation. The performance of our trading bot is evaluated and compared with Markowitz portfolio theory.
... Ref. [47] addressed the prediction and management issues of vehicular use cases in the existing research. With these issues in mind, the authors proposed an ML-based resource allocation strategy for vehicular network slicing. ...
Article
Fifth-generation (5G) and beyond networks are envisioned to serve multiple emerging applications with diverse and strict quality of service (QoS) requirements. To meet the new services' demands for ultra-reliable low-latency communication, real-time data processing, and massive device connectivity, network slicing and edge computing are envisioned as key enabling technologies. Network slicing provides virtualized, dedicated logical networks over a common physical infrastructure, enabling flexible and scalable networks. Edge computing, on the other hand, offers storage and computational resources at the edge of the network, providing real-time, high-bandwidth, low-latency access to radio network resources. As the integration of the two technologies delivers network capabilities more efficiently and effectively, this paper provides a comprehensive study on edge-enabled network slicing frameworks and potential solutions with example use cases. In addition, this article elaborates on the application of machine learning in edge-sliced networks and discusses some recent works as well as example deployment scenarios. Furthermore, to further reveal the benefits of these systems, a novel framework based on reinforcement learning for controller synchronization in distributed edge-sliced networks is proposed.
... Cui [71] explores a variation of the LSTM architecture together with Convolutional Neural Networks (CNNs) in a slicing-context approach for vehicular networks (V2X). The neural network was trained on a mobile network traffic dataset from the city of Milan, Italy [72], containing data from three categories: SMS, telephony, and web browsing. ...
Article
The Network Slice Selection Function (NSSF) in heterogeneous technology environments is a complex problem that still lacks a fully acceptable solution. The implementation of new network selection strategies is therefore an important development issue, mainly due to the growing demand for applications and scenarios involving 5G and future networks. This work presents an integrated solution for the NSSF problem, called the Network Slice Selection Function Decision-Aid Framework (NSSF DAF). It is a distributed solution in which one part is executed on the user's equipment (for example, smartphones, unmanned aerial vehicles, IoT brokers), functioning as a transparent service, and another at the edge of the operator or service provider. It requires low consumption of computing resources on mobile devices and offers complete independence from the network operator. For this purpose, protocols and software tools are used to classify slices, employing four multicriteria decision-aid methods: VIKOR (Visekriterijumska Optimizacija i Kompromisno Resenje), COPRAS (Complex Proportional Assessment), TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), and Promethee II (Preference Ranking Organization Method for Enrichment Evaluations). The general objective is to verify the similarity among these methods and their application to the slice classification and selection process, considering a specific scenario in the framework. The framework also uses machine learning, through the K-means clustering algorithm, adopting a hybrid solution for implementing and operating the NSSF service in multi-domain slicing environments of heterogeneous mobile networks. Testbed experiments were conducted to validate the proposed framework, mapping the adequate quality of service requirements. The results indicate a real possibility of offering a complete solution to the NSSF problem that can be implemented at the Edge, in the Core, or even in the 5G radio base station itself, without incremental computational cost on the end user's equipment, allowing for an adequate quality of experience.
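Of the four multicriteria methods listed, TOPSIS is the most compact to illustrate. The sketch below ranks candidate slices by closeness to an ideal QoS profile; the criteria, weights, and numbers are hypothetical, not values from the NSSF DAF.

```python
# Minimal TOPSIS sketch for ranking candidate slices by QoS criteria.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j]=True if larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)           # vector normalization
    v = norm * weights                                        # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # best per criterion
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))    # worst per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                            # closeness: higher = better

# slices x [throughput (Mbps), latency (ms), loss (%)]
slices = np.array([[100.0, 20.0, 0.1], [300.0, 50.0, 0.5], [50.0, 5.0, 0.01]])
scores = topsis(slices, np.array([0.5, 0.3, 0.2]), np.array([True, False, False]))
```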
... More recently, several works have applied Machine Learning (ML) methods to address the resource management and decision-making problem in V2X communications in vehicular ad hoc networks (VANETs). To meet the diversified service requests of vehicles in such a dynamic vehicular environment, [9], [10], and [11] use long short-term memory (LSTM) to predict data traffic and the resources required for allocation to RAN slices. In addition, [9] and [11] develop deep reinforcement learning methods to improve service performance through resource allocation. ...
... Figure 2 shows the evolution of Network Slicing: the year of conceptualization [24], implementation [25,26], network virtualization [27,28], the concepts of SDN [29] and NFV [30,31], and different services for users [32-36]. Since 2015, research in the field of Network Slicing has accelerated, improving the quality of services. Various Machine Learning (ML) [37-39] and Deep Learning (DL) [40-44] algorithms are being applied to manage slices, user requests, slice admissions, resources, and traffic in 5G network slicing. Various optimization algorithms [45,46] are also utilized to optimize network functions. ...
Article
Fifth-generation networks efficiently support and fulfill the demands of mobile broadband and communication services. There has been continuing advancement from 4G to 5G networks, with 5G mainly providing three services: enhanced mobile broadband (eMBB), massive machine-type communication (mMTC), and ultra-reliable low-latency communication (URLLC). Since it is difficult to provide all of these services on one physical network, the 5G network is partitioned into multiple virtual networks called "slices". These slices customize the unique services and enable the network to be reliable and to fulfill the needs of its users. This is called network slicing. Security is a critical concern in network slicing, as adversaries have evolved to become more competent and often employ new attack strategies. This study focused on the security issues that arise during the network slice lifecycle. Machine learning and deep learning algorithms were applied in the planning and design, construction and deployment, monitoring, fault detection, and security phases of the slices. This paper outlines the 5G network slicing concept, its layers and architectural framework, and the prevention of the attacks, threats, and issues that shape how network slicing influences the 5G network. This paper also provides a comparison of existing surveys and maps out taxonomies to illustrate various machine learning solutions for different application parameters and network functions, along with significant contributions to the field.
... In [24], deep reinforcement learning was applied to allocate resources in network slices. In [25], the authors showed that network slicing could benefit from the proposed AI-based traffic prediction method. ...
Article
A Heterogeneous Vehicular Network (HetVNET) is a highly dynamic type of network that changes very quickly. Given this characteristic of HetVNETs and the emerging notion of network slicing in 5G technology, we propose a hybrid intelligent architecture based on Software-Defined Networking (SDN) and Network Functions Virtualization (NFV). In this paper, we apply a Conditional Generative Adversarial Network (CGAN) to augment the data on successful network scenarios related to network congestion and dynamicity. The results show that the proposed CGAN can be trained to generate valuable data; the generated data are similar to the real data and can be used in blueprints of HetVNET slices.
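A CGAN conditions both the generator and the discriminator on auxiliary labels (here, for instance, a congestion/dynamicity descriptor). The following is a minimal PyTorch sketch with toy dimensions, not the architecture used in the paper.

```python
# Minimal CGAN pair: conditioning by concatenating the label to the input.
import torch
import torch.nn as nn

Z, C, X = 16, 4, 8   # noise dim, condition dim (e.g. congestion label), sample dim

G = nn.Sequential(nn.Linear(Z + C, 64), nn.ReLU(), nn.Linear(64, X))
D = nn.Sequential(nn.Linear(X + C, 64), nn.LeakyReLU(0.2), nn.Linear(64, 1))

def generate(noise, cond):
    return G(torch.cat([noise, cond], dim=1))    # fake sample given condition

def discriminate(sample, cond):
    return D(torch.cat([sample, cond], dim=1))   # real/fake logit given condition

# Training alternates the usual GAN updates (e.g. BCE-with-logits loss),
# with the same condition vector attached to both real and generated samples.
```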
Article
The effects of transport development on people's lives are diverse, ranging from the economy to tourism, health care, and more. Great progress has been made in this area, which has led to the emergence of the Internet of Vehicles (IoV) concept. The main objective of this concept is to offer a safer and more comfortable travel experience by making available a vast array of applications that rely on a range of communication technologies, including fifth-generation mobile networks. The proposed applications have personalized Quality of Service (QoS) requirements, which raise new challenges for the management and allocation of resources. This interest has recently redoubled with the start of discussions on sixth-generation mobile networks. In this context, Network Slicing (NS) was presented as one of the key technologies in the 5G architecture to address these challenges. In this article, we bring together the implications of NS in the Internet of Vehicles field and show its impact on transport development. We begin by reviewing the state of the art of NS in IoV in terms of architecture, types, life cycle, enabling technologies, network parts, and evolution within cellular networks. Then, we discuss the benefits brought by the use of NS in such a dynamic environment, along with the technical challenges. Moreover, we provide a comprehensive review of NS deployments involving various Learning Techniques for the Internet of Vehicles. Afterwards, we present Network Slicing utilization in different IoV application scenarios across different domains: terrestrial, aerial, and marine. In addition, we review Vehicle-to-Everything (V2X) datasets as well as existing implementation tools, and present a concise summary of the Network Slicing-related projects that have an impact on IoV. Finally, in order to promote the deployment of Network Slicing in IoV, we provide some directions for future research. We believe the survey will be useful for researchers from academia and industry: first, to acquire a holistic vision of IoV-based NS realization and identify the challenges hindering it; second, to understand the progression of IoV-powered NS applications in the different domains (terrestrial, aerial, and marine); and finally, to determine the opportunities for using Machine Learning Techniques (MLT) in order to propose their own solutions to foster NS-IoV integration.
Chapter
In the context of self-reported health, where patients' subjective perception is reported through simple yet effective questionnaires, information gathering is very important to obtain consistent and meaningful data analysis. Smartphones are a good tool for gathering self-reported variables, but the interaction with the user should go beyond scheduled notifications. We develop an intelligent notification system that learns by exploration the most adequate time to deliver a questionnaire, using only the user's answers to the notification messages. We address smart notification as a Reinforcement Learning (RL) problem, considering several state representations and reward functions for Upper Confidence Bound (UCB), tabular Q-learning, and deep Q-learning. We evaluate the algorithms on a simulator, followed by an initial prototype in which the approach with the best performance in simulation is selected for a small pilot. The simulator acts as a person, accepting or discarding notifications according to the behavior of three typical users. In the simulation experiments, the UCB algorithm showed the most promising results, so we implemented and deployed it in a smartphone application with several users. In this initial pilot with four users, the UCB algorithm was able to find adequate hours to send notifications for quiz answering. Keywords: mHealth, Notifications, Machine Learning, Personalization, Reinforcement Learning, Receptivity
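As a concrete reading of the UCB approach described above, a minimal UCB1 loop over candidate notification hours might look like this; the hour range and the binary reward (quiz answered or not) are assumptions for illustration.

```python
# UCB1 sketch: candidate notification hours are the bandit arms.
import math

hours = list(range(8, 22))             # hypothetical hours at which we may notify
counts = {h: 0 for h in hours}         # times each hour was tried
values = {h: 0.0 for h in hours}       # running mean reward per hour

def pick_hour(t):
    for h in hours:                    # play each arm once before using the bound
        if counts[h] == 0:
            return h
    return max(hours, key=lambda h: values[h]
               + math.sqrt(2 * math.log(t) / counts[h]))

def update(h, reward):
    counts[h] += 1
    values[h] += (reward - values[h]) / counts[h]   # incremental mean

# Driver loop (pseudocode): each day t, h = pick_hour(t), send the
# notification at hour h, then update(h, 1 if answered else 0).
```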
Article
5G and beyond networks are expected to support a wide range of services with highly diverse requirements. Yet, the traditional "one-size-fits-all" network architecture lacks the flexibility to accommodate these services. In this respect, network slicing has been introduced as a promising paradigm for 5G and beyond networks, supporting not only traditional mobile services but also vertical industry services with very heterogeneous requirements. Along with its benefits, the practical implementation of network slicing brings many challenges. Thanks to recent advances in machine learning (ML), some of these challenges have been addressed. In particular, the application of ML approaches is enabling the autonomous management of resources in the network slicing paradigm. Accordingly, this paper presents a comprehensive survey of contributions on ML in network slicing, identifying major categories and sub-categories in the literature. Lessons learned are also presented, and open research challenges are discussed together with potential solutions.
Article
Fifth-generation (5G) wireless technology promises to be the critical enabler of use cases far beyond smartphones and other connected devices. This next-generation 5G wireless standard represents the changing face of connectivity by enabling elevated levels of automation through continuous optimization of several Key Performance Indicators (KPIs) such as latency, reliability, connection density, and energy efficiency. Mobile Network Operators (MNOs) must promote and implement innovative technologies and solutions to reduce network energy consumption while delivering high-speed and low-latency services, in order to deploy energy-efficient 5G networks with a reduced carbon footprint. This research evaluates an energy-saving method using data-driven learning through load estimation for Beyond 5G (B5G) networks. The proposed 'ECO6G' model utilizes a supervised Machine Learning (ML) approach for forecasting traffic load and uses the estimated load to evaluate the energy efficiency and OPEX savings. The simulation results demonstrate a comparative analysis between traditional time-series forecasting methods and the proposed ML model that utilizes learned parameters. Our ECO6G dataset is captured from measurements on a real-world operational 5G base station (BS). We showcase simulations using our ECO6G model for a given dataset and demonstrate that the proposed model is accurate within $4.3 million over 100,000 BSs over 5 years, compared to three other data-driven and statistical learning models that would increase OPEX cost from $370 million to $1.87 billion under varying network load scenarios.
Article
Mobile networks are facing an unprecedented demand for high-speed connectivity originating from novel mobile applications and services and, in general, from the adoption curve of mobile devices. However, coping with the service requirements imposed by current and future applications and services is very difficult since mobile networks are becoming progressively more heterogeneous and more complex. In this context, a promising approach is the adoption of novel network automation solutions and, in particular, of zero-touch management techniques. In this work, we refer to zero-touch management as a fully autonomous network management solution with human oversight. This survey sits at the crossroads between zero-touch management and mobile and wireless network research, effectively bridging a gap in terms of literature review between the two domains. In this paper, we first provide a taxonomy of network management solutions. We then discuss the relevant state of the art on autonomous mobile networks. The concept of zero-touch management and the associated standardization efforts are then introduced. The survey continues with a review of the most important technological enablers for zero-touch management. The network automation solutions from the RAN to the core network, including end-to-end aspects such as security, are then surveyed. Finally, we close this article with the current challenges and research directions.
Article
The concept of a programmable network instantiates dedicated network slices for Unmanned Aerial Vehicle (UAV)-based on-demand communication layers, which improves the efficiency of overall network performance. With increasingly random service demands, network resource allocation, retention, and release have become serious networking challenges. Existing methods often consider dedicated resource allocations, which result in poor resource utilization as well as poor service quality. Although Machine Learning (ML) techniques are being used for better performance, limited energy budgets and the complexity of resource life-cycle management remain critical concerns. To resolve these issues, we propose service-specific learning models on VNF (Virtual Network Function) data running on shared network slices. The results show an average error reduction of 35% compared with state-of-the-art techniques. This improved performance can further reduce the chances of over- or under-allocation of resources, which could otherwise lead to severe service denials for time-critical applications in areas such as disaster management and e-healthcare.
Article
Cellular networks are important for the success of modern communication systems, supporting billions of mobile users and devices. Powered by artificial intelligence techniques, cellular networks are becoming increasingly smart, and cellular traffic prediction is an important basis for realizing various applications that have originated from this trend. In this survey, we review the relevant studies on cellular traffic prediction and classify the prediction problems into temporal and spatiotemporal prediction problems. The prediction models with artificial intelligence are categorized into statistical, machine learning, and deep learning models and then compared. Various applications based on cellular traffic prediction are summarized along with their current progress. Potential research directions are pointed out for future work. To the best of our knowledge, this paper is the first comprehensive survey on cellular traffic prediction.
Article
This study examines the requirements of Future Intelligent Networks (FIN), existing solutions, and current research directions through a survey technique. The background of the study hinges on the applications of Machine Learning (ML) in the networking field. Through careful analysis of the literature and real-world reports, we note that ML has significantly expedited decision-making processes, enhanced intelligent automation, and helped resolve complex problems economically in many fields. Various researchers have also envisioned future networks incorporating intelligent functions and operations with ML. Several efforts have been made to automate individual functions and operations in the networking domain; however, most existing ML models proposed in the literature lack several vital requirements. Hence, this study aims to present a comprehensive summary of the requirements of FIN and to propose a taxonomy of the network functionalities that need to be equipped with ML techniques. The core objectives of this study are to provide a taxonomy of requirements envisioned for end-to-end FIN, relevant ML techniques, and an analysis that identifies research gaps, open issues, and future research directions. The real benefit of machine learning in any domain can only be ensured if intelligent capabilities cover all of its components. We observe that future generations of networks are heterogeneous, multi-vendor, and multidimensional, and that ML can provide optimal results only if intelligent capabilities are applied on a holistic scale. Realizing intelligence on a holistic scale is only possible if ML algorithms can solve heterogeneous problems in a multi-vendor and multidimensional environment. ML models must be reliable and efficient, support distributed learning architectures, and be capable of learning and sharing knowledge across network layers and administrative domains. Firstly, this study ascertains the requirements of FIN and proposes their taxonomy through reviews of ideas envisioned by various researchers and of articles gathered from reputed conferences and standards-developing organizations using keyword queries. Secondly, we review existing studies on ML applications focusing on coverage, heterogeneity, distributed architecture, and cross-domain knowledge learning and sharing. We observe that, in the past, ML applications focused mainly on an individual or isolated level, and that global, deeply holistic learning with cross-layer/domain knowledge sharing and agile ML operations has not been explored at large. We recommend that these issues be addressed with improved ML architecture and agile operations, and we propose an ML-pipeline-based architecture for FIN. The significant contribution of this study is the impetus for researchers to seek ML models suitable for a modular, distributed, multi-domain, and multi-layer environment and to provide decision-making at a global or holistic rather than individual function level.
Article
This work proposes a position-dependent deep learning (DL)-based algorithm that enables interference-free resource allocation (RA) among mobile small cells (mScs). The proposed algorithm considers a vehicular environment comprising city buses that generate historical data about the buses' positions. The position information of the moving buses is exploited to form interference-free resource block (RB) allocations, which serve as data labels for the respective historical data. The long short-term memory (LSTM) algorithm is used for RA in the mSc network based on the position-dependent historical data. The numerical results obtained under non-dense and dense mSc network scenarios reveal that the proposed algorithm outperforms other machine learning (ML) and DL-based RA mechanisms. Moreover, the proposed RA algorithm shows improved results when compared to RA using the Global Positioning System Dependent Interference Graph (GPS-DIG), but provides lower data rates than the existing Time Interval Dependent Interference Graph (TIDIG)-based and Threshold Percentage Dependent Interference Graph (TPDIG)-based RA while fulfilling the users' demands. The proposed scheme is computationally less expensive than the TIDIG- and TPDIG-based algorithms.
Article
With the advent of the 5G era, network slicing has received a great deal of attention as a means to support a variety of wireless services in a flexible manner. Network slicing is a technique to divide a single physical network into multiple slices supporting independent services. In beyond-5G (B5G) systems, the main goal of network slicing is to assign the physical resource blocks (RBs) such that the quality of service (QoS) requirements of eMBB, URLLC, and mMTC services are satisfied. Since the goal of each service category is clearly distinct and the computational burden caused by the increased number of time slots is huge, it is in general very difficult to assign each RB to a suitable service. In this paper, we propose a deep reinforcement learning (DRL)-based network slicing technique to find the resource allocation policy that maximizes the long-term throughput while satisfying the QoS requirements in B5G systems. A key ingredient of the proposed technique is action elimination, which removes undesirable actions that cannot satisfy the QoS requirements. Numerical results demonstrate that the proposed technique is effective in maximizing the long-term throughput and handling the coexistence of use cases in B5G environments.
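The action-elimination idea reduces, at its core, to masking QoS-violating actions before the greedy choice. Below is a minimal PyTorch sketch with the validity check left abstract; the shapes and the example mask are invented for illustration.

```python
# Mask eliminated actions with -inf so the greedy argmax never picks them.
import torch

def select_action(q_values, valid_mask):
    """q_values: [num_actions]; valid_mask: bool tensor, False = eliminated."""
    masked = q_values.masked_fill(~valid_mask, float("-inf"))
    return int(masked.argmax())

q = torch.tensor([1.2, 3.4, 0.7])
ok = torch.tensor([True, False, True])   # action 1 would violate a QoS constraint
print(select_action(q, ok))              # -> 0, despite action 1 having higher Q
```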
Article
Next-generation wireless architectures are expected to enable slices of shared wireless infrastructure that are customized to specific mobile operators/services. Given infrastructure costs and the stochastic nature of mobile services' spatial loads, it is highly desirable to achieve efficient statistical multiplexing among network slices. We study a simple dynamic resource sharing policy which allocates a 'share' of a pool of (distributed) resources to each slice: Share Constrained Proportionally Fair (SCPF). We characterize the achievable performance gains over static slicing, showing higher gains when a slice's spatial load is more 'imbalanced' than, and/or 'orthogonal' to, the aggregate network load. Under SCPF, traditional network dimensioning translates to a coupled share dimensioning problem, which addresses the existence of a feasible share allocation given slices' expected loads and performance requirements. We provide a solution to robust share dimensioning for SCPF-based network slicing. Slices may wish to unilaterally manage their users' performance via admission control that maximizes their carried loads subject to performance requirements. We show this can be modeled as a "traffic shaping" game with an achievable Nash equilibrium. Under high loads, the equilibrium is explicitly characterized, as are the gains in carried load under SCPF versus static slicing. Detailed simulations of a wireless infrastructure supporting multiple slices with heterogeneous mobile loads show the fidelity of our models and the range of validity of our high-load equilibrium analysis.
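A common reading of the SCPF weighting is that each slice splits its network-wide share equally over its users, and each base station divides its capacity in proportion to the weights of the users present. The helper below sketches that allocation; it is a simplification of the paper's model, not its full formulation.

```python
# Sketch of SCPF-style sharing at one base station (simplified).
def scpf_slice_rates(capacity, shares, users_here, users_total):
    # Slice v's local weight: its share, split equally over all its users,
    # summed over the users currently attached to this station.
    w = {v: shares[v] * users_here[v] / users_total[v] for v in shares}
    total = sum(w.values())
    return {v: capacity * w[v] / total for v in w}   # aggregate rate per slice

# Equal shares, but slice "a" concentrates 8 of its 10 users here:
print(scpf_slice_rates(100.0, {"a": 0.5, "b": 0.5},
                       {"a": 8, "b": 2}, {"a": 10, "b": 10}))
# -> {'a': 80.0, 'b': 20.0}
```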
Article
The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others.
Article
Due to global climate change as well as the economic concerns of network operators, energy consumption of cellular network infrastructure, or "Green Cellular Networking," has become a popular research topic. While energy savings can be achieved by adopting renewable energy resources or improving the design of certain hardware (e.g., power amplifiers) to make it more energy-efficient, the cost of purchasing, replacing, and installing new equipment (including manpower, transportation, disruption to normal operation, as well as associated energy and direct cost) is often prohibitive. By comparison, approaches that work on the operating protocols of the system do not require changes to the current network architecture, making them far less costly and easier to test and implement. In this survey, we first present facts and figures that highlight the importance of green mobile networking and then review existing green cellular networking research with particular focus on techniques that incorporate the concept of the "sleep mode" in base stations. Sleep-mode techniques take advantage of changing traffic patterns on a daily or weekly basis and selectively switch some lightly loaded base stations to low-energy-consumption modes. As base stations are responsible for a large share of the energy consumed in cellular networks, these approaches have the potential to save a significant amount of energy, as shown in various studies. However, certain simplifying assumptions made in the published papers introduce inaccuracies. This review discusses these assumptions, particularly an assumption that ignores the effect of traffic-load-dependent factors on energy consumption. We show that considering this effect may lead to noticeably lower estimated benefits than models that ignore it. Finally, potential future research directions are discussed.
Article
Although research on traffic prediction is an established field, most existing works have been carried out on traditional wired broadband networks and rarely shed light on cellular radio access networks (CRANs). However, with the explosively growing demand for radio access, there is an urgent need to design a traffic-aware, energy-efficient network architecture. To realize such a design, it becomes increasingly important to model traffic predictability theoretically and to discuss traffic-aware networking practice technically. From that perspective, we first exploit entropy theory to analyze traffic predictability in CRANs and demonstrate the practical prediction performance of state-of-the-art methods. We then propose a blueprint for a traffic-based software-defined cellular radio access network (SDCRAN) architecture and address the potential applications of predicted traffic knowledge in this envisioned architecture.
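As a crude stand-in for the entropy analysis mentioned in the abstract, one can estimate the Shannon entropy of a quantized traffic series, where lower entropy loosely suggests higher predictability; the binning below is an arbitrary choice for illustration, not the article's method.

```python
# Shannon entropy of a quantized traffic series (bits per sample).
import numpy as np

def traffic_entropy(series, bins=16):
    hist, _ = np.histogram(series, bins=bins)   # quantize traffic volumes
    p = hist / hist.sum()                       # empirical distribution
    p = p[p > 0]                                # drop empty bins (0 log 0 = 0)
    return float(-(p * np.log2(p)).sum())
```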
Article
This deliverable describes WINNER II channel models for link and system level simulations. Both generic and clustered delay line models are defined for selected propagation scenarios. Disclaimer: The channel models described in this deliverable are based on a literature survey and measurements performed during this project. The authors are not responsible for any loss, damage or expenses caused by potential errors or inaccuracies in the models or in the deliverable.
Article
With accurate traffic prediction, future cellular networks can achieve self-management and embrace intelligent, efficient automation. This letter is devoted to citywide cellular traffic prediction and proposes a deep learning approach to model the nonlinear dynamics of wireless traffic. By treating traffic data as images, both the spatial and temporal dependence of cell traffic are well captured using densely connected convolutional neural networks. A parametric-matrix-based fusion scheme is further put forward to learn the influence degrees of the spatial and temporal dependence. Experimental results show that the prediction performance in terms of root mean square error (RMSE) can be significantly improved compared with existing algorithms. The prediction accuracy is also validated using the datasets of Telecom Italia.
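The parametric-matrix fusion described above can be read as learning element-wise influence weights for the spatial and temporal feature maps. A minimal PyTorch sketch with toy shapes follows; it is an interpretation of the idea, not the letter's exact formulation.

```python
# Element-wise fusion: learned weight matrices blend two feature maps.
import torch
import torch.nn as nn

class ParametricFusion(nn.Module):
    def __init__(self, h, w):
        super().__init__()
        self.ws = nn.Parameter(torch.rand(h, w))   # spatial influence degrees
        self.wt = nn.Parameter(torch.rand(h, w))   # temporal influence degrees

    def forward(self, xs, xt):                     # xs, xt: [batch, h, w]
        return self.ws * xs + self.wt * xt         # per-cell weighted sum

fuse = ParametricFusion(32, 32)
out = fuse(torch.rand(4, 32, 32), torch.rand(4, 32, 32))
```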
Article
Wireless big data is attracting extensive attention from operators, vendors, and academia, providing new degrees of freedom for improving performance at various levels of wireless networks. One possible way to leverage big data analysis is predictive resource allocation, which has been reported to increase spectrum and energy resource utilization efficiency with predicted user behavior, including user mobility. However, few works address how traffic load prediction can be exploited to optimize data-driven radio access. We show how to translate the predicted traffic load into the essential information used for resource optimization, taking energy-saving transmission for non-real-time users as an example. By formulating and solving an energy-minimizing resource allocation problem with future instantaneous bandwidth information, we not only provide a performance upper bound but also reveal that only two key parameters are related to the future information. By exploiting the residual bandwidth probability derived from the traffic volume prediction, the two parameters can be estimated accurately when the transmission delay allowed by the user is large, and the closed-form globally optimal resource allocation can be obtained when the delay approaches infinity. We provide a heuristic resource allocation policy to guarantee a target transmission completion probability when the delay is not so large. Simulation results validate our analysis, show a remarkable energy-saving gain of the proposed predictive policy over non-predictive policies, and illustrate that the time granularity in predicting traffic load should be identical to the delay allowed by the user.
Article
Understanding the mobile traffic patterns of large-scale cellular towers in urban environments is extremely valuable for Internet service providers, mobile users, and government managers of a modern metropolis. This paper aims at extracting and modeling the traffic patterns of large-scale towers deployed in a metropolitan city. To achieve this goal, we need to address several challenges, including the lack of appropriate tools for processing large-scale traffic measurement data, unknown traffic patterns, and the handling of complicated factors of urban ecology and human behavior that affect traffic patterns. Our core contribution is a powerful model which combines three-dimensional information (time, tower location, and traffic frequency spectrum) to extract and model the traffic patterns of thousands of cellular towers. Our empirical analysis reveals the following important observations. First, only five basic time-domain traffic patterns exist among the 9,600 cellular towers. Second, each of the extracted traffic patterns maps to one type of geographical location related to urban ecology, including residential areas, business districts, transport, entertainment, and comprehensive areas. Third, our frequency-domain traffic spectrum analysis suggests that the traffic of any of the 9,600 towers can be constructed as a linear combination of four primary components corresponding to human activity behaviors. We believe that the proposed traffic pattern extraction and modeling methodology, combined with the empirical analysis of mobile traffic, paves the way toward a deep understanding of the traffic patterns of large-scale cellular towers in a modern metropolis.
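The "four primary components" finding is analogous to a low-rank decomposition of the tower-by-time traffic matrix. As an illustrative stand-in (plain PCA on random data, not the paper's frequency-domain method), each tower's weekly series can be approximated by four coefficients:

```python
# Low-rank view: every tower's traffic ~ linear combination of 4 components.
import numpy as np
from sklearn.decomposition import PCA

traffic = np.random.rand(9600, 168)        # towers x hourly samples (one week; toy data)
pca = PCA(n_components=4).fit(traffic)
coeffs = pca.transform(traffic)            # 4 coefficients per tower
approx = pca.inverse_transform(coeffs)     # reconstruction from the 4 components
```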
Article
5G wireless technology is paving the way to revolutionize future ubiquitous and pervasive networking, wireless applications, and user quality of experience. To realize its potential, 5G must provide considerably higher network capacity, enable massive device connectivity with reduced latency and cost, and achieve considerable energy savings compared to existing wireless technologies. The main objective of this article is to explore the potential of NFV in enhancing 5G radio access networks' functional, architectural, and commercial viability, including increased automation, operational agility, and reduced capital expenditure. The ETSI NFV Industry Specification Group has recently published drafts focused on standardization and implementation of NFV. Harnessing the potential of 5G and network functions virtualization, we discuss how NFV can address critical 5G design challenges through service abstraction and virtualized computing, storage, and network resources. We describe NFV implementation with network overlay and SDN technologies. In our discussion, we cover the first steps in understanding the role of NFV in implementing CoMP, D2D communication, and ultra densified networks.
Conference Paper
As the volume of mobile traffic has been growing quickly in recent years, reducing the congestion of mobile networks has become an important problem in networking research. Researchers have found that inhomogeneity in the spatio-temporal distribution of data traffic leads to extremely inefficient utilization of network resources. Thus, it is important to fundamentally understand this distribution to enable better resource planning and to introduce new management tools, such as time-dependent pricing, to reduce congestion. However, due to the requirement of a large dataset, a detailed, thorough, and credible network-wide study of the spatio-temporal distribution of mobile traffic is still lacking. In this work, we conduct such a measurement study. Based on a large-scale dataset obtained from 380,000 base stations in Shanghai spanning one month, we quantitatively characterize the spatio-temporal distribution of mobile traffic and present a detailed visualized analysis. Furthermore, on the basis of the quantitative analysis, we find that mobile traffic loads uniformly follow a trimodal distribution, a combination of compound-exponential, power-law, and exponential distributions, in both the spatial and temporal dimensions. Extensive results show that our model achieves accuracy over 99%, which provides fundamental and credible guidelines for practical solutions to issues in mobile traffic operations.
Article
Machine learning addresses the question of how to build computers that improve automatically through experience. It is one of today’s most rapidly growing technical fields, lying at the intersection of computer science and statistics, and at the core of artificial intelligence and data science. Recent progress in machine learning has been driven both by the development of new learning algorithms and theory and by the ongoing explosion in the availability of online data and low-cost computation. The adoption of data-intensive machine-learning methods can be found throughout science, technology and commerce, leading to more evidence-based decision-making across many walks of life, including health care, manufacturing, education, financial modeling, policing, and marketing.
Article
Enabling pervasive access to distributed resources in parallel, heterogeneous scheduling is extensively applied in large-scale computing systems for high performance. Conventional real-time scheduling algorithms, however, either disregard applications' security needs and thus expose the applications to security threats, or run applications at inferior security levels without optimizing security performance. In recognition of high reliability, a security-aware model is first presented via quantification of the security overheads of heterogeneous systems. Second, drawing on multiple disciplines, a meta-heuristic is developed based on the hybrid supercomputer architecture. In addition, some technological breakthroughs are achieved, including boundary conditions for different heterogeneous computing and cloud scheduling settings and descriptions of the real-time variation of scheduling indexes (stringent timing and security constraints). Extensive simulator and simulation experiments highlight higher efficacy and better scalability of the proposed approaches compared with three other meta-heuristics; the overall improvements reach 8%, 12%, and 14% for high-dimension instances, respectively.
Article
This article presents an architecture vision to address the challenges placed on 5G mobile networks. A two-layer architecture is proposed, consisting of a radio network and a network cloud, integrating various enablers such as small cells, massive MIMO, control/user plane split, NFV, and SDN. Three main concepts are integrated: ultra-dense small cell deployments on licensed and unlicensed spectrum, under control/user plane split architecture, to address capacity and data rate challenges; NFV and SDN to provide flexible network deployment and operation; and intelligent use of network data to facilitate optimal use of network resources for QoE provisioning and planning. An initial proof of concept evaluation is presented to demonstrate the potential of the proposal. Finally, other issues that must be addressed to realize a complete 5G architecture vision are discussed.
Chapter
Convex optimization problems arise frequently in many different fields. A comprehensive introduction to the subject, this book shows in detail how such problems can be solved numerically with great efficiency. The focus is on recognizing convex optimization problems and then finding the most appropriate technique for solving them. The text contains many worked examples and homework exercises and will appeal to students, researchers and practitioners in fields such as engineering, computer science, mathematics, statistics, finance, and economics.
Article
We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982–1985 and International Journal of Forecasting 1985–2005). During this period, over one third of all papers published in these journals concerned time series forecasting. We also review highly influential works on time series forecasting that have been published elsewhere during this period. Enormous progress has been made in many areas, but we find that there are a large number of topics in need of further development. We conclude with comments on possible future research directions in this field.