Article

DeepTP: An End-to-End Neural Network for Mobile Cellular Traffic Prediction


... Accurate traffic forecasting and prediction are important for stable, high-quality 5G network slicing performance. However, as discussed in [154], traditional statistical models face some challenges that often impact the performance of network slicing. Here are some examples: (a) Linear methods such as ARIMA are less robust to complex temporal fluctuations in traffic generation over the long term. ...
... Here are some examples: (a) Linear methods such as ARIMA are less robust to complex temporal fluctuations in traffic generation over the long term, because the model tends to over-reproduce the average of previously observed instances [154]; in non-homogeneous time-series scenarios, where the inputs and predictions do not come from the same set of data points, these methods tend to perform poorly. (b) There is a general presumption that a slice should always be allocated the maximum throughput indicated in the SLA, or that throughput should be assigned based on the requirements of the admitted slice without any control mechanisms. (c) The accuracy of the results could be impacted by any shifts in the population. (d) Supporting seasonal data, i.e., time-series data with a repeating cycle such as days of the week, can be challenging for traditional methods like ARIMA. ...
... Accurately predicting the slice footprint results in an increased number of slices on the common physical infrastructure [71]. However, most techniques in traffic forecasting today are mainly time-series methods that ignore the spatial impact of traffic networks in traffic flow modelling [154]. Considering both the spatial and temporal dimensions is the key to comprehensive traffic forecasting. ...
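As a concrete illustration of the ARIMA-style linear baselines criticized above, the sketch below fits a plain ARIMA and a seasonal SARIMA to an hourly traffic series with statsmodels. The synthetic series, the model orders, and the 24-hour season are illustrative assumptions, not taken from the cited works.

```python
# Minimal sketch of the linear baselines discussed above (illustrative orders only).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.statespace.sarimax import SARIMAX

# synthetic hourly traffic with a daily cycle (stand-in for real cell traffic)
t = np.arange(24 * 30)
traffic = 100 + 20 * np.sin(2 * np.pi * t / 24) + np.random.normal(0, 5, t.size)

# plain ARIMA: no explicit seasonal component
arima_fc = ARIMA(traffic, order=(2, 1, 2)).fit().forecast(steps=24)

# SARIMA: adds a 24-hour seasonal term to capture the daily cycle
sarima_fc = SARIMAX(traffic, order=(1, 1, 1),
                    seasonal_order=(1, 1, 1, 24)).fit(disp=False).forecast(steps=24)
```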
Article
The 5th Generation (5G) and beyond networks are expected to offer huge throughputs, connect large numbers of devices, support low latency, and serve a large number of business services. To realize this vision, there is a need for a paradigm shift in the way cellular networks are designed, built, and maintained. Network slicing divides the physical network infrastructure into multiple virtual networks to support diverse business services, enterprise applications and use cases. Multiple services and use cases with varying architectures and quality of service requirements on such shared infrastructure complicate the network environment. Moreover, the dynamic and heterogeneous nature of 5G and beyond networks will exacerbate network management and operations complexity. Inspired by the successful application of machine learning tools in solving complex mobile network decision making problems, deep reinforcement learning (Deep RL) methods provide potential solutions to address slice lifecycle management and operation challenges in 5G and beyond networks. This paper aims to bridge the gap between Deep RL and 5G network slicing research by presenting a comprehensive survey of their existing research association. First, the basic concepts of the Deep RL framework are presented. 5G network slicing and virtualization principles are then discussed. Thirdly, we review challenges in 5G network slicing and the current research efforts to incorporate Deep RL in addressing them. Lastly, we present open research problems and directions for future research.
... DNN-BTF learns spatial features by CNN and temporal features by RNN, respectively, combining the spatial and temporal features of spatiotemporal sequences to improve prediction accuracy. Feng and Chen [55] suggested a Deep Traffic Predictor (DeepTP) based on deep learning that can model spatial dependencies and external information and employs a Seq2Seq model with an attention mechanism to acquire reliable temporal features from latency and long-period traffic data. ...
... features by CNN and temporal features by RNN, respectively, combining the spatial and temporal features of spatiotemporal sequences to improve prediction accuracy. Feng and Chen [55] suggested a Deep Traffic Predictor (DeepTP) based on deep learning that can model spatial dependencies and external information and employs a Seq2Seq model with an attention mechanism to acquire reliable temporal features from latency and long-period traffic data. Tests on massive mobile cellular data are carried out, and experimental results reveal that DeepTP beats all benchmark models. ...
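Since several of the citing works describe this seq2seq-with-attention design, a minimal PyTorch sketch of such a forecaster is shown below. It is a generic illustration of the idea, not the authors' DeepTP implementation; the layer sizes, input length, and 6-step horizon are assumptions.

```python
# Minimal seq2seq forecaster with additive attention (illustrative sketch only).
import torch
import torch.nn as nn

class Seq2SeqAttnForecaster(nn.Module):
    def __init__(self, hidden=64, horizon=6):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.decoder = nn.LSTMCell(input_size=1 + hidden, hidden_size=hidden)
        self.attn = nn.Linear(2 * hidden, 1)       # additive attention score
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                           # x: (batch, seq_len, 1)
        enc_out, (h, c) = self.encoder(x)            # enc_out: (B, T, H)
        h, c = h[0], c[0]
        y = x[:, -1, :]                              # last observation starts decoding
        preds = []
        for _ in range(self.horizon):
            # attention weights over encoder states, conditioned on the decoder state
            query = h.unsqueeze(1).expand_as(enc_out)
            scores = self.attn(torch.cat([enc_out, query], dim=-1)).squeeze(-1)
            alpha = torch.softmax(scores, dim=1).unsqueeze(-1)   # (B, T, 1)
            context = (alpha * enc_out).sum(dim=1)               # (B, H)
            h, c = self.decoder(torch.cat([y, context], dim=-1), (h, c))
            y = self.out(h)
            preds.append(y)
        return torch.stack(preds, dim=1)             # (B, horizon, 1)

# toy usage: 168 hourly points in, 6 hours out
model = Seq2SeqAttnForecaster()
forecast = model(torch.randn(32, 168, 1))            # (32, 6, 1)
```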
Article
Full-text available
At present, the amount of network equipment, servers, and network traffic is increasing exponentially, and the way in which operators allocate and efficiently utilize network resources has attracted considerable attention from traffic forecasting researchers. However, with the advent of the 5G era, network traffic has also shown explosive growth, and network complexity has increased dramatically. Accurately predicting network traffic has become a pressing issue that must be addressed. In this paper, a multilayer perceptron ensemble learning method based on convolutional neural networks (CNN) and gated recurrent units (GRU) spatiotemporal feature extraction (MECG) is proposed for network traffic prediction. First, we extract spatial and temporal features of the data by convolutional neural networks (CNN) and recurrent neural networks (RNN). Then, the extracted temporal features and spatial features are fused into new spatiotemporal features through integrated learning of a multilayer perceptron, and a spatiotemporal prediction model is built in the sequence-to-sequence framework. At the same time, the teacher forcing mechanism and attention mechanism are added to improve the accuracy and convergence speed of the model. Finally, the proposed method is compared experimentally with other deep learning models. The experimental results show that the proposed method not only has apparent advantages in accuracy but also shows some superiority in training time cost.
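A rough sketch of the kind of CNN-plus-GRU feature extraction and MLP fusion the abstract describes is given below. The branch shapes, layer sizes, and single-step output are illustrative assumptions rather than the MECG paper's exact architecture.

```python
# Hedged sketch of CNN + GRU feature extraction fused by an MLP (illustrative only).
import torch
import torch.nn as nn

class CnnGruFusion(nn.Module):
    def __init__(self, n_cells=64, hidden=32):
        super().__init__()
        # spatial branch: 1-D convolution over the vector of neighbouring cells
        self.cnn = nn.Sequential(
            nn.Conv1d(in_channels=1, out_channels=8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # temporal branch: GRU over the history of the target cell
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        # fusion MLP producing the one-step-ahead forecast
        self.mlp = nn.Sequential(nn.Linear(8 + hidden, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, history, snapshot):
        # history: (B, T, 1) past traffic of the target cell
        # snapshot: (B, 1, n_cells) current traffic of the surrounding cells
        spatial = self.cnn(snapshot).squeeze(-1)              # (B, 8)
        _, h = self.gru(history)                              # h: (1, B, hidden)
        return self.mlp(torch.cat([spatial, h[0]], dim=-1))   # (B, 1)

model = CnnGruFusion()
y_hat = model(torch.randn(16, 24, 1), torch.randn(16, 1, 64))
```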
... The internet not only meets people's daily communication needs but also contributes greatly to the development of the country. With the rapid development of data networks and the increasing demand for network traffic, wireless network operators need to guarantee the Quality of Service (QoS) of their networks, thus, leading to a need to plan their network traffic properly [1,2]. Traffic planning provides a scientific approach to allocate traffic, while network traffic forecasting provides a solution to the network traffic planning problem. ...
... Besides, three evaluation metrics were selected as indicators to judge the effectiveness of the model, as follows: (1) Root Mean Square Error (RMSE), which reflects the prediction error of the model; its value lies in the range [0, +∞). ...
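For reference, the evaluation metrics named in this and later excerpts (RMSE, and the MAE and R-squared used by other cited works) can be computed as in the following NumPy sketch.

```python
# Straightforward NumPy implementations of the metrics reported in these studies.
import numpy as np

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def r2(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot
```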
Article
Full-text available
Network traffic forecasting is essential for efficient network management and planning. Accurate long-term forecasting models are also essential for proactive control of upcoming congestion events. Due to the complex spatial-temporal dependencies between traffic flows, traditional time series forecasting models are often unable to fully extract the spatial-temporal characteristics between the traffic flows. To address this issue, we propose a novel dual-channel based graph convolutional network (DC-STGCN) model. The proposed model consists of two temporal components that characterize the daily and weekly correlation of the network traffic. Each of these two components contains a spatial-temporal characteristics extraction module consisting of a dual-channel graph convolutional network (DCGCN) and a gated recurrent unit (GRU). The DCGCN further consists of an adjacency feature extraction module (AGCN) and a correlation feature extraction module (PGCN) to capture the connectivity between nodes and the proximity correlation, respectively. The GRU further extracts the temporal characteristics of the traffic. The experimental results based on real network data sets show that the DC-STGCN model outperforms the existing baselines in prediction accuracy and is capable of making long-term predictions.
... Recently, several studies have approached the traffic prediction problem by considering the spatiotemporal characteristics based on deep learning [13][14][15][16][17][18][19][20][21]. The traffic prediction problem deals with the estimation of traffic at time t + 1 through past traffic of duration L. This problem is represented as x_{t+1} = f(x_t, x_{t-1}, x_{t-2}, ..., x_{t-L+1}). ...
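In code, this autoregressive formulation amounts to building supervised (window, next-value) pairs from the series; a small sketch is shown below, where the window length L is a free parameter.

```python
# Turn a univariate traffic series into (past L values -> next value) training pairs.
import numpy as np

def make_windows(series, L):
    X, y = [], []
    for t in range(L, len(series)):
        X.append(series[t - L:t])   # x_{t-L+1}, ..., x_{t-1}, x_t in time order
        y.append(series[t])         # the next value, x_{t+1} in the paper's notation
    return np.array(X), np.array(y)

series = np.random.rand(1000)
X, y = make_windows(series, L=24)    # X: (976, 24), y: (976,)
```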
... Previous studies that applied deep learning to networks have focused on traffic prediction problems [13][14][15][16][17][18][19][20][21]. This study attempted to apply deep learning for traffic modeling; however, prediction models are difficult to use. ...
Article
Full-text available
A substantial amount of money and time is required to optimize resources in a massive Wi-Fi network in a real-world environment. Therefore, to reduce cost, proposed algorithms are first verified through simulations before implementing them in a real-world environment. A traffic model is essential to describe user traffic for simulations. Existing traffic models are statistical models based on a discrete-time random process and combine a spatiotemporal characteristic model with the varying parameters, such as average and variance, of a statistical model. The spatiotemporal characteristic model has a mathematically strict assumption that the access points (APs) have approximately similar traffic patterns that increase during the daytime and decrease at night. The mathematical assumption ensures a homogeneous representation of the network traffic. It does not include heterogeneous characteristics, such as the fact that lecture buildings on campus have high traffic during lectures, while restaurants have high traffic only during mealtimes. Therefore, it is difficult to represent heterogeneous traffic using this mathematical model. Deep learning can be used to represent heterogeneous patterns. This study proposes a generative model for Wi-Fi traffic that considers spatiotemporal characteristics using deep learning. The proposed model learns the heterogeneous traffic patterns from the AP-level measurement data without any assumptions and generates similar traffic patterns based on the data. The result shows that the difference between the sample generated by the proposed model and the collected data is up to 72.1% less than that reported in previous studies.
... The attention mechanism is applied to the seq2seq model to build the sequential model. The results show that this model outperforms other traditional prediction models by more than 12.31 percent [12]. ...
... Equations (12) to (17) of LSTM are as follows: ...
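The excerpt above is cut off at the snippet boundary. For reference, the six standard LSTM gate equations it presumably refers to (numbering follows the cited paper) are the usual formulation:

f_t = σ(W_f · [h_{t-1}, x_t] + b_f)
i_t = σ(W_i · [h_{t-1}, x_t] + b_i)
c̃_t = tanh(W_c · [h_{t-1}, x_t] + b_c)
c_t = f_t ⊙ c_{t-1} + i_t ⊙ c̃_t
o_t = σ(W_o · [h_{t-1}, x_t] + b_o)
h_t = o_t ⊙ tanh(c_t)

Here σ is the logistic sigmoid, ⊙ is element-wise multiplication, x_t is the input, h_t the hidden state, and c_t the cell state.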
Article
Full-text available
The IoT is the communication of sensing devices linked to the Internet in order to exchange data. IoT devices require highly reliable, efficient, and robust network conditions. Based on the enormous growth in devices and their connectivity, IoT contributes to the bulk of Internet traffic. Prediction of network traffic is a very important function of any network. Traffic prediction is important to ensure good system efficiency and the service quality of IoT applications, as congestion management, admission control, bandwidth allocation, and anomaly identification rely primarily on it. In this paper, a complete overview of IoT traffic forecasting models using classic time series and artificial neural networks is presented. For the prediction of IoT traffic, real network traces are used. Prediction models are evaluated using MAE, RMSE, and R-squared values. The experimental results indicate that LSTM- and FNN-based predictive models are highly sensitive and can therefore provide better performance as time-sequence forecast models than conventional traffic prediction techniques.
... For this purpose, the scenario is often divided into a regular grid of regions by aggregating the traffic demand per region. More complex dependencies at a cell level can be modeled more effectively by feature extraction based on traffic correlation [34] or Graph Neural Networks (GNN) [17], [35]. ...
... It is well known that deep-learning architectures outperform classical time series analysis models for cellular traffic forecasting, especially when available measurements present a fine temporal and spatial granularity [30], [54]. However, all the above-mentioned works exclusively consider network information or simple time-independent location-based factors (e.g., points of interest in [34]) as inputs to their models. It is still to be checked whether prediction accuracy can be improved by adding information related to external factors, such as social events. ...
Article
Full-text available
In cellular networks, a deep knowledge of the traffic demand pattern in each cell is essential in network planning and optimization tasks. However, a precise forecast of the traffic time series per cell is hard to achieve, due to the noise originating from abnormal local events. In particular, mass social events (e.g., concerts, conventions, sports events…) have a strong impact on traffic demand. In this paper, a data-driven model to estimate the impact of local events on cellular traffic is presented. The model is trained with a large dataset of geotagged social events taken from public event databases and hourly traffic data from a live Long Term Evolution (LTE) network. The resulting model is combined with a traffic forecast module based on a multi-task deep-learning architecture to predict the hourly traffic series with scheduled mass events. Model assessment is performed over a real dataset created with geolocated social event information collected from public event directories and hourly cell traffic measurements during two months in an LTE network. Results show that the addition of the proposed model significantly improves traffic forecasts in the presence of massive events.
... A common approach is to use deep learning to model the spatio-temporal dependence of traffic demand. The temporal aspect of traffic variations is often captured with recurrent neural networks based on Long Short-Term Memory (LSTM) units [14,[30][31][32]. Alternatively, in [33], a deep belief network and a Gaussian model are used to capture temporal dependencies of network traffic in a mesh wireless network. ...
... Alternatively, other authors model spatial dependencies of traffic carried in different cells. In [32], a general feature extractor is used with a correlation selection mechanism for modeling spatial dependencies among cells and an embedding mechanism to encode external information. In [15], to deal with an irregular cell distribution, the spatial relevancy among cells is modeled with a graph neural network based on distance among cell towers. ...
Article
Full-text available
Network dimensioning is a critical task in current mobile networks, as any failure in this process leads to degraded user experience or unnecessary upgrades of network resources. For this purpose, radio planning tools often predict monthly busy-hour data traffic to detect capacity bottlenecks in advance. Supervised Learning (SL) arises as a promising solution to improve predictions obtained with legacy approaches. Previous works have shown that deep learning outperforms classical time series analysis when predicting data traffic in cellular networks in the short term (seconds/minutes) and medium term (hours/days) from long historical data series. However, long-term forecasting (several months horizon) performed in radio planning tools relies on short and noisy time series, thus requiring a separate analysis. In this work, we present the first study comparing SL and time series analysis approaches to predict monthly busy-hour data traffic on a cell basis in a live LTE network. To this end, an extensive dataset is collected, comprising data traffic per cell for a whole country during 30 months. The considered methods include Random Forest, different Neural Networks, Support Vector Regression, Seasonal Auto Regressive Integrated Moving Average and Additive Holt–Winters. Results show that SL models outperform time series approaches, while reducing data storage capacity requirements. More importantly, unlike in short-term and medium term traffic forecasting, non-deep SL approaches are competitive with deep learning while being more computationally efficient.
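As an illustration of the non-deep supervised learning approach the abstract finds competitive for long-term forecasts, the sketch below trains a Random Forest on lagged monthly busy-hour values; the 12-month lag window, forest size, and synthetic series are assumptions for the example, not the paper's setup.

```python
# Non-deep SL baseline: Random Forest regression on lagged busy-hour traffic values.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def lag_matrix(series, n_lags):
    # row j holds series[j : j+n_lags]; the target is the value that follows it
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

monthly_busy_hour = np.random.rand(30)           # 30 months of traffic for one cell
X, y = lag_matrix(monthly_busy_hour, n_lags=12)  # predict next month from last 12
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[:-1], y[:-1])
next_month = model.predict(X[-1:])               # forecast for the most recent window
```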
... With the continuous expansion of the scale of large Internet enterprises and bank servers, the number of service users continues to grow, resulting in a surge in traffic data. To ensure network quality, enterprises and banks must accurately plan their network traffic and server resources [1,2]. Traditional server resource allocation and scheduling only provide feedback when the network is congested, which is inefficient. Traffic prediction can accurately predict traffic trends by analyzing the historical data of server equipment and can analyze network conditions in real time to make dynamic network adjustment and allocation decisions. ...
Article
Full-text available
Accurate server traffic prediction can help enterprises formulate network resource allocation strategies in advance and reduce the probability of network congestion. Traditional prediction models ignore the unique data characteristics of server traffic that can be used to optimize the prediction model, so they often cannot meet the long-term, high-precision forecasts that server traffic prediction requires. To solve this problem, this paper establishes a hybrid model, ARIMA-LSTM-CF, which combines the advantages of linear and nonlinear models, as well as the periodic fluctuation characteristics of server traffic data obtained from banks. In addition, this paper also uses an optimized K-means clustering method to extract the traffic data of workdays and non-workdays. The results show that the new hybrid model performs better than the single ARIMA and LSTM models in predicting the long-term trend of server traffic. RMSE (root mean square error) and MAE (mean absolute error) are reduced by 50%, and the R2 score reached 0.64. These results indicate that the model can effectively extract the data characteristics of server traffic and has accurate and stable long-term prediction ability.
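One common way to realize an ARIMA plus LSTM hybrid of this kind is to let ARIMA capture the linear component and an LSTM model the residuals; the sketch below follows that generic recipe and is not the paper's ARIMA-LSTM-CF itself (the model orders, window length, and training loop are illustrative assumptions).

```python
# Hedged sketch of an ARIMA + LSTM residual hybrid (illustrative, not ARIMA-LSTM-CF).
import numpy as np
import torch
import torch.nn as nn
from statsmodels.tsa.arima.model import ARIMA

traffic = np.random.rand(500).astype(np.float32)          # stand-in for server traffic

# 1) linear part: ARIMA one-step-ahead fitted values
arima_res = ARIMA(traffic, order=(2, 1, 2)).fit()
linear_part = arima_res.fittedvalues.astype(np.float32)
residuals = traffic - linear_part

# 2) nonlinear part: small LSTM trained to predict the next residual from the last 24
L = 24
X = torch.stack([torch.tensor(residuals[t - L:t]) for t in range(L, len(residuals))]).unsqueeze(-1)
y = torch.tensor(residuals[L:]).unsqueeze(-1)
lstm, head = nn.LSTM(1, 16, batch_first=True), nn.Linear(16, 1)
opt = torch.optim.Adam(list(lstm.parameters()) + list(head.parameters()), lr=1e-3)
for _ in range(50):
    out, _ = lstm(X)
    loss = nn.functional.mse_loss(head(out[:, -1, :]), y)
    opt.zero_grad(); loss.backward(); opt.step()

# 3) hybrid forecast = ARIMA forecast for the next step + LSTM-predicted residual
```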
... However, in [9], [10], where the goal is to optimize resource allocation using predicted traffic, evaluating the performance of network traffic prediction is challenging. Meanwhile, [6], [7] have also utilized a grid structure similar to [1]- [3], while [8] has used a private dataset and cell clustering based on the similarity of time-series trends. In a previous work [11], we proposed RNN-based models that used handover data for single-step prediction. ...
Preprint
This paper focuses on predicting downlink (DL) traffic volume in mobile networks while minimizing overprovisioning and meeting a given service-level agreement (SLA) violation rate. We present a multivariate, multi-step, and SLA-driven approach that incorporates 20 different radio access network (RAN) features, a custom feature set based on peak traffic hours, and handover-based clustering to leverage the spatiotemporal effects. In addition, we propose a custom loss function that ensures the SLA violation rate constraint is satisfied while minimizing overprovisioning. We also perform multi-step prediction up to 24 steps ahead and evaluate performance under both single-step and multi-step prediction conditions. Our study makes several contributions, including the analysis of RAN features, the custom feature set design, a custom loss function, and a parametric method to satisfy SLA constraints.
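The abstract does not give the exact form of its custom loss, but its core idea — penalizing SLA violations (under-prediction) far more than overprovisioning (over-prediction) — can be sketched as the asymmetric loss below; the penalty weights are illustrative assumptions.

```python
# Hedged sketch of an SLA-driven asymmetric loss: under-provisioning is penalized
# much more heavily than over-provisioning (weights are illustrative assumptions).
import torch

def sla_loss(pred, target, under_weight=10.0, over_weight=1.0):
    err = pred - target
    under = torch.clamp(-err, min=0.0)   # actual traffic exceeded the provisioned prediction
    over = torch.clamp(err, min=0.0)     # capacity provisioned above the actual traffic
    return (under_weight * under ** 2 + over_weight * over ** 2).mean()

loss = sla_loss(torch.tensor([9.0, 12.0]), torch.tensor([10.0, 10.0]))
```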
... A deep traffic predictor (DeepTP) model to forecast long-period cellular network traffic was proposed in [21]. The results showed that the DeepTP model outperformed other traffic forecast models by more than 12.3%. ...
Article
Full-text available
There is substantial demand for high network traffic due to the emergence of new highly demanding services and applications such as the internet of things (IoT), big data, blockchains, and next-generation networks like 5G and beyond. Therefore, network resource planning and forecasting play a vital role in better resource optimization. Accordingly, forecasting accuracy has become essential for network operation and planning to maintain the minimum quality of service (QoS) for real-time applications. In this paper, a hybrid network-bandwidth slice forecasting model that combines a long short-term memory (LSTM) neural network with various local smoothing techniques to enhance the network forecasting model's accuracy was proposed and analyzed. The results show that the proposed hybrid forecasting model can effectively improve the forecasting accuracy with minimal data loss.
... In the context of traffic prediction, Trinh et al. [3] compared LSTM to MLP and ARIMA and showed that LSTM significantly outperforms the other models. Feng et al. [20] proposed DeepTP, an LSTM-based network which models the spatial dependence of time-series and temporal changes, and showed that the proposed framework outperforms LSTM without enhancements and traditional methods such as ARIMA. Chen et al. [21] proposed a clustered LSTM-based model for multivariate time-series modeling. ...
Preprint
Full-text available
Mobile traffic prediction is of great importance on the path of enabling 5G mobile networks to perform smart and efficient infrastructure planning and management. However, available data are limited to base station logging information. Hence, training methods for generating high-quality predictions that can generalize to new observations on different parties are in demand. Traditional approaches require collecting measurements from different base stations and sending them to a central entity, followed by performing machine learning operations using the received data. The dissemination of local observations raises privacy, confidentiality, and performance concerns, hindering the applicability of machine learning techniques. Various distributed learning methods have been proposed to address this issue, but their application to traffic prediction has yet to be explored. In this work, we study the effectiveness of federated learning applied to raw base station aggregated LTE data for time-series forecasting. We evaluate one-step predictions using 5 different neural network architectures trained with a federated setting on non-iid data. The presented algorithms have been submitted to the Global Federated Traffic Prediction for 5G and Beyond Challenge. Our results show that the learning architectures adapted to the federated setting achieve equivalent prediction error to the centralized setting, pre-processing techniques on base stations lead to higher forecasting accuracy, while state-of-the-art aggregators do not outperform simple approaches.
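The federated setting the preprint describes typically boils down to federated averaging (FedAvg) of per-base-station model weights. A bare-bones sketch is given below; the local model, the synthetic client data, and the number of rounds are illustrative assumptions, not the challenge setup.

```python
# Bare-bones FedAvg sketch for per-base-station traffic forecasters (illustrative only).
import copy
import torch
import torch.nn as nn

def local_update(model, data, epochs=1, lr=1e-3):
    model = copy.deepcopy(model)                 # each base station trains its own copy
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    X, y = data
    for _ in range(epochs):
        loss = nn.functional.mse_loss(model(X), y)
        opt.zero_grad(); loss.backward(); opt.step()
    return model.state_dict()

def fed_avg(states):
    avg = copy.deepcopy(states[0])
    for key in avg:                              # unweighted average of client weights
        avg[key] = torch.stack([s[key] for s in states]).mean(dim=0)
    return avg

global_model = nn.Sequential(nn.Linear(24, 32), nn.ReLU(), nn.Linear(32, 1))
clients = [(torch.randn(64, 24), torch.randn(64, 1)) for _ in range(5)]   # 5 base stations

for _ in range(10):                              # communication rounds
    states = [local_update(global_model, data) for data in clients]
    global_model.load_state_dict(fed_avg(states))
```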
... Furthermore, various exogenous factors are considered in some studies for cellular traffic prediction [48]. For instance, Assem et al. [49] proposed ST-DenNetFus with a late fusion method and introduced the day of the week, functional regions, and crowd mobilities. ...
Article
Cellular traffic prediction is of great importance for operators to manage network resources and make decisions. Traffic is highly dynamic and influenced by many exogenous factors, which would lead to the degradation of traffic prediction accuracy. This paper proposes an end-to-end framework with two variants to explicitly characterize the spatiotemporal patterns of cellular traffic among neighboring cells. It uses convolutional neural networks with an attention mechanism to capture the spatial dynamics and Kalman filter for temporal modelling. Besides, we can fully exploit the auxiliary information such as social activities to improve prediction performance. We conduct extensive experiments on three real-world datasets. The results show that our proposed models outperform the state-of-the-art machine learning techniques in terms of prediction accuracy.
... Among them, multi-site air quality prediction belongs to the category of spatial-temporal sequence prediction. Spatial-temporal sequence prediction has large-scale applications in our daily life, such as air quality prediction (Xu et al., 2018; Amato et al., 2020; Wang et al., 2020a; Pak et al., 2020; Zeng et al., 2021; Zhou et al., 2021), cellular flow prediction (Chen et al., 2018a; Feng et al., 2018; Zhang et al., 2018; Zhang et al., 2019; Zeng et al., 2020), traffic flow prediction (Cui et al., 2019; Guo et al., 2019; Zhao et al., 2019; Xiao et al., 2020a), etc. With the further development of deep learning, spatial-temporal sequence prediction has been extensively studied. ...
Article
Full-text available
The immune ability of the elderly is not strong, the functions of the body are in a stage of degeneration, the ability to clear PM2.5 is reduced, and the cardiopulmonary system is easily affected. Accurate prediction of PM2.5 can provide guidance for the travel of the elderly, thereby reducing the harm of PM2.5 to the elderly. In PM2.5 prediction, existing works usually use a shallow graph neural network (GNN) and a temporal extraction module to model spatial and temporal dependencies, respectively, and do not model temporal and spatial dependencies in a unified way. In addition, shallow GNNs cannot capture long-range spatial correlations. External characteristics such as air humidity are also not considered. We propose a spatial-temporal graph ordinary differential equation network (STGODE-M) to tackle these problems. We capture spatial-temporal dynamics through a tensor-based ordinary differential equation, so we can build deeper networks and exploit spatial-temporal features simultaneously. In addition, in the construction of the adjacency matrix, we not only used the Euclidean distance between the stations, but also used the wind direction data. Besides, we propose an external feature fusion strategy that uses air humidity as an auxiliary feature for feature fusion, since air humidity is also an important factor affecting PM2.5 concentration. Finally, our model is evaluated on the home-based care parks atmospheric dataset, and the experimental results show that our STGODE-M can more fully capture the spatial-temporal characteristics of PM2.5, achieving superior performance compared to the baseline. Therefore, it can provide a better guarantee for the healthy travel of the elderly.
... Stacked autoencoder (AE)-based frameworks and long short-term memory (LSTM) networks are used to find the spatiotemporal correlations in wireless traffic data by Wang et al. [79]. The authors in Refs. [131,132] applied CNN and LSTM methods to extract spatiotemporal attributes to predict network traffic with a higher level of accuracy than conventional methods. Wang and his team used graph neural networks to understand the geospatial characteristics of wireless mobile traffic data and discuss possible applications in social activity prediction [118]. ...
Article
Full-text available
The convenience of availing quality services at affordable costs anytime and anywhere makes mobile technology very popular among users. Due to this popularity, there has been a huge rise in mobile data volume, applications, types of services, and number of customers. Furthermore, due to the COVID‐19 pandemic, the worldwide lockdown has added fuel to this increase as most of our professional and commercial activities are being done online from home. This massive increase in demand for multi‐class services has posed numerous challenges to wireless network frameworks. The services offered through wireless networks are required to support this huge volume of data and multiple types of traffic, such as real‐time live streaming of video, audio, text, images etc., at a very high bit rate with a negligible delay in transmission and at the permissible vehicular speed of the customers. Next‐generation wireless networks (NGWNs, i.e. 5G networks and beyond) are being developed to accommodate the service qualities mentioned above and many more. However, achieving all the desired service qualities to be incorporated into the design of the 5G network infrastructure imposes large challenges for designers and engineers. It requires the analysis of a huge volume of network data (structured and unstructured) received or collected from heterogeneous devices, applications, services, and customers and the effective and dynamic management of network parameters based on this analysis in real time. In the ever‐increasing network heterogeneity and complexity, machine learning (ML) techniques may become an efficient tool for effectively managing these issues. In recent times, the progress of artificial intelligence and ML techniques has increased interest in their application in the networking domain. This study discusses current wireless network research, briefly discusses ML methods that can be effectively applied to the wireless networking domain, presents some tools available to support and customise efficient mobile system design, and outlines some unresolved issues for future research directions.
... Other authors have analyzed the spatial interdependence of the traffic transmitted to various cells. The authors of [29] used correlation selection with a general feature extractor mechanism to model spatial relationships between cells and an embedding approach to incorporate external information from various sources. To cope with inconsistent cell coverage, the authors in reference [30] modeled spatial coherence between cells using a graph neural network dependent on cell tower range. ...
Article
Full-text available
Industry 4.0, also known as the Internet of things, is a concept that encompasses the joint applicability of operation, the Internet, and information technologies to expand the efficiency expectation of automation to include green and flexible processes and innovative products and services. Industrial network infrastructures must be modified to accommodate extra traffic from a variety of technologies in order to achieve this integration. In order to successfully implement cutting-edge wireless technologies, a high quality of service (QoS) must be provided to end users. It is thus important to keep an eye on the functioning of the whole network without impacting base station throughput. Improved network performance is constantly needed, even for already-deployed cellular networks, such as the 4th generation (4G) and 3rd generation (3G). For the purpose of forecasting network traffic, an integrated model based on the long short-term memory (LSTM) model was used to combine clustering rough k-means (RKM) and fuzzy c-means (FCM). Clustering granules derived from FCM and RKM were also utilized to examine the network data for each calendar year. The novelty of our proposed model is the integration of the prediction and forecasting results obtained using existing prediction models with centroids of clusters. The WIDE backbone network's live network traffic statistics were used to evaluate the proposed solution. The integrated model's outcomes were assessed using a variety of statistical markers, including mean square error (MSE), root mean square error (RMSE), and standard error. The suggested technique was able to provide findings that were very accurate. The prediction error of LSTM with FCM was lower, with an MSE of 0.00783 and an RMSE of 0.0885 in the training phase, whereas the predictions of LSTM with RKM had an MSE of 0.00564 and an RMSE of 0.7511. Finally, the suggested model may substantially increase the prediction accuracy attained using FCM and RKM clustering.
... Other authors have analyzed the spatial interdependence of the traffic transmitted to various cells. The authors of [29] used correlation selection with a general feature extractor mechanism to model spatial relationships between cells and an embedding approach to incorporate external information from various sources. To cope with inconsistent cell coverage, the authors in reference [30] modeled spatial coherence between cells using a graph neural network dependent on cell tower range. ...
... In [8], a deep-learning-based end-to-end model, that is, Deep Traffic Predictor (DeepTP), was developed to forecast the load demand from the spatial-dependent and long-period cellular traffic where a recurrent neural network (RNN)-based seq2seq model with attention mechanism was adopted. The authors of [9] proposed the traffic load prediction based on the random connectivity long short-term memory (LSTM) network which can reduce the computational complexity. ...
Article
Full-text available
This paper investigates the prediction of mobile traffic load based on four variants of recurrent neural networks, which are the simple long short‐term memory (LSTM), stacked LSTM, gated recurrent unit (GRU), and bidirectional LSTM. In the considered schemes, the mobile traffic load 15 min ahead of time is estimated based on the previous mobile traffic load data. The performance of the proposed scheme is verified using realistic traffic load data collected from a base station located in Kabul city, Afghanistan, belonging to the SALAAM telecommunication operator, during December 2020 and January 2021. Through performance evaluation, the authors confirm that the traffic load can be predicted with high accuracy using the considered schemes and that the GRU‐based scheme outperforms the other schemes in terms of accuracy.
... Further developments in technology have led to a large number of relevant deep learning algorithms being applied to traffic speed prediction, including convolutional neural networks, long short-term memory (LSTM) networks (Tian et al., 2018), convolutional long short-term memory (ConvLSTM) networks, and sequence-to-sequence (seq2seq) models (Feng et al., 2018), etc. To effectively improve the prediction accuracy and adapt to changeable traffic conditions, hybrid models that combine parametric models and non-parametric models have become a new research direction. ...
Article
Full-text available
Accurate travel time information is essential for logistics vehicles to reserve the most suitable parking lot in logistics centers. The purpose of this study is to explore how the uncertainty of traffic time prediction affects parking lot reservation near logistics centers. A hybrid model integrating a convolutional long short-term memory network and an attention mechanism is proposed to provide reliable travel time prediction intervals. Furthermore, a reliability-based parking lot reservation model is developed by explicitly considering logistics vehicles' time probability. Several benchmark models are compared with the proposed traffic speed prediction model. The performance of the parking lot reservation model is illustrated by travel behavior questionnaire data and global positioning system data collected from Beijing, China. The results illustrate that the proposed prediction model exhibits better accuracy than the benchmark models. Moreover, it is found that the travel time prediction interval can improve the reliability and stability of travel time and provide reliable time information for parking lot reservation.
... Key nodes in such networks are equipped with computing and storage capabilities, and the characteristics of the network traffic change before and after the network traffic passes through these key nodes. In an intelligent network, accurate and effective prediction can allow for an understanding of the network traffic characteristics in advance, which can be used to improve network resource utilization and prevent network congestion [1,2]. Therefore, it is especially important to establish an efficient and reliable prediction model for network traffic. ...
Article
Full-text available
Network traffic prediction is an important tool for the management and control of IoT, and timely and accurate traffic prediction models play a crucial role in improving the IoT service quality. The degree of burstiness in intelligent network traffic is high, which creates problems for prediction. To address the problem faced by traditional statistical models, which cannot effectively extract traffic features when dealing with inadequate sample data, in addition to the poor interpretability of deep models, this paper proposes a prediction model (fusion prior knowledge network) that incorporates prior knowledge into the neural network training process. The model takes the self-similarity of network traffic as a priori knowledge, incorporates it into the gating mechanism of the long short-term memory neural network, and combines a one-dimensional convolutional neural network with an attention mechanism to extract the temporal features of the traffic sequence. The experiments show that the model can better recover the characteristics of the original data. Compared with the traditional prediction model, the proposed model can better describe the trend of network traffic. In addition, the model produces an interpretable prediction result with an absolute correction factor of 76.4%, which is at least 10% better than the traditional statistical model.
... To develop a better offloading technique, we must first determine the traffic data for each computing activity, also known as the computation offloading data volume. Unlike previous techniques for describing computational functionality, a deep LSTM-based learning algorithm is used to anticipate computational tasks [37]. Let V_k ∈ {V_1, V_2, ..., V_K} denote the data size. ...
Article
Full-text available
The use of application media, gaming, entertainment, and healthcare engineering has expanded as a result of the rapid growth of mobile technologies. This technology overcomes the traditional computing methods in terms of communication delay and energy consumption, thereby providing high reliability and bandwidth for devices. In today's world, mobile edge computing is improving in various forms so as to provide better output, and there is no room for a simple computing architecture in MEC. Therefore, this paper proposes a secure and energy-efficient computational offloading scheme using LSTM. The prediction of the computational tasks is done using the LSTM algorithm, the strategy for computation offloading of mobile devices is based on the prediction of tasks, and the migration of tasks for the scheme of edge cloud scheduling helps to optimize the edge computing offloading model. Experiments show that our proposed architecture, which consists of an LSTM-based offloading technique and routing (LSTMOTR) algorithm, can efficiently decrease total task delay with growing data and subtasks, reduce energy consumption, and bring much security to the devices due to the firewall nature of LSTM.
... Cloud computing and cloud storage in areas close to a mobile user are supported by edge computing, providing 5G services to mobile devices using servers at the edge of the internet (including Wi-Fi access points), routers, base stations, switches, cloud platforms or data centers, as well as any other devices with storage and computational capabilities. Although edge computing is accepted as an additional mode of cloud computing, the simultaneous processing of data requests and calculation tasks still creates a significant demand on intelligent communication systems in the age when 5G mobile communications are being commercialized (M. Chen et al. (2018); J. Feng et al. (2018); G. Orsini et al. (2016)). Intelligent gadgets have varied computing capabilities. ...
Article
Mobile technologies are evolving rapidly in every aspect, utilizing every single resource in the form of applications that create advancements in day-to-day life. These technological advancements overcome traditional computing methods, which increase communication delay and energy consumption for mobile devices. In today's world, Mobile Edge Computing is evolving as a way to improve on these limitations so as to provide better output to end users. This paper proposes a secure and energy-efficient computational offloading scheme using LSTM. The prediction of the computational tasks is done using the LSTM algorithm. A strategy for computation offloading based on the prediction of tasks, and the migration of tasks in the edge cloud scheduling scheme based on a reinforcement learning routing algorithm, help to optimize the edge computing offloading model. Experimental results show that our proposed algorithm, the Intelligent Energy Efficient Offloading Algorithm (IEEOA), can efficiently decrease total task delay and energy consumption, and bring much security to the devices due to the firewall nature of LSTM.
... They propose an ensemble system that leverages convolutional LSTM and 3D-ConvNets structures to model long-term trends and short-term variations of the mobile traffic volume, respectively. Results suggest that the proposed system provided highly accurate long-term (10-h long) traffic predictions, while operating with short observation intervals (2 h). The authors of [38] propose an LSTM-based end-to-end model (DeepTP) to forecast traffic demands from spatial-dependent and long-period cellular traffic. DeepTP outperforms ARIMA, SVR, and GRU (although to a lesser extent) based on MRSE. ...
Article
Mobile network traffic prediction is an important input into network capacity planning and optimization. Existing approaches may lack the speed and computational complexity to account for bursting, non‐linear patterns, or other important correlations in time series mobile network data. We compare the performance of two deep learning (DL) architectures, long short‐term memory (LSTM) and gated recurrent unit (GRU), and two conventional machine learning (ML) architectures—Random Forest and Decision Tree—for predicting mobile Internet traffic using 2 months of Telecom Italia data for Milan. K‐Means clustering was used a priori to group cells based on Internet activity, and the Grid Search method was used to identify the best configurations for each model. The predictive quality of the models was evaluated using root mean squared error and mean absolute error. Both DL algorithms were effective in modeling Internet activity and seasonality, both within days and across 2 months. We find variations in performance across clusters within the city. Overall, the DL models outperformed the conventional ML models, and the LSTM outperformed the GRU in our experiments.
... In [43], an ARIMA model was applied to predict the usage rate of mobile traffic volume. Deep learning based on LSTM units has also been used [44][45][46]. In [47], a convolutional neural network was used for prediction and for modelling the spatial dependencies of traffic, similar to the approach in [48]. ...
Article
Full-text available
The evolution of cellular technology has led to explosive growth in cellular network traffic. Accurate time-series models to predict cellular mobile traffic have become very important for increasing the quality of service (QoS) of a network. The modelling and forecasting of cellular network loading play an important role in achieving the most favourable resource allocation through convenient bandwidth provisioning while simultaneously preserving the highest network utilization. The novelty of the proposed research is to develop a model that can help intelligently predict traffic load in a cellular network. In this paper, a model that combines single-exponential smoothing with long short-term memory (SES-LSTM) is proposed to predict cellular traffic. A min-max normalization model was used to scale the network loading. The single-exponential smoothing method was applied to adjust the volumes of network traffic, due to network traffic being very complex and having different forms. The output from the single-exponential model was processed by an LSTM model to predict the network load. The intelligent system was evaluated using real cellular network traffic that had been collected in a Kaggle dataset. The results of the experiment revealed that the proposed method had superior accuracy, achieving R-square metric values of 88.21%, 92.20%, and 89.81% for three one-month time intervals, respectively. It was observed that the prediction values were very close to the observations. A comparison of the prediction results between the existing LSTM model and our proposed system is presented. The proposed system achieved superior performance for predicting cellular network traffic.
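The preprocessing chain described here — min-max normalization followed by single-exponential smoothing before the LSTM — can be sketched as below; the smoothing factor and series length are illustrative assumptions.

```python
# Sketch of the SES-LSTM preprocessing: min-max scaling, then single exponential smoothing.
import numpy as np

def min_max(series):
    lo, hi = series.min(), series.max()
    return (series - lo) / (hi - lo)

def single_exponential_smoothing(series, alpha=0.3):
    smoothed = np.empty_like(series)
    smoothed[0] = series[0]
    for t in range(1, len(series)):
        smoothed[t] = alpha * series[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed

raw = np.random.rand(720)                                # e.g. one month of hourly cell traffic
prepared = single_exponential_smoothing(min_max(raw))    # this series would feed the LSTM
```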
... Accurate network traffic prediction is the basis of network performance optimization and integrated network management [1]. The prediction results can be used for traffic engineering, anomaly detection, and energy consumption management [2,3]. Especially in the last decade, the complexity and diversity of networks and communication scenarios have increased dramatically, which has prompted researchers to propose many technologies, such as ultra-dense deployment of cellular cells, device-to-device (D2D) networking, mobile virtual reality (MVR), and mobile edge computing, to improve network capacity and service quality [4][5][6]. ...
Article
Full-text available
Accurate and real-time network traffic flow forecasting plays an important role in network management. Especially at present, virtual reality (VR), artificial intelligence (AI), vehicle-to-everything (V2X), and other technologies are closely combined through the mobile network, which greatly increases human-computer interaction activities. At the same time, they require high-throughput, low-delay, and highly reliable service guarantees. In order to achieve on-demand, real-time, high-quality network service, we must accurately grasp the dynamic changes of network traffic. However, due to the increase of client mobility and application behavior diversity, the complexity and dynamics of network traffic in the temporal domain and the spatial domain increase sharply. To accurately capture the spatiotemporal features, we propose the spatial-temporal graph convolution gated recurrent unit (GC-GRU) model, which integrates the graph convolutional network (GCN) and the gated recurrent unit (GRU) together. In this model, the GCN structure handles the spatial features of traffic flow using the network topology, and the GRU is used to further process spatiotemporal features. Experiments show that the GC-GRU model has better prediction performance than other baseline models and can better capture spatial-temporal correlations in traffic flows.
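A compact sketch of the GCN-plus-GRU idea (a spatial graph convolution at each time step, then a GRU over the resulting sequence) is shown below. The identity adjacency stand-in, layer sizes, and single-step output are illustrative assumptions, not the GC-GRU paper's exact design.

```python
# Hedged sketch of a graph-convolution + GRU forecaster (illustrative, not GC-GRU itself).
import torch
import torch.nn as nn

class GraphConvGRU(nn.Module):
    def __init__(self, n_nodes, hidden=32):
        super().__init__()
        self.theta = nn.Linear(1, hidden)      # graph convolution weights
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x, a_norm):
        # x: (B, T, N, 1) traffic per node over time; a_norm: (N, N) normalized adjacency
        spatial = torch.relu(torch.einsum("ij,btjf->btif", a_norm, self.theta(x)))  # (B, T, N, H)
        B, T, N, H = spatial.shape
        seq = spatial.permute(0, 2, 1, 3).reshape(B * N, T, H)   # one sequence per node
        _, h = self.gru(seq)                                     # h: (1, B*N, H)
        return self.out(h[0]).reshape(B, N, 1)                   # next-step traffic per node

n_nodes = 10
adj = torch.eye(n_nodes)                            # stand-in for a normalized adjacency matrix
model = GraphConvGRU(n_nodes)
pred = model(torch.randn(4, 12, n_nodes, 1), adj)   # (4, 10, 1)
```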
... However, the study did not use any preprocessing techniques. J. Feng et al. [15] proposed a deep traffic predictor (DeepTP) model to forecast long-period cellular network traffic. The study showed that the model outperformed other traffic forecast models by more than 12.3%. ...
Article
Full-text available
The demand for high steady-state network traffic utilization is growing exponentially. Therefore, traffic forecasting has become essential for powering greedy applications and services such as the internet of things (IoT) and Big Data for 5G networks, for better resource planning, allocation, and optimization. The accuracy of forecasting modeling has become crucial for fundamental network operations such as routing management, congestion management, and guaranteeing quality of service overall. In this paper, a hybrid network forecast model was analyzed; the model combines a nonlinear autoregressive neural network (NARNN) and various smoothing techniques, namely, local regression (LOESS), moving average, locally weighted scatterplot smoothing (LOWESS), the Savitzky-Golay (sgolay) filter, robust loess (RLOESS), and robust locally weighted scatterplot smoothing (RLOWESS). The effects of applying smoothing techniques with varied smoothing windows were shown, and the performance of the hybrid NARNN and smoothing techniques is discussed. The results show that the hybrid model can effectively be used to enhance forecasting performance in terms of forecasting accuracy, with the assistance of the smoothing techniques, which minimized data losses. In this work, root mean square error (RMSE) is used as the performance measure, and the results were verified via statistical significance tests. Keywords: autoregressive neural network; bandwidth slice; forecast; local smoothing.
... Feng et al. [23] proposed Deep Traffic Predictor (DeepTP) to forecast traffic demands from spatial-dependent and long-period cellular traffic, which can be divided into two components: a general feature extractor for modeling spatial dependencies and encoding the external information, and a sequential module for modeling complicated temporal variations. ...
Article
In recent years, with the continuous development of information technology and the rapid growth of network scale, network monitoring and management have become more and more important. Network traffic is an important part of the network state. In order to ensure the normal operation of the network, improve its availability, find network faults in time, and deal with network attacks, it is necessary to detect abnormal traffic in the network. Abnormal traffic detection is of great significance in actual network management. Therefore, in order to improve the accuracy and efficiency of network traffic anomaly detection, this paper proposes a comprehensive anomaly detection method based on improved GRU traffic prediction and improved K-means clustering, and cascades the traffic prediction and clustering to achieve anomaly detection. Firstly, an improved highway-GRU algorithm, HS-GRU (an improved gated recurrent unit neural network based on a highway network and the STL algorithm), is proposed, which combines the STL decomposition algorithm with a highway GRU neural network and uses this improved algorithm to predict traffic. Then, we propose the EFMS-Kmeans algorithm (an improved clustering algorithm that combines a Mean Shift algorithm based on electrostatic force with K-means clustering) to address the shortcoming of traditional K-means clustering, which cannot automatically determine the number of clusters. The sum of squared errors (SSE) method and the silhouette coefficient method were used to double-check the clustering effect. After determining the cluster centers, the potential energy gradient was directly used for anomaly detection by using the threshold method, which considers the local characteristics of the data and ensures the accuracy of anomaly detection. The simulation results show that the anomaly detection algorithm based on HS-GRU and EFMS-Kmeans clustering proposed in this paper can effectively improve the accuracy of traffic anomaly detection and has important application value.
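The predict-then-cluster pipeline ultimately flags anomalies by thresholding how far observed traffic deviates from what is expected. A much-simplified residual-threshold sketch of that final step is given below (the paper itself uses a potential-energy gradient on the clustered data; the k-sigma threshold here is an illustrative assumption).

```python
# Simplified sketch of prediction-based anomaly flagging: large residuals between
# predicted and observed traffic are marked anomalous (threshold is an assumption).
import numpy as np

def flag_anomalies(observed, predicted, k=3.0):
    residual = np.abs(observed - predicted)
    threshold = residual.mean() + k * residual.std()
    return residual > threshold               # boolean mask of anomalous time steps

observed = np.random.rand(288)                # one day of 5-minute traffic samples
predicted = observed + np.random.normal(0, 0.01, 288)
anomalies = flag_anomalies(observed, predicted)
```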
... The output of the LSTM network is passed to a fully connected neural network, which will finish the traffic prediction. Feng et al. [70] also introduced an LSTM-based model, namely, the deep traffic predictor (DeepTP), which forecasts BS traffic demands. In comparison to the solution proposed in [71], DeepTP adopts a features extractor module that employs the embedding and attention mechanism. ...
Article
Full-text available
It is challenging to deal with the Internet congestion problem because of several factors such as ever-growing traffic and distributed network architecture. The congestion problem can be solved or alleviated by various methods, including rate control, bandwidth-guarantee routing and bandwidth reservation. We use the term broad-sensed Internet congestion control and avoidance (BICC&A) to generally denote all of the above methods. Most BICC&A solutions depend on or benefit from the knowledge of network conditions, including traffic status (type and volume), available bandwidth and topology. In this paper, we present a comprehensive survey of the applications of machine learning to network condition acquirement methods for BICC&A and specific BICC&A methods. First, we provide an overview of the background knowledge of BICC&A and machine learning. Then, we provide detailed reviews on the applications of machine learning techniques to network condition acquirement methods for BICC&A and to specific BICC&A methods. Finally, we outline important research opportunities.
Article
Time series forecasting has gained significant traction in LTE networks as a way to enable dynamic resource allocation, upgrade planning, and anomaly detection. This work investigates short-term key performance indicator (KPI) forecasting for rural fixed wireless LTE networks. We show that rural fixed wireless LTE KPIs have shorter temporal dependencies compared to urban mobile networks. Second, we identify that the inclusion of environmental exogenous features yields minimal accuracy improvements. Finally, we find that sequence-to-sequence-based (Seq2Seq) models outperform simpler recurrent neural network (RNN) models, such as long short-term memory (LSTM) and gated recurrent unit (GRU), and random forest (RF).
Article
Full-text available
This paper presents a review of the literature on network traffic prediction, while also serving as a tutorial on the topic. We examine works based on autoregressive moving average models, like ARMA, ARIMA and SARIMA, as well as works based on Artificial Neural Network approaches, such as RNN, LSTM, GRU, and CNN. In all cases, we provide a complete and self-contained presentation of the mathematical foundations of each technique, which allows the reader to get a full understanding of the operation of the different proposed methods. Further, we perform numerical experiments based on real data sets, which allows us to compare the various approaches directly in terms of fitting quality and computational costs. We make our code publicly available, so that readers can readily access a wide range of forecasting tools, and possibly use them as benchmarks for more advanced solutions.
Article
Full-text available
With the development of 5G networks, cellular wireless networks are becoming more diverse and intelligent. As an important part of intelligent network management, wireless network traffic prediction has attracted more attention. Meanwhile, low delay communication is also an important part of the prediction task that needs to be considered. However, traditional deep learning models have many drawbacks in traffic prediction, such as excessive running time and computing resources. To address these issues, especially jointly considering effectiveness and efficiency, we propose a stacked broad learning system with multitask learning method for traffic flow prediction, called MTL-SBLS. Specifically, we use related tasks with similar change patterns as input to the prediction model and share more relevant features through multi-task learning. The multi-layer stacking structure of stacking broad learning system can effectively capture the traffic data features and ensure high prediction performance. When stacking new blocks, the fixed structure and weight of the underlying basic broad learning system blocks ensure that the newly generated stacking broad learning system still has a low computational cost. Finally, the experiments on three real data sets demonstrate that the MTL-SBLS model outperforms the other existing prediction methods (93.38% prediction accuracy on average). Furthermore, the MTL-SBLS model can maintain a running time of less than 10 sec on all three datasets, indicating that it is efficient. Thus, the MTL-SBLS model is proved to improve the accuracy of traffic flow prediction while maintaining low complexity and running time.
Chapter
With the rapid development of technology and the increasing complexity of the network, hosts commonly communicate with each other, and network traffic therefore exhibits interactive characteristics. Besides, considering its natural temporal characteristics as a time series, it is important to extract these features for accurate traffic prediction, which can benefit network management. For the interactive characteristics, a CNN model can effectively recognize them by converting the interactive traffic matrices into images as input. For the temporal characteristics, the CW-RNN model performs well on time-series prediction problems. Based on these, we propose the network traffic prediction algorithm CCRNN (Clockwork Convolutional Recurrent Neural Network), which combines a convolutional structure and a recurrent structure for prediction. In addition, for predicting traffic at different time granularities, this paper improves the activation mechanism of the CW-RNN. Our analysis and examination of the algorithm are based on the Abilene public dataset. The experimental results, compared with other models, indicate that the proposed network traffic forecasting model CCRNN can achieve good predictive performance. Keywords: CCRNN; traffic prediction; interactive characteristic
Article
This study presents a novel clustering-based algorithm to mitigate the demand-forecasting errors of newly deployed LTE (Long-Term Evolution) cells with insufficient historical data. The number of mobile network users and their usage are growing day by day, so new base stations are set up every day, and the newly deployed cells do not have enough historical data for forecasting. We developed a clustering-based algorithm to overcome this problem. We compared our approach with different forecasting methods such as classical time series methods, time series decomposition-based methods, and deep neural network (NN) methods. We tested our clustering-based solution against other approaches using two years of daily historical performance data from seventy LTE cells. We collected this data from a Tier-1 Mobile Network Operator (MNO). We also analyzed the clustering features, benchmarked their contribution to the solution, and measured the error rate with MAPE (Mean Absolute Percentage Error). As a result, we decreased the previous forecasting error rate from 133% to approximately 35%, showing that our novel algorithm is an efficient tool for this process.
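For reference, the MAPE metric used in this evaluation can be computed as follows (the numbers are invented purely for illustration):

# Mean Absolute Percentage Error as used in the evaluation above.
import numpy as np

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

print(mape([120, 80, 150], [100, 95, 140]))   # prints about 14.03 (percent)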
Article
Cellular networks are important for the success of modern communication systems, which support billions of mobile users and devices. Powered by artificial intelligence techniques, cellular networks are becoming increasingly smarter, and cellular traffic prediction is an important basis for realizing various applications that have originated from this trend. In this survey, we review the relevant studies on cellular traffic prediction and classify the prediction problems into temporal and spatiotemporal prediction problems. The prediction models with artificial intelligence are categorized into statistical, machine learning, and deep learning models and then compared. Various applications based on cellular traffic prediction are summarized along with their current progress. Potential research directions are pointed out for future research. To the best of our knowledge, this paper is the first comprehensive survey on cellular traffic prediction.
Article
Weather-related phenomena such as clouds, rain, and snow affect the performance of radio links. To reduce the adverse effects of radio link failures on the user experience, mobile operators require intelligent monitoring systems to predict link failures and take actions before they happen. In this study, we show how machine learning can be used for prediction using a real-world telecom operator dataset. We propose a novel architecture to process time-series data and non-time-series data together in the same neural model to achieve better prediction performance. We compare our model with traditional approaches such as logistic regression (LR), support vector machines (SVM), and Long Short-Term Memory (LSTM). Through experimental evaluations, we show that the F1-score of our proposed model is 0.638, whereas for the pure LSTM model it is 0.601. The SVM and LR methods perform significantly worse, with F1-scores of 0.455 and 0.105, respectively.
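One common way to combine a time-series branch with a non-time-series branch in a single neural model, loosely illustrating the idea above (a sketch, not the authors' architecture; window length, feature counts, and layer sizes are assumptions):

# Toy model joining an LSTM branch (time series) with a dense branch (static features).
import torch
import torch.nn as nn

class FailurePredictor(nn.Module):
    def __init__(self, ts_features=4, static_features=6, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(ts_features, hidden, batch_first=True)
        self.static = nn.Linear(static_features, 16)
        self.head = nn.Linear(hidden + 16, 1)

    def forward(self, ts, static):
        _, (h, _) = self.lstm(ts)                          # last hidden state summarises the window
        merged = torch.cat([h[-1], torch.relu(self.static(static))], dim=1)
        return torch.sigmoid(self.head(merged))            # probability of a link failure

model = FailurePredictor()
scores = model(torch.rand(8, 24, 4), torch.rand(8, 6))     # 8 samples: 24-step window + 6 static features
print(scores.shape)                                        # torch.Size([8, 1])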
Article
Full-text available
Making machines learn is the new trend of the era, and hence the use of Machine Learning is gathering pace. From small businesses to renowned companies like Google and Microsoft, all are adopting this new phase of technology. The world is moving rapidly towards 5th Generation computing, and to keep up with the present wave of Machine Learning applications, one must know the benefits of its usage. Machine Learning provides numerous techniques that can help business, health care, and many scientific projects. Besides these benefits, Machine Learning is a technology that can be incorporated into other technologies to get better results. Cryptography is one such field where Machine Learning can do wonders. Recent studies have been focusing on various application areas of machine learning in cryptography, and this research is still ongoing. This paper focuses on the applications of machine learning techniques in cryptography to achieve optimal network security, which is a top concern in today's technical world. The paper also provides implications of machine learning in cryptanalysis for future reference.
Article
Full-text available
Operators of modern mobile networks are faced with significant challenges in providing the requested level of service to an ever-increasing number of user entities. Advanced machine learning techniques based on deep architectures and appropriate learning methods are recognized as promising ways of tackling these challenges in many aspects of mobile networks, such as mobile data and mobility analysis, network control, network security, and signal processing. Having first presented the background of deep learning and related technologies, the paper goes on to present the architectures used for deployment of deep learning in mobile networks. The paper continues with an overview of applications and services related to the new generation of mobile networks that employ deep learning methods. Finally, the paper presents a practical use case of modulation classification as an implementation of deep learning in an application essential for modern spectrum management. We complete this work by pinpointing future directions for research.
Conference Paper
We propose an AI-assisted intent-based traffic grooming scheme in a 5G optical access network. The experimental results show that the scheme achieves autonomous decision-making for service assurance and efficiently optimizes network performance through AI-assisted intent-based traffic grooming.
Article
Deep learning technologies have been widely exploited to predict mobile traffic. However, individually training deep learning models for various traffic prediction tasks is not only time consuming but also unrealistic, sometimes due to limited traffic records. In this article, we propose a novel deep meta-learning based mobile traffic prediction framework, namely, dmTP, which can adaptively learn to learn the proper prediction model for each distinct prediction task from accumulated meta-knowledge of previously learned prediction tasks. In dmTP, we regard each mobile traffic prediction task as a base-task and adopt an LSTM network with a fixed structure as the base-learner for each base-task. In order to improve the base-learner's prediction accuracy and learning efficiency, we further employ an MLP as the meta-learner to find the optimal hyper-parameter value and initial training status for the base-learner of a new base-task according to its meta-features. Extensive experiments with real-world datasets demonstrate that while guaranteeing a similar or even better prediction accuracy, meta-learning in the proposed dmTP reduces the numbers of epochs and base-samples needed to train the base-learners by around 75 percent and 81 percent, respectively, as compared with the existing prediction models.
Article
The extremely high number of services with large bandwidth requirements and the increasingly dynamic traffic patterns of cell sites pose major challenges to optical fronthaul networks, rendering them incapable of coping with the extensive, uneven, and real-time traffic that will be generated in the future. In this paper, we first present the design of an adaptive graph convolutional network with gated recurrent unit (AGCN-GRU) to learn the temporal and spatial dependencies of the traffic patterns of cell sites and provide accurate traffic predictions, in which the AGCN model can capture potential spatial relations according to the similarity of network traffic patterns in different areas. Then, we consider how to deal with unpredicted burst traffic and propose an AI-assisted intent-based traffic grooming scheme to realise automatic and intelligent cell site clustering and traffic grooming. Finally, a software-defined testbed for a 5G optical fronthaul network was established, on which the proposed schemes were deployed and evaluated using traffic datasets of existing optical networks. The experimental results showed that the proposed scheme can optimize network resource allocation, increase the average effective resource utilization, and reduce the average delay and the rejection ratio.
Chapter
This paper implements a long short-term memory (LSTM) network to predict hotspot parameters in the traffic density of cellular networks. The traffic density depends on numerous factors such as time, location, and the number of connected mobile users, and it exhibits spatial and temporal relationships. However, only certain regions have higher data rates, known as hotspots. A hotspot is defined as a circular region with a particular centre and radius where the traffic density is the highest compared to other regions at a given timestamp. Forecasting traffic density is very important, especially in urban areas. Prediction of hotspots using LSTM would result in better resource allocation, beamforming, handovers, and so on. We propose two methods, namely the log likelihood ratio (LLR) method and the cumulative distribution function (CDF) method, to compute the hotspot parameters. On comparing the performance of the two methods, it can be concluded that the CDF method is more efficient and less computationally complex than the LLR method.
Article
Understanding mobile data traffic and forecasting future traffic trends is beneficial to wireless carriers and service providers who need to perform resource allocation and energy saving management. However, predicting wireless traffic accurately at large scale and fine granularity is particularly challenging due to the following two factors: the spatial correlations between network units (i.e., cell towers or access points) introduced by arbitrary user movements, and the time-evolving nature of user movements, which frequently changes with time. In this paper, we use a time-evolving graph to formulate the time-evolving nature of user movements, and propose a model, Graph-based Temporal Convolutional Network (GTCN), to predict the future traffic of each network unit in a wireless network. GTCN brings significant benefits in two aspects. (1) GTCN can effectively learn intra- and inter-time spatial correlations between network units in a time-evolving graph through a node aggregation method. (2) GTCN can efficiently model the temporal dynamics of the mobile traffic trend from different network units through a temporal convolutional layer. Experimental results on two real-world datasets demonstrate the efficiency and efficacy of our method. Compared with state-of-the-art methods, the improvement in the prediction performance of our GTCN is 3.2% to 10.2% for different prediction horizons. GTCN also achieves 8.4× faster prediction time.
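A minimal sketch of a causal temporal convolution over per-node traffic sequences, illustrating only the temporal-layer idea mentioned above, not GTCN's graph component; node count, window length, and channel sizes are assumptions:

# Causal 1D convolution over each network unit's traffic history.
import torch
import torch.nn as nn

nodes, window = 10, 24
x = torch.rand(nodes, 1, window)                      # one traffic channel per network unit

tcn = nn.Conv1d(1, 8, kernel_size=3, padding=2)       # pad left/right, then trim the right side
h = tcn(x)[:, :, :window]                             # each output step sees only current and past inputs
print(h.shape)                                        # torch.Size([10, 8, 24])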
Article
Full-text available
Traffic prediction plays an important role in evaluating the performance of telecommunication networks and attracts intense research interest. A significant number of algorithms and models have been proposed to learn knowledge from traffic data and improve the prediction accuracy. In the recent big data era, the relevant research enthusiasm remains, and deep learning has been exploited to extract the useful information in depth. In particular, Long Short-Term Memory (LSTM), one kind of Recurrent Neural Network (RNN) scheme, has attracted significant attention due to the long-range dependencies embedded in sequential traffic data. However, LSTM has considerable computational cost, which cannot be tolerated in tasks with stringent latency requirements. In this paper, we propose a deep learning model based on LSTM, called Random Connectivity LSTM (RCLSTM). Compared to the conventional LSTM, RCLSTM makes a significant change in the formation of the neural network architecture, whose connectivity is determined in a stochastic manner rather than being fully connected. As a result, the neural network in RCLSTM can exhibit a certain sparsity, which means many neural connections are absent (in contrast to full connectivity), so the number of parameters to be trained is reduced and much fewer computations are required. We apply the RCLSTM solution to predict traffic and validate that the RCLSTM with even 35% neural connectivity still shows a strong capability in traffic prediction. Also, as the number of training samples increases, the performance of RCLSTM becomes closer to that of the conventional LSTM. Moreover, RCLSTM exhibits even better prediction accuracy than the conventional LSTM when the length of the input traffic sequences increases.
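The sparse-connectivity idea behind RCLSTM can be illustrated by randomly masking an LSTM's weight matrices, as in the sketch below (an illustration of the concept at 35% connectivity, not the authors' implementation):

# Keep only a random 35% of the LSTM's input-to-hidden and hidden-to-hidden weights.
import torch
import torch.nn as nn

torch.manual_seed(0)
lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)

connectivity = 0.35
with torch.no_grad():
    for name, param in lstm.named_parameters():
        if "weight" in name:                              # weight_ih_l0 and weight_hh_l0
            mask = (torch.rand_like(param) < connectivity).float()
            param.mul_(mask)                              # absent connections are zeroed

# During training the same masks would have to be re-applied after each update
# so that pruned connections stay at zero.
out, _ = lstm(torch.rand(4, 24, 1))                       # 4 traffic sequences, 24 steps each
print(out.shape)                                          # torch.Size([4, 24, 64])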
Conference Paper
Full-text available
Achieving accurate, real-time, and spatially fine-grained population estimation for a metropolitan city is extremely valuable for a variety of applications. Previous solutions look at data generated by human activities, such as night-time lights and phone calls, for population estimation. However, these mechanisms cannot achieve both real-time and fine-grained population estimation because the data sampling rate is low and the chosen spatial granularity is improper. We address these two problems by leveraging a key insight: people frequently use data plans on cellphones and leave mobility signatures on cellular networks. Therefore, we are able to exploit these cellular signatures for real-time population estimation. Extracting population information from cellular data records is not easy, because the number of users recorded by a cellular tower is not equal to the population covered by the tower, and mobile users' behavior differs spatially and temporally, so a static estimation model does not work. We exploit context-aware city segmentation and a dynamic population estimation model to address these challenges. We show that the population estimation error is reduced by 22.5% on a cellular dataset that includes 1 million users.
Chapter
Full-text available
The predictability of network traffic is of significant interest in many domains, such as congestion control, admission control, and network management. An accurate traffic prediction model should have the ability to capture prominent traffic characteristics, such as long-range dependence (LRD) and self-similarity at large time scales, and multifractality at small time scales. In this paper we propose a new network traffic prediction model based on the non-linear time series ARIMA/GARCH. This model combines the linear ARIMA time series model with the non-linear GARCH model. We provide a parameter estimation procedure for our proposed ARIMA/GARCH model. We then evaluate a prediction scheme for our model. We show that our model can capture prominent traffic characteristics, not only at large time scales but also at small time scales. Compared with the existing FARIMA model, our model has better prediction accuracy.
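A hedged sketch of the two-stage ARIMA-plus-GARCH combination described above, using statsmodels and the arch package; the series and model orders are assumptions, not the chapter's parameter-estimation procedure:

# Fit a linear ARIMA mean model, then a GARCH volatility model on its residuals.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(1)
traffic = 50 + np.cumsum(rng.normal(0, 1, 500)) * 0.1 + rng.normal(0, 2, 500)

arima_fit = ARIMA(traffic, order=(1, 0, 1)).fit()                      # linear mean structure
garch_fit = arch_model(arima_fit.resid, vol="GARCH", p=1, q=1).fit(disp="off")  # nonlinear volatility

mean_forecast = arima_fit.forecast(steps=10)
var_forecast = garch_fit.forecast(horizon=10).variance.iloc[-1]
print(mean_forecast[:3], var_forecast.values[:3])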
Article
Full-text available
Theoretical results strongly suggest that in order to learn the kind of complicated functions that can represent high-level abstractions (e.g. in vision, language, and other AI-level tasks), one needs deep architectures. Deep architectures are composed of multiple levels of non-linear operations, such as in neural nets with many hidden layers or in complicated propositional formulae re-using many sub-formulae. Searching the parameter space of deep architectures is a difficult optimization task, but learning algorithms such as those for Deep Belief Networks have recently been proposed to tackle this problem with notable success, beating the state-of-the-art in certain areas. This paper discusses the motivations and principles regarding learning algorithms for deep architectures, in particular those exploiting as building blocks unsupervised learning of single-layer models such as Restricted Boltzmann Machines, used to construct deeper models such as Deep Belief Networks.
Article
Full-text available
The seasonal ARIMA model is a good traffic model capable of capturing the behavior of a network traffic stream. In this paper, we give a general expression of seasonal ARIMA models with two periodicities and provide procedures to model and predict traffic using seasonal ARIMA models. The experiments conducted in our feasibility study showed that seasonal ARIMA models can be used to model and predict actual wireless traffic, such as GSM traffic in China.
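A related sketch using statsmodels' SARIMAX with a single daily seasonality (the paper's second periodicity would require an extended formulation); the hourly series and the orders are assumptions:

# Seasonal ARIMA fit on a synthetic hourly traffic series with a 24-hour period.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
hours = np.arange(24 * 28)                                   # four weeks of hourly traffic
traffic = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 3, hours.size)

model = SARIMAX(traffic, order=(1, 0, 1), seasonal_order=(1, 1, 1, 24)).fit(disp=False)
print(model.forecast(steps=24)[:5])                          # first few hours of the next day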
Article
Full-text available
Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, LSTM can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units. Multiplicative gate units learn to open and close access to the constant error flow. LSTM is local in space and time; its computational complexity per time step and weight is O(1). Our experiments with artificial data involve local, distributed, real-valued, and noisy pattern representations. In comparisons with real-time recurrent learning, backpropagation through time, recurrent cascade correlation, Elman nets, and neural sequence chunking, LSTM leads to many more successful runs, and learns much faster. LSTM also solves complex, artificial long-time-lag tasks that have never been solved by previous recurrent network algorithms.
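The multiplicative gating described above can be made concrete with a single LSTM cell step, written here in the modern form that includes a forget gate (a later addition to the original 1997 design); weights are random and purely illustrative:

# One step of a standard LSTM cell, spelled out to show the gating and cell state.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(3)
n_in, n_hid = 4, 8
x, h_prev, c_prev = rng.normal(size=n_in), np.zeros(n_hid), np.zeros(n_hid)
W = {g: rng.normal(scale=0.1, size=(n_hid, n_in + n_hid)) for g in "ifoc"}
b = {g: np.zeros(n_hid) for g in "ifoc"}

z = np.concatenate([x, h_prev])
i = sigmoid(W["i"] @ z + b["i"])          # input gate: open/close writing to the cell
f = sigmoid(W["f"] @ z + b["f"])          # forget gate: keep or erase the cell state
o = sigmoid(W["o"] @ z + b["o"])          # output gate: expose the cell state
c_tilde = np.tanh(W["c"] @ z + b["c"])    # candidate cell update
c = f * c_prev + i * c_tilde              # constant-error-carousel style state update
h = o * np.tanh(c)
print(h.shape)                            # (8,)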
Article
Intelligent human detection based on WiFi is a technique that has recently attracted a significant amount of interest from research communities. The use of ubiquitous WiFi to detect the number of queuing persons can facilitate dynamic planning and appropriate service provisioning. In this article, we propose HFD, one of the first schemes to leverage WiFi signals to estimate the number of queuing persons by employing classifiers from machine learning in a device-free manner. In the proposed HFD scheme, we first utilize the sliding window method to filter and remove the outliers. We extract two characteristics, skewness and kurtosis, as the identification features. Then, we use the support vector machine (SVM) to classify these two features to estimate the number of people in the current queue. Finally, we combine our scheme with the latest Fresnel Zone model theory to determine whether someone is in or out, and thus dynamically adjust the detected value. We implement a proof-of-concept prototype upon commercial WiFi devices and evaluate it in both conference room and corridor scenarios. The experimental results show that the accuracy of our proposed HFD detection can be maintained at about 90 percent with high robustness.
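A hedged sketch of the feature-plus-SVM pipeline described above (windowed samples reduced to skewness and kurtosis, then classified), run on synthetic signals rather than the paper's WiFi data; the window length and class setup are assumptions:

# Skewness/kurtosis features per window, classified with an SVM.
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_windows, window_len, n_classes = 300, 128, 3          # e.g. different queue sizes

X, y = [], []
for label in range(n_classes):
    signals = rng.normal(0, 1 + 0.5 * label, size=(n_windows, window_len)) ** (label + 1)
    X.append(np.column_stack([skew(signals, axis=1), kurtosis(signals, axis=1)]))
    y.append(np.full(n_windows, label))
X, y = np.vstack(X), np.concatenate(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print(f"accuracy: {clf.score(X_te, y_te):.2f}")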
Article
Understanding mobile traffic patterns of large scale cellular towers in urban environment is extremely valuable for Internet service providers, mobile users, and government managers of modern metropolis. This paper aims at extracting and modeling the traffic patterns of large scale towers deployed in a metropolitan city. To achieve this goal, we need to address several challenges, including lack of appropriate tools for processing large scale traffic measurement data, unknown traffic patterns, as well as handling complicated factors of urban ecology and human behaviors that affect traffic patterns. Our core contribution is a powerful model which combines three dimensional information (time, locations of towers, and traffic frequency spectrum) to extract and model the traffic patterns of thousands of cellular towers. Our empirical analysis reveals the following important observations. First, only five basic time-domain traffic patterns exist among the 9600 cellular towers. Second, each of the extracted traffic pattern maps to one type of geographical locations related to urban ecology, including residential area, business district, transport, entertainment, and comprehensive area. Third, our frequency-domain traffic spectrum analysis suggests that the traffic of any tower among 9600 can be constructed using a linear combination of four primary components corresponding to human activity behaviors. We believe that the proposed traffic patterns extraction and modeling methodology, combined with the empirical analysis on the mobile traffic, pave the way toward a deep understanding of the traffic patterns of large scale cellular towers in modern metropolis.
Article
Understanding mobile big data, inherent within large-scale cellular towers in the urban environment, is extremely valuable for service providers, mobile users, and government managers of the modern metropolis. By extracting and modeling the mobile cellular data associated with over 9600 cellular towers deployed in a metropolitan city of China, this article aims to link cyberspace and the physical world with social ecology via such big data. We first extract a human mobility and cellular traffic consumption trace from the dataset, and then investigate human behavior in cyberspace and the physical world. Our analysis reveals that human mobility and the consumed mobile traffic have strong correlations, and both have distinct periodical patterns in the time domain. In addition, both human mobility and mobile traffic consumption are linked with social ecology, which in turn helps us to better understand human behavior. We believe that the proposed big data processing and modeling methodology, combined with the empirical analysis on mobile traffic, human mobility, and social ecology, paves the way toward a deep understanding of human behaviors in a large-scale metropolis.
Article
The development of LTE-Advanced and beyond cellular networks is expected to offer considerably higher data rates than the existing 3G networks. Among the many potential technologies in LTE-Advanced systems, users' characteristics and social behavior have been studied to improve the networks' performance. In this article we present the concept of user social pattern (USP), which characterizes the general user behavior, pattern, and rules of a group of users in a social manner, and utilize USP as an optimization basis for network performance enhancement. From large-scale traffic traces collected from current mobile cellular networks, the USP model is evaluated and verified. Furthermore, to evaluate the potential of spectral efficiency and energy efficiency enhancement based on USP in LTE-A HetNets, we establish a complete system and link-level HetNet simulation platform according to 3GPP LTE-A standards. Then, based on the platform, simulations are performed to evaluate the impact of USP on spectral and energy efficiency in an LTE-A network, and a USP-based spectral efficiency and energy efficiency enhancement scheme is proposed for a HetNet of the LTE-A system. Simulation results validate that USP can be used as an effective concept for network performance optimization in an LTE-A system.
Article
Accurate forecasting of inter-urban traffic flow has been one of the most important issues globally in research on road traffic congestion. Because inter-urban traffic information presents a challenging situation, traffic flow forecasting involves a rather complex nonlinear data pattern; in particular, during daily peak periods, traffic flow data reveal a cyclic (seasonal) trend. In recent years, the support vector regression (SVR) model has been widely used to solve nonlinear regression and time series problems. However, the application of SVR models to cyclic (seasonal) trend time series has not been widely explored. This investigation presents a traffic flow forecasting model that combines the seasonal support vector regression model with a chaotic immune algorithm (SSVRCIA) to forecast inter-urban traffic flow. Additionally, a numerical example of traffic flow values from northern Taiwan is used to elucidate the forecasting performance of the proposed SSVRCIA model. The forecasting results indicate that the proposed model yields more accurate forecasts than the seasonal autoregressive integrated moving average, back-propagation neural network, and seasonal Holt-Winters models. Therefore, the SSVRCIA model is a promising alternative for forecasting traffic flow.
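A hedged sketch of support vector regression on seasonally lagged traffic features, loosely in the spirit of the seasonal SVR above; plain grid search stands in for the chaotic immune algorithm, and the synthetic data and lags are assumptions:

# SVR over "previous hour" and "same hour yesterday" features of a synthetic hourly flow.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(5)
t = np.arange(24 * 60)
flow = 200 + 50 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 10, t.size)

season = 24
X = np.column_stack([flow[season - 1:-1], flow[:-season]])   # previous hour, same hour yesterday
y = flow[season:]

search = GridSearchCV(SVR(kernel="rbf"), {"C": [1, 10, 100], "epsilon": [0.1, 1.0]}, cv=3)
search.fit(X, y)
print(search.best_params_, f"R^2 = {search.best_score_:.3f}")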
Article
It is envisioned that home networks will shift from current machine-to-human communications to the machine-to-machine paradigm with the rapid penetration of embedded devices in home surroundings. In this article, we first identify the fundamental challenges in home M2M networks. Then we present the architecture of home M2M networks decomposed into three subareas depending on the radio service ranges and potential applications. Finally, we focus on QoS management in home M2M networks, considering the increasing number of multimedia devices and growing visual requirements in a home area. Three standards for multimedia sharing and their QoS architectures are outlined. Cross-layer joint admission and rate control design is reported for QoS-aware multimedia sharing. This proposed strategy is aware of the QoS requirements and resilience of multimedia services. Illustrative results indicate that the joint design is able to intelligently allocate radio bandwidth based on QoS demands in resource-constrained home M2M networks.
Ming Zeng received her B.E. and Ph.D. degrees in communication and information systems from the University of Electronic Science and Technology of China, Chengdu, in 2009 and 2016, respectively. She is currently a postdoctoral researcher at Tsinghua University. Her research interests include data-driven network optimization, intelligent transportation systems, fog computing, and spatial-temporal data mining.