Article

Real-World Evaluation of Power Consumption and Performance of NB-IoT in Malaysia


Abstract and Figures

Narrowband Internet of Things (NB-IoT) is expected to lead the way in wireless access technologies and support major 5G-based IoT applications in the future. Since NB-IoT has yet to be fully deployed globally, only limited real-world performance evaluations are available, particularly for outdoor performance. Therefore, this study expanded on previous studies and comprehensively evaluated the real-world performance of NB-IoT in terms of coverage parameters, path loss (PL), packet delivery rate (PDR), and latency limits. Numerical and detailed analyses were also performed to calculate the power consumption and estimate the average battery life of NB-IoT, Sigfox, and LoRaWAN in relation to the various factors affecting power consumption, and the three technologies were then compared in terms of power consumption. An overall PDR of 91.76% was achieved, indicating that NB-IoT can handle high data rates even with minimal signal quality. NB-IoT performance was also found to correspond with theoretical assumptions on coverage and maximum range. However, latency was found to vary greatly depending on signal quality, resulting in critical degradation of battery life efficiency. Although this study found that immense power savings could be achieved by applying critical power management strategies, achieving a long battery life for NB-IoT was not simple. LoRaWAN and Sigfox proved to be more battery efficient than NB-IoT: in a one-packet-per-day delivery scenario, LoRaWAN, Sigfox, and NB-IoT were found to have average battery lives of 1608.9 days, 1527.6 days, and 344.9 days, respectively.
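Battery-life estimates of this kind follow from simple duty-cycle arithmetic: a time-weighted average current over one reporting period, divided into the battery capacity. The sketch below shows the general form of such an estimate; the current, timing, and capacity figures are illustrative assumptions, not the paper's measured values.

```python
def avg_current_ma(i_tx_ma, t_tx_s, i_sleep_ma, period_s):
    """Time-weighted average current for one transmission per reporting period."""
    t_sleep_s = period_s - t_tx_s
    return (i_tx_ma * t_tx_s + i_sleep_ma * t_sleep_s) / period_s

def battery_life_days(capacity_mah, i_avg_ma):
    """Average battery life in days for a given average current draw."""
    return capacity_mah / i_avg_ma / 24.0

# Illustrative figures only: one packet per day (86400 s), a 2400 mAh battery,
# 220 mA during a 5 s transmit/receive burst, 5 uA in deep sleep (e.g., PSM)
i_avg = avg_current_ma(i_tx_ma=220.0, t_tx_s=5.0, i_sleep_ma=0.005, period_s=86400.0)
life = battery_life_days(2400.0, i_avg)
```

In practice the average current also depends on retransmissions and on how long the radio stays in connected mode, which is why the study finds latency (and hence signal quality) degrading battery life.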


... These models encompassed a baseline FSPL model and a Close-In (CI) form LNSPL model, often referred to as a one-slope model. Accordingly, PL (in dB) was initially calculated from the observed received power (in dBm) using (1) [22]: ...
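The two baseline models mentioned here can be written compactly. A minimal sketch, assuming the standard free-space constant for distance in metres and frequency in MHz (function names are ours, not the paper's):

```python
import math

def fspl_db(d_m, f_mhz):
    """Free-space path loss in dB for distance in metres, frequency in MHz."""
    return 20.0 * math.log10(d_m) + 20.0 * math.log10(f_mhz) - 27.55

def ci_pl_db(d_m, f_mhz, n, d0_m=1.0):
    """Close-In (one-slope) model: FSPL at the reference distance d0
    plus a single path-loss exponent n over log-distance."""
    return fspl_db(d0_m, f_mhz) + 10.0 * n * math.log10(d_m / d0_m)
```

With n = 2 the CI model reduces to free space; measured environments typically fit larger exponents.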
... Such an approach thereby takes advantage of the fact that multiple ML algorithms can have varying abilities for solving the same problem [37]. As such, compared to the popular LNSPL model presented in (7), the proposed models can be represented as given below in (20), (21), (22), and (23): ...
... The first term, common to all models and denoted as term 1 in Fig. 19, is basically the FSPL, in dB, as a function of frequency, in MHz, at a reference distance of 1 m, calculated as per (3). The second term, denoted as term 2 in Fig. 19, represents the response variable, in dB, of either a Stepwise linear regression ML model, in (21) and (22), or a linear SVM ML model, in (23) and (24), that models the actual PL deducted from the first term, as depicted in Fig. 19. Finally, the third term, denoted as term 3 in Fig. 19, represents the response variable, in dB, of either an EBT ML model, in (21) and (23), or an ANN ML model, in (22) and (24), that models the shadowing. ...
Article
Full-text available
The rapid growth of the Internet of Things (IoT) has led to the emergence of Low Power Wide Area Networks (LPWANs) to connect devices in a smart future world. However, challenges such as wireless channel propagation imperfections lead to signal power loss, higher retransmissions, and service quality degradation, ultimately limiting the efficiency of IoT implementation. This study investigates Long-Range (LoRa) IoT networks wireless channel propagation characterization in tropical environments using hybrid Machine Learning (ML) techniques. It specifically addresses the limitations of various established Path Loss (PL) models, such as Longley-Rice Irregular Terrain Model (ITM), through comprehensive evaluations across diverse scenarios, demonstrating their inadequacies in equatorial regions. Accordingly, four unique three-stage stacked ML-based semi-empirical PL models for LoRa communication were proposed. These models explore the combination of a Free Space PL (FSPL) with a Stepwise (STW) or linear Support Vector Machine (SVM) ML model, which is then integrated with an Ensemble of Bagged Trees (EBT) or Artificial Neural Network (ANN) ML model. With prediction accuracies reaching up to 89% for testing datasets, the FSPL-STW-EBT model showed the best performance, followed by the FSPL-SVM-EBT method. The latter two models showed the highest prediction accuracy for rural and suburban areas, 87% to 96%, across various datasets and categories. Additionally, these models achieved up to 95% accuracy in measurements taken on lake water surfaces. The study further emphasizes the balance between computational efficiency and predictive accuracy of these models, enhancing their suitability for real-time IoT applications in challenging environments. These findings highlight the potential of hybrid ML-based models to outperform conventional PL models and optimize IoT network design, planning, and deployment in tropical regions.
... IoT has caught academic and industry interest for the past few years [16] due to the exponential rise of connected devices and the need for new or optimized methods to manage many connected devices [13]. As a result, the number of connected devices nowadays is estimated at between 26 and 50 billion [17]–[21]. This trend is anticipated to accelerate further, reaching around 75 to 100 billion connected devices by 2025 [13], [16], [21]. ...
... Finally, as a result of the crucial IoT application requirements [13], [20], Low Power Wide Area Networks (LPWANs) are emerging as an exciting new trend in the growth of wireless communication systems. Many LPWAN developments, such as Sigfox, LoRaWAN, and NB-IoT, have lately emerged in both unlicensed and licensed spectrum, becoming some of the leading novel technologies, with numerous technological differences among them [16], [20], [21]. Their main features include large coverage-area support and massive-scale networking with low-cost, long-life, restricted-data-rate EDs [13], [16], [18]. ...
Article
Full-text available
The Internet of Things (IoT) has rapidly expanded for a wide range of applications towards a smart future world by connecting everything. As a result, new challenges emerge in meeting the requirements of IoT applications while retaining optimal performance. These challenges may include power consumption, quality of service, localization, security, and accurate modeling and characterization of wireless channel propagation. Among these challenges, the latter is critical to establishing point-to-point wireless communication between the sensors. Channel modeling also varies depending on the features of the surrounding area, which have a direct impact on the propagation of wireless signals. This presents a difficult task for network planners to efficiently design and deploy IoT applications without understanding the appropriate channel model to analyze coverage and predict optimal deployment configurations. As a result, this challenge has attracted considerable interest in academic and industrial communities in recent years. Therefore, this review presents an overview of current breakthroughs in wireless IoT technologies. The challenges in such applications are then briefly reviewed, focusing on wireless channel propagation modeling and characterization. Finally, the study gives a generalized form of commonly used channel models and a summary of recent channel modeling developments for wireless IoT technology. The outcome of this review is expected to provide a new understanding of the propagation behavior of present and future wireless IoT technologies, allowing network engineers to undertake correct planning and deployment in any environment. Additionally, the study may serve as a guideline for future channel modeling and characterization studies.
... Several studies [3,4,5] have been conducted to evaluate the performance of LoRa and NB-IoT in different environments, such as urban, suburban, and rural areas. Results have shown that both technologies perform well in urban and suburban areas, but their performance decreases in rural areas due to the lack of infrastructure and increased path loss. ...
... In [3], the results show that while NB-IoT provides better coverage in regions with low density, LoRa provides slightly better coverage in mainly urban areas with high density due to its ability to handle handovers and repeated messages in LoRa. In [4], NB-IoT performance was evaluated in terms of coverage parameters and power consumption, and compared with LoRa and Sigfox. LoRa was found to have better battery life performance. ...
Conference Paper
Full-text available
In this paper, we present a comprehensive evaluation of two prominent low-power wide-area network (LPWAN) technologies, Long Range (LoRa) and Narrowband Internet-of-Things (NB-IoT), which are widely used in the Internet-of-Things (IoT) sector. We investigate their performance under challenging conditions, specifically in a scenario where the signal is subject to non-line-of-sight (NLOS) reception caused by signal diffraction. Additionally, we analyze the potential application challenges and use cases for each technology and provide insight into which technology is more suitable for specific scenarios. Our findings aim to inspire future researchers and manufacturers in the field of LPWAN and IoT.
... The network layer can be composed of a single IoT node up to multiple IoT devices. The transmission medium can be wireless or wired, and the technology can be fiber optic [64], 5G [65], WiFi [66], Bluetooth [67], LoRa [68], Sigfox [69], or NB-IoT [70]. In a WSN where coverage is important, a single node cannot service a wide area, so multiple nodes are needed. ...
... Fiber Optic [64], 5G [65], WiFi [66], Bluetooth [67], LoRa [68], Sigfox [69], etc. [70]. ...
Article
Full-text available
In this modern age, Internet of Things (IoT) and Wireless Sensor Network (WSN) as its derivatives have become one of the most popular and important technological advancements. In IoT, all things and services in the real world are digitalized and it continues to grow exponentially every year. This growth in number of IoT device in the end has created a tremendous amount of data and new data services such as big data systems. These new technologies can be managed to produce additional value to the existing business model. It also can provide a forecasting service and is capable to produce decision-making support using computational intelligence methods. In this survey paper, we provide detailed research activities concerning Computational Intelligence methods application in IoT WSN. To build a good understanding, in this paper we also present various challenges and issues for Computational Intelligence in IoT WSN. In the last presentation, we discuss the future direction of Computational Intelligence applications in IoT WSN such as Self-Organizing Network (dynamic network) concept.
... A comprehensive overview of LPWAN technologies is presented in [33], where the authors conduct a real-world performance evaluation based on coverage parameters, path loss, packet delivery rate, latency limits, and power consumption. The study highlights that LoRaWAN and Sigfox technologies are more battery-efficient compared to NB-IoT. ...
Article
Full-text available
In this study, we propose a method to estimate energy consumption in battery-powered Narrowband Internet of Things (NB-IoT) devices using the statistical data available from the NB-IoT modem, thereby circumventing the need for additional circuitry to measure battery voltage or current consumption. A custom edge node with an NB-IoT module and onboard current measurement circuit was developed and utilized to generate a labeled dataset. Each data point, generated upon UDP packet transmission, includes metadata such as radio channel quality parameters, temporal parameters (TX and RX time), transmission and reception power, and coverage extension mode. Feature selection through variance and correlation analysis revealed that coverage extension mode and temporal parameters correlate significantly with the energy consumption. Using these features, we tested 11 machine learning models for energy consumption estimation, assessing their performance and memory footprint, both of which are critical factors for resource-constrained embedded systems. Our best models achieved up to a 93.8% fit with measured values, with memory footprints below 100 KB, some as low as 3 KB. This approach offers a practical solution for energy consumption estimation in NB-IoT devices without hardware modifications, thereby enabling energy-aware device management.
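The strong correlation between temporal parameters and energy reported here reflects a simple first-order physical model (energy is roughly power times time per radio state), which the ML estimators then refine. A crude baseline sketch under that assumption (ours, not the paper's model):

```python
def baseline_energy_mj(tx_time_s, rx_time_s, tx_power_mw, rx_power_mw,
                       idle_energy_mj=0.0):
    """First-order energy per transmission cycle: power x time per radio
    state (mW x s = mJ), plus a fixed idle/overhead term."""
    return tx_power_mw * tx_time_s + rx_power_mw * rx_time_s + idle_energy_mj
```

A trained model effectively learns corrections to this baseline from features such as coverage extension mode and channel quality.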
... However, NB-IoT had significant variations in the communication delay based on signal strength, which impacted the device's battery life. This study highlighted that there are trade-offs between connectivity and power efficiency in NB-IoT deployments, so signal quality can directly affect the operation of an IoT application (Alobaidy et al., 2022). ...
Article
The expansion of the Internet of Things (IoT) has created a need for reliable and fault-tolerant communication networks. However, ensuring consistent signal quality and power efficiency has proven to be challenging. This study evaluated the performance of Narrowband IoT (NB-IoT) communications using the SIM7020 module connected to a Raspberry Pi 4 Model B, focusing on signal quality across indoor, outdoor, urban and rural areas. Supervised machine learning for indoor localisation based on Received Signal Strength Indicator (RSSI) has been introduced, for example, to enhance NB-IoT performance. However, this and other approaches have encountered difficulties in mobile and obstructed environments, including signal attenuation, connectivity variability and increased power consumption. The objective of this study was to analyse NB-IoT signal strength and power consumption, providing guidance for deploying real-time communication IoT applications. Empirical data was analysed to understand the RSSI and Cellular Signal Quality (CSQ) in different locations. Signal quality in urban and outdoor environments was prone to fluctuations due to mobility and interference, whereas rural areas had weaker but more consistent signals. Indoor environments suffered from significant signal attenuation. The results emphasise the importance of improved handover mechanisms and adaptive deployment strategies to ensure reliable connectivity across various IoT applications.
... Cloud computing has improved processing power and data storage capacity through power and battery management systems based on cloud computing technology. Because the Internet of Things (IoT) relies heavily on control-center infrastructure for computation and communications of power-device and battery data, a serverless cloud-based architecture is implemented to ensure continuity of operation regardless of local infrastructure availability and accessibility (Sharma et al., 2020; Alobaidy et al., 2021; Behjati et al., 2021; Ali and Mohammad, 2019). This study attempts to bridge the gaps related to power and battery management as well as the role of cloud computing in managing battery charge and discharge. ...
Article
Full-text available
A cloud computing-based power optimization system (CC-POS) is an important enabler for hybrid renewable-based power systems with higher output, optimal solutions to extend battery storage life, and remotely flexible power distribution control. Recent advancements in cloud computing have begun to deliver critical insights, resulting in adaptive-based control of storage systems with improved performance. This study aims to review the recently published literature on the topic of power management systems and battery charging control. The role of intelligent based cloud computing is to improve the battery life and manage the battery state of charge (SoC). To achieve this purpose, publishers' databases and search engines were used to obtain the studies reviewed in this paper. We identify and review the purpose, achievements, tools/algorithm, and recommendations for each survey work, and thus outline a number of key findings and future directions. Furthermore, the review includes a listing of novels and recently used algorithms. Additionally, a critical review of 174 research articles were analyzed as per Web of Science (WoS) and Scopus database. The key findings of this study are discussed in two key conceptual frameworks that contain a power optimization system and an optimal battery management system. The power transmission, distribution, and charge and discharge processes are controlled and stored on cloud computing using the power mix between hybrid renewable energy and other power sources.
... The suggested method conserves more hardware resources and reduces the worst-case disturbance's impact on the channel estimation inaccuracy under uncertain conditions [11]. The performance-complexity trade-off for this channel estimator with interference detection architecture is favorable (Fig. 3) [12]. ...
Preprint
Full-text available
The evaluation of the wireless channel linking a transmitter and a receiver is essential in many applications. In receiver design, channel estimation is a fundamental and difficult problem. A recent mathematical theory called compressive sensing (CS) makes use of the idea of randomness in the signal to recover sparse or compressible signals. This research work also proposes an efficient channel estimator with interference detection architecture for the Multiple Input Multiple Output-Orthogonal Frequency Division Multiplexing (MIMO-OFDM) method. The proposed interference-finding architecture efficiently detects and removes the interference between received signals with low power consumption and current consumption and minimum hardware requirements. The power consumption and latency for Virtex family devices are 19 mW and 9 ns, respectively; for the channel estimator with interference detection architecture on CPLD family devices, they are 21.97 and 10.32, respectively.
... Not all parts of the E-UTRA specification are implemented either. Thanks to this aspect, the receiver is simpler and therefore less energy-demanding and lower in price [90][91][92]. ...
Article
Full-text available
Geothermal energy installations are becoming increasingly common in new city developments and renovations. With a broad range of technological applications and improvements in this field, the demand for suitable monitoring technologies and control processes for geothermal energy installations is also growing. This article identifies opportunities for the future development and deployment of IoT sensors applied to geothermal energy installations. The first part of the survey describes the technologies and applications of various sensor types. Sensors that monitor temperature, flow rate and other mechanical parameters are presented with a technological background and their potential applications. The second part of the article surveys Internet-of-Things (IoT), communication technology and cloud solutions applicable to geothermal energy monitoring, with a focus on IoT node designs, data transmission technologies and cloud services. Energy harvesting technologies and edge computing methods are also reviewed. The survey concludes with a discussion of research challenges and an outline of new areas of application for monitoring geothermal installations and innovating technologies to produce IoT sensor solutions.
... It consists of 10 transceiver BS antennas. Putrajaya is a unique region with multiple environmental categories [39], [40]. Meanwhile, for final trained RF model testing and evaluation purposes, 12 transceiver BSs located inside and outside of Putrajaya were identified (refer to Table III). ...
Article
Full-text available
Nowadays, the dependency on high-performance digital mobile connectivity is not limited to human usage but also the intelligent objects increasingly deployed to serve the needs of Internet of Things (IoT) applications. However, the current network planning technique limitation has constrained the real potential of mobile digital connectivity development. This situation has hindered sustainable Internet-oriented economic and technological development. The 3rd Generation Partnership Project (3GPP), through its specification release 18 (Rel.18), has included and leveraged the potential capabilities of machine learning (ML) technologies in advanced mobile network planning. The main objective is to enhance mobile network planning performance and reduce complexity. To materialize this aim, we propose a novel ML-based Online coverage Estimator (MLOE) tool developed based on the Random Forest (RF) ML algorithm. It uses seven unique features to predict the mobile network performance through reference signal received power (RSRP). Accordingly, the results showed that MLOE outperformed traditional empirical techniques and previous works. The final trained RF algorithm has achieved an outstanding root mean square error (RMSE) of 2.65 dB and a coefficient of determination (R²) of 0.93. With the dynamic and fast-growing mobile technology, MLOE has been deployed on an online platform using MATLAB® Web App Server, which offers a modular and scalable architecture.
... Further, a backup mechanism is needed to restore data when faults occur [2,4]. Despite these problems, the NB-IoT will continue to serve several applications under 5G communication [44,46]. ...
Article
Full-text available
This paper proposes a three-computing-layer architecture consisting of Edge, Fog, and Cloud for remote health vital signs monitoring. The novelty of this architecture is in using the Narrow-Band IoT (NB-IoT) for communicating with a large number of devices and covering large areas with minimum power consumption. Additionally, the architecture reduces the communication delay as the edge layer serves the health terminal devices with initial decisions and prioritizes data transmission for minimizing congestion on base stations. The paper also investigates different authentication protocols for improving security while maintaining low computation and transmission time. For data analysis, different machine learning algorithms, such as decision tree, support vector machines, and logistic regression, are used on the three layers. The proposed architecture is evaluated using CloudSim, iFogSim, and ns3-NB-IoT on real data consisting of medical vital signs. The results show that the proposed architecture reduces the NB-IoT delay by 59.9%, the execution time by an average of 38.5%, and authentication time by 35.1% for a large number of devices. This paper concludes that the NB-IoT combined with edge, fog, and cloud computing can support efficient remote health monitoring for large devices and large areas.
... Some works have focused on calculating the power consumption of NB-IoT modules from different vendors on different deployed networks. Alobaidy et al. [14] performed the experiment using Pycom FiPy as NB-IoT UE and measured its power consumption on MAXIS, which is a mobile network operator in Malaysia. Khan et al. [15] evaluated the power consumption of two different evaluation boards of Quectel BG96 LPWAN module from Avnet Silica NB-IoT sensor shield and Quectel UMTS & LTE EVB Kit. ...
Article
Full-text available
The Internet of Things (IoT) is being deployed to provide smart solutions for buildings, logistics, hospitals, and many more. It is growing with billions of connected devices. However, with such tremendous growth, maintenance and support are the hidden burdens. The devices deployed for IoT generally have a light microcontroller, low-power, low memory, and lightweight software. The software, which includes firmware and applications, can be managed remotely via a wireless connection. This improves flexibility, installation time, accessibility, effectiveness, and cost. The firmware can be updated constantly to remove known bugs and improve the functionality of the device. This work presents an approach to update firmware over-the-air (OTA) for constrained IoT devices. We used Narrowband IoT (NB-IoT) as the wireless communication standard to communicate between the managing server and devices. NB-IoT is one of the most promising low power wide area (LPWA) network protocols that supports more than 50k devices within a cell using a licensed spectrum. This work is a proof of concept demonstrating the usage of NB-IoT to update firmware for constrained devices. We also calculated the overall power consumption and latency for different sizes of the firmware.
... The study dataset is constructed from a comprehensive measurement campaign conducted in the Federal Territory of Putrajaya, a planned city that serves as the administrative capital of Malaysia. The selection of Putrajaya was based on the following factors: (i) it has a unique town-planning architecture that combines multi-environment characteristics, i.e., high-rise and mid-rise buildings, single- and double-story terrace houses, lakes, densely vegetated parks, and open areas in one territory [6], [52], as shown in Fig. 2, and (ii) it is one of the testbed locations for 5G NR network deployment, besides Cyberjaya and several other major cities [53], [54]. However, at the time of this study, there was still no 5G NR mobile network officially operating for public consumers. ...
Article
Full-text available
The need for wider coverage and high-performance quality of mobile networks is critical due to the maturity of Internet penetration in today’s society. One of the primary drivers of this demand is the dramatic shift toward digitalization due to the Covid-19 pandemic impact. Meanwhile, the emergence of the 5G wireless standard and the increasingly complex actual operating environment of mobile networks make the traditional prediction model less reliable. With the recent advancements and promising capabilities of machine learning (ML), it is seen as an alternative to the traditional approaches for ground-to-ground (G2G) mobile communication coverage prediction. In this study, various ML models have been tested and evaluated to develop an ML-based received signal strength prediction model for mobile networks. However, the challenge is to identify a practical ML model that can fulfill the computing speed criteria while still meeting the prediction accuracy. A total of six categories of ML models, namely Linear Regression (LR), Artificial Neural Network (ANN), Support Vector Machine (SVM), Regression Trees (RT), Ensembles of Trees (ET), and Gaussian Process Regression (GPR), comprising more than 20 types of established algorithms/kernels, have been tested and evaluated in this paper to identify the best contender among them in terms of speed and accuracy. Findings from the evaluation showed that the GPR model is the most accurate model for Reference Signal Received Power (RSRP) prediction in terms of RMSE and R², followed by ET, RT, SVM, ANN, and LR. Nevertheless, prediction speed and model training times are also important factors in determining the most practical model for RSRP prediction for several real-world mobile network planning applications.
Finally, the ET model with Random Forest (RF) algorithm has been selected and highly recommended as the most practically employed ML model for developing rigorous RSRP predictions model in multi-frequency bands and multi-environment. The developed prediction model is capable of being utilized for the network analysis and optimization.
Chapter
Internet of Things (IoT) has revolutionised our lives in many ways. Its current roles and applications in smart agriculture, smart homes, health care, environmental monitoring, industrial automation, transportation, and security/surveillance have positively enhanced the quality and comforts of our lives. Wireless communication plays a central role in ensuring the function of IoT. There are many challenges in developing the IoT. Inadequate key management, malicious attacks, unauthorised access, and device failure are security issues of concern. Power, especially off-grid power consumption, needs to be addressed. The high density of connected devices, limited bandwidth, and low data transmission can compromise the function of IoT. There is also the worry of a lack of trained workforce. Edge computing, Low-Power Wide Area Network (LPWAN), 5G networks, 6G networks, and mesh networks are advances that serve to enhance the capability of wireless communication. Future development of IoT in smart environment, health and wellness, agriculture sustainability, livestock management, and smart mobility in public transport are underway.
Conference Paper
Full-text available
Recently, Narrowband Internet of Things (NB-IoT) systems have gained significant focus as a promising direction for the massive-connectivity issues in forthcoming wireless communication systems. Thus, this paper investigates the challenge of maximizing connection density in NB-IoT networks, considering a downlink Non-Orthogonal Multiple Access (NOMA) scenario for simultaneous transmissions and efficient resource utilization. The base station assigns each device to one of the accessible Physical Resource Blocks (PRBs). This scheduling policy includes effective device clustering and optimal NOMA power assignment. We formulate this problem as finding a semi-matching over a bipartite graph derived from multiple PRBs and distinct constraints, i.e., power budget, admitted PRB, interference, and quality-of-service constraints. Furthermore, we compared our solutions, NOMA-Semi-Matching 1 (NOMA-SM1) and NOMA-Semi-Matching 2 (NOMA-SM2), with previous solutions. Numerical simulations confirm that the proposed semi-matching-aided approaches attain a better theoretical bound and superior performance.
Article
Wireless power transfer (WPT) has emerged to enhance the robustness of the energy harvesting Internet of Things (EH-IoT), whereas beamforming has been leveraged to significantly boost the efficiency of far-field WPT. Nevertheless, potential negative impacts due to high electromagnetic fields (EMF) exposure for radiation-susceptible users have not been thoroughly considered in the design of WPT for EH-IoT with IoT application-level requirements (e.g., coverage). In this paper, we explore the health-aware beamforming and IoT selection problem under the EH and human safety constraints. First, we formulate a new optimization problem, named Health-Aware Beamforming and IoT Selection (HABIS) and prove the NP-hardness. Second, we design an approximation algorithm, named Minimum Radiation Exposure and Maximum IoT Coverage (MREMIC), to exploit the EH-health dependency (EHHD) graph for properly addressing the trade-off between EH efficiency and the potential EMF radiation for the human body. We also discover the optimal health-aware beamforming to minimize the total radiation energy absorption of humans. Simulation results show that MREMIC can effectively charge IoT devices and significantly outperforms existing EH approaches by more than 200% regarding human safety.
Article
Parallel power load anomalies are processed by a fast density-peak clustering technique that capitalizes on the hybrid strengths of the Canopy and K-means algorithms, all within Apache Mahout’s distributed machine-learning environment. The study taps into Apache Hadoop’s robust tools for data storage and processing, including HDFS and MapReduce, to manage and analyze big-data challenges effectively. The preprocessing phase utilizes Canopy clustering to expedite the initial partitioning of data points, which are subsequently refined by K-means to enhance clustering performance. Experimental results confirm that incorporating Canopy as an initial step markedly reduces the computational effort needed to process the vast quantity of parallel power load anomalies. Data vectors are generated based on the time needed, sequential and parallel candidate feature data are obtained, and the data rate is combined. After classifying the time set using Canopy with the K-means algorithm and a factor-weighted vector representation, the clustering quality is assessed using purity, precision, recall, and F-measure. The results showed that using Canopy as a preprocessing step cuts both the time required to handle the large number of anomalies detected in parallel in a fast density-peak dataset and the running time of the K-means algorithm. Additionally, tests demonstrate that combining Canopy with the K-means algorithm performs consistently and dependably on the Hadoop platform, yielding a scalable and effective clustering solution for power system monitoring.
Article
Full-text available
5G and the Internet of Things (IoT) are a potent combination that offers a vast IoT infrastructure that can support billions of connected devices while maintaining reliability, affordability, and high-speed connectivity. Nevertheless, the integration of 5G-enabled IoT has received insufficient attention from security analysts, engineers, and researchers, resulting in a lack of information and viable solutions. This study investigates the benefits and issues associated with 5G-enabled IoT, including its privacy and security concerns as well as the technology drivers that enable its various layers. By incorporating 5G technology into IoT, several enhancements have been achieved, including enhanced reliability, simplicity, practicability, analysis, efficiency, agility, flexibility, and accessibility. To achieve the full potential of 5G for IoT, however, researchers must also address many research obstacles, such as designing the 5G-IoT architecture, managing committed machine interactions, and addressing security concerns. A promising strategy for overcoming these obstacles involves ensuring compatibility of operating systems with all devices, which will lead to the development of modern open-source standardization for smooth communication across diverse devices. This innovation will improve the security and privacy of 5G-enabled IoT devices by equipping their hardware with the intelligence to recognize, verify, and authorize processes with predetermined characteristics. This will provide authorized users with full autonomy and persistent connectivity, enhancing the overall user experience. However, 5G requires a thorough examination of its implications and difficulties. A safer and more efficient ecosystem for 5G-enabled IoT may be established by addressing these concerns and encouraging industry-wide collaboration.
Conference Paper
5G addresses the next major phase of mobile telecommunication standards beyond the current 4G standards. Following 1G, 2G, 3G, and 4G networks, it is now a new global wireless standard. 5G wireless technology is designed to deliver multi-gigabit-per-second (Gbps) peak data speeds, and peak downlink throughput is projected to be extremely high. The goal of this article is to identify, systematize, and generalize the major features of 5G technology. In the coming years, 5G networks will be widely deployed. Farmers might use real-time data to monitor, track, and automate their agricultural operations. At home and at the stadium, sports fans could have more influence over their viewing experience. With peak performance of 20 Gbps, 5G claims to be 20 times faster than 4G. It is assisting in the transformation of a variety of Industrial Internet of Things (IIoT) applications. For the first time, new experiences such as augmented reality will become a reality. 5G is a fifth-generation technology with a wide range of applications. By 2026, the number of 5G cell-phone users is predicted to surpass 3 billion. Many outdated devices will need to be replaced because they do not support 5G.
Conference Paper
Currently, the Chini Lake shores house around 500 indigenous people distributed across six villages with limited access to cellular communication coverage, mainly due to the challenging terrain profile and dense foliage. It is therefore important to establish a reliable line-of-sight (LoS) data transmission link via a low-altitude platform (LAP). A highly available but low-cost wireless communication infrastructure such as LoRa is the perfect solution in this scenario, allowing better coverage thanks to good propagation characteristics at lower frequency bands as well as the elevated platform. The solution shall also include a wireless machine-to-machine (M2M) network, sensor technologies, and a big-data analytics enablement platform. In addition, the behaviour of the wireless channel in Malaysia's tropical rural areas, where the propagated signal suffers from several imperfections such as attenuation, diffraction, scattering, and absorption due to the presence of various surrounding elements, is also being characterized. The outcome of this research is expected to offer a new understanding of the propagation behaviour of current and future wireless IoT technologies, helping network engineers perform accurate planning and deployment in rural environments. With this solution, the indigenous Orang Asli community in Chini Lake, Pahang, Malaysia will have access to digital content, water-level alerts for mitigation of flooding and drought situations, and Internet access for the promotion of local products and services.
Article
Full-text available
3GPP has recently introduced NB-IoT, a new mobile communication standard offering a robust and energy-efficient connectivity option to the rapidly expanding market of Internet of Things (IoT) devices. To unleash its full potential, end-devices are expected to work in a plug-and-play fashion, with zero or minimal configuration of parameters, while still exhibiting excellent energy efficiency. We performed the most comprehensive set of empirical measurements with commercial IoT devices and different operators to date, quantifying the impact of several parameters on energy consumption. Our findings prove that parameter settings do impact energy consumption, so proper configuration is necessary. We shed light on this aspect by first illustrating how the nominal standard operational modes map into real current-consumption patterns of NB-IoT devices. Further, we investigated which device-reported metadata metrics best reflected performance and implemented an algorithm to automatically identify device state in current time-series logs. We worked with two major western European operators to provide a measurement-driven analysis of the energy consumption and network performance of two popular NB-IoT boards under different parameter configurations. We observed that energy consumption is mostly affected by the paging interval in the Connected state, set by the base station. However, not all operators correctly implement such settings. Furthermore, under the default configuration, energy consumption is not strongly affected by packet size or by signal quality, unless the latter is extremely bad. Our observations indicate that simple modifications to the default parameter settings can yield great energy savings.
Article
Full-text available
Internet of Things (IoT)-based applications require integration with wireless communication technology to make application data readily available. In this paper, a modified meander-shape microstrip patch antenna is proposed for IoT applications in the 2.4 GHz ISM (Industrial, Scientific and Medical) band. The dimensions of the antenna are 40 × 10 × 1.6 mm³. The design comprises an inverse S-shape meander line connected to a slotted rectangular box. A capacitive load (C-load) and a parasitic patch with a shaped ground are applied to the design. Investigations show that the antenna designed with an inverse S-shape patch and a connecting rectangular box in the microstrip line has higher efficiency and gain compared to the conventional meander-shape antenna. The C-load is applied to the feed line to match the impedance. Moreover, parametric studies are carried out to investigate the flexibility of the antenna. Results show that the gain and efficiency can be improved by adjusting the rectangular box and applying the parasitic element and the shaped ground. The parasitic element has a strong impact on the antenna's fractional bandwidth of 12.5%. The finalized antenna has a peak gain of -0.256 dBi (measured) and 1.347 dBi (simulated) with 79% radiation efficiency at 2.4 GHz. To prove its efficiency and eligibility for IoT applications, the power delivered and received by the antenna at 2.4 GHz was measured and compared with the results of a dipole antenna. The antenna was integrated with a 2.4 GHz radio-frequency module and IoT sensors to validate its performance. The antenna's novelty lies in its compact size with high fractional bandwidth, validated in an IoT application environment.
Article
Full-text available
LoRa wireless technology is a revolutionary wireless network access technology with wide application prospects. An identification method for LoRa devices based on physical-layer fingerprinting is proposed to provide identities for authentication. Contrary to previous works, a differential constellation trace figure is established from the radio frequency (RF) fingerprinting features of LoRa devices, which transforms feature matching into image recognition. A classification method based on the Euclidean distance to the clustering centers of the LoRa signal is performed to analyze the differential constellation trace figure. The experimental results show that six LoRa transmission modules can be recognized accurately, and even in a low signal-to-noise ratio (SNR) environment, the different LoRa devices can still be distinguished and identified effectively.
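The centroid-distance classification step can be illustrated with a toy sketch; the device names and 2-D features below are invented stand-ins for the differential-constellation-trace features the paper extracts:

```python
import numpy as np

# Hypothetical 2-D "constellation trace" features per known device
# (in the paper these come from differential constellation trace figures)
train = {
    "dev_a": np.array([[0.0, 0.1], [0.1, 0.0]]),
    "dev_b": np.array([[1.0, 1.1], [1.1, 0.9]]),
}
# One clustering center (centroid) per enrolled device
centroids = {dev: pts.mean(axis=0) for dev, pts in train.items()}

def identify(feature):
    """Return the device whose cluster centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda d: np.linalg.norm(feature - centroids[d]))

print(identify(np.array([0.05, 0.05])))  # nearest to dev_a's centroid
```

In the real system the features are images of RF trace figures rather than 2-D points, but the decision rule — minimum Euclidean distance to a per-device cluster center — is the same.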
Article
Full-text available
NarrowBand IoT (NB-IoT) is emerging as a promising communication technology offering a reliable wireless connection to a large number of devices employed in pervasive monitoring scenarios, such as Smart City, Precision Agriculture, and Industry 4.0. Since most NB-IoT transmissions occur in the uplink, the random access channel (the primary interface between devices and the base station) may become the main bottleneck of the entire system. For this reason, analytical models and simulation tools able to investigate its behavior in different scenarios are of the utmost importance for driving current and future research activities. Unfortunately, the scientific literature only partially addresses the current open issues, by means of simplified and, in many cases, non-standard-compliant approaches. To provide a significant step forward in this direction, the contribution of this paper is threefold. First, it presents a flexible, open-source, and 3GPP-compliant implementation of the NB-IoT random access procedure. Second, it formulates an analytical model capturing both the collision and success probabilities associated with this procedure. Third, it presents the cross-validation of both the analytical model and the simulation tool, taking into account reference application scenarios of sensor networks enabling periodic reporting in monitoring infrastructures. The obtained results prove remarkable accuracy, demonstrating a well-calibrated instrument that will also be useful for future research activities.
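The success probability of the contention step can be approximated with a standard first-order model (not the paper's full 3GPP-compliant formulation): a tagged device succeeds if none of the other contenders picks the same preamble. The preamble count below is an assumed example value:

```python
def preamble_success_prob(m_devices, k_preambles=48):
    """First-order approximation: a tagged device's random-access attempt
    succeeds if none of the other m-1 contenders picks its preamble."""
    return (1 - 1 / k_preambles) ** (m_devices - 1)

for m in (2, 10, 50, 100):
    print(m, "devices ->", round(preamble_success_prob(m), 3))
```

This simple expression already shows why the random access channel becomes the bottleneck: success probability decays geometrically in the number of simultaneously contending devices.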
Article
Full-text available
Low-power wide-area (LPWA) communication has gained increasing attention in recent years with the rapid growth of fifth-generation evolution (5G), the Internet of Things (IoT), and mobile computing. Narrowband Internet of Things (NB-IoT) is one kind of LPWA technology based on cellular IoT, which supports massive connections, wide-area coverage, ultra-low power consumption, and ultra-low cost. Research on NB-IoT communication is increasingly attractive. Network calculus theory facilitates the performance analysis and network optimization of NB-IoT systems. We aim to analyze and optimize NB-IoT networks. In this paper, we construct a random-access traffic model including the NB-IoT user equipment (UE) arrival process and the eNB service process. Then, we utilize stochastic network calculus (SNC) to analyze the network delay in the NB-IoT traffic model. Random latency bounds for different arrival processes are derived. Simulations show that SNC can effectively evaluate the system delay under different distributions. For the condition in which numerous UEs access the network simultaneously following a Beta distribution, we first propose an improved K-means algorithm to cluster the NB-IoT terminals. Then, we propose a priority-based scheduling strategy, consisting of the priority-generation algorithm IPGNTQ and the NB-IoT task-scheduling algorithm SANTQ. Extensive experimental results verify that our proposed optimized strategy can effectively alleviate network congestion. Moreover, we compare our proposed optimized scheme with four existing uplink traffic-scheduling schemes, showing that ours outperforms all of them.
Article
Full-text available
This paper develops a machine-learning framework that evolves an optimal propagation model for the last mile with low-altitude platforms from existing propagation models. The existing propagation models reviewed exhibit both advantages and shortcomings with respect to a set of factors that affect performance across different terrains, i.e., path loss, elevation angle, altitude, coverage, power consumption, operational frequency, interference, and antenna type. A comparison of the predictions of the optimized and the existing models against this set of factors reveals that significant improvements are achieved with the optimal model.
Conference Paper
Full-text available
The Internet of Things (IoT) has become part of everyday life as well as of industry and society. One of the main concerns of mobile operators before implementing IoT in a mobile network is how much the IoT traffic will affect the performance of MBB users. In this paper, we present a simulation study that analyzes the impact of NB-IoT implementation on LTE MBB performance, based on a real network case, considering the current status as well as the projected future traffic growth in the network.
Article
Full-text available
The advancement of technologies over the years has poised IoT to scoop out untapped information and communication technology opportunities. It is anticipated that IoT will handle a gigantic network of billions of devices to deliver plenty of smart services to users. Undoubtedly, this will make our lives more resourceful, but at the cost of high energy consumption and carbon footprint. Consequently, there is a high demand for green communication to reduce energy consumption, which requires optimal resource availability and controlled power levels. In contrast, IoT devices are constrained in terms of resources: memory, power, and computation. Low Power Wide Area (LPWA) technology is a response to the need for efficient utilization of power, as it offers low-power connectivity to a huge number of devices spread over wide geographical areas at low cost. Various LPWA technologies, such as LoRa and SigFox, exist in the market, offering proficient solutions to users. However, to avoid the need for the new infrastructure (such as base stations) required by proprietary technologies, a new cellular-based licensed technology, Narrowband IoT (NB-IoT), was introduced by 3GPP in Rel-13. This technology is a good candidate to handle the LPWA market because of characteristics such as enhanced indoor coverage, low power consumption, latency insensitivity, and massive connection support. This survey presents a profound view of IoT and NB-IoT, subsuming their technical features, resource allocation, energy-efficiency techniques, and applications. The challenges that hinder NB-IoT's path to success are also identified and discussed. In this paper, two novel energy-efficient techniques, Zonal Thermal Pattern Analysis (ZTPA) and the Energy Efficient Adaptive Health Monitoring System (E2AHMS), are proposed towards green IoT.
Article
Full-text available
Wireless channel propagation characteristics and models are important to ensure the communication quality of wireless sensor networks in agriculture. Wireless channel attenuation experiments were carried out at different node antenna heights (0.8 m, 1.2 m, 1.6 m, and 2.0 m) in the tillering, jointing, and grain-filling stages of rice fields. We studied the path-loss variation trends at different transmission distances and analyzed the differences between estimated and measured path-loss values for a free-space model and a two-ray model. Regression analysis of the measured path-loss values was used to establish a one-slope log-distance model and to propose a modified two-slope log-distance model. The attenuation speed of wireless channel propagation in rice fields intensified with rice developmental stage, and the transmission range increased monotonically with antenna height. The relative error (RE) of estimation for the free-space model and the two-ray model at the four heights ranged from 6.48–15.49% and 2.09–13.51%, respectively; these two models were inadequate for estimating wireless channel path loss in rice fields. The estimated RE ranges for the one-slope and modified two-slope log-distance models during the three rice developmental stages were 2.40–2.25% and 1.89–1.31%, respectively, so these models had better applicability for modeling wireless channels in rice fields. The estimated RE values for the modified two-slope log-distance model were all less than 2%, improving on the one-slope log-distance model and validating that the modified two-slope model has the best applicability in a rice-field environment. These data provide a basis for channel modeling and the construction of wireless sensor networks in rice fields. Our results will aid in the design of effective rice-field WSNs and increase the transmission quality in rice-field sensor networks.
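The one-slope log-distance model used above, PL(d) = PL(d0) + 10·n·log10(d/d0), can be fitted by simple least squares; the sketch below uses synthetic measurements with an assumed exponent, not the paper's rice-field data:

```python
import numpy as np

rng = np.random.default_rng(1)

d0, pl_d0, true_n = 1.0, 40.0, 2.8   # reference distance (m), PL(d0) (dB), exponent
d = np.linspace(5, 100, 40)          # synthetic measurement distances (m)
measured = pl_d0 + 10 * true_n * np.log10(d / d0) + rng.normal(0, 1.0, d.size)

# Least-squares estimate of n in PL(d) = PL(d0) + 10*n*log10(d/d0):
# regress (measured - PL(d0)) on x = 10*log10(d/d0) through the origin
x = 10 * np.log10(d / d0)
n_hat = np.sum(x * (measured - pl_d0)) / np.sum(x * x)
print("fitted path-loss exponent:", round(n_hat, 2))
```

A two-slope variant repeats the same regression on each side of a breakpoint distance, which is what the modified model in the abstract does.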
Conference Paper
Full-text available
We explore the MAC-layer functionalities of three Low Power Wide Area Network (LPWAN) technologies (LoRa, Sigfox, and NB-IoT) dedicated to Smart City applications and the Internet of Things (IoT). These networks are dedicated to long-range (up to dozens of kilometers) and low-rate communication to ensure good autonomy of up to 10 years. The technical differences at the MAC layer between Sigfox, LoRa, and NB-IoT are explained and evaluated. We apply our simulator to evaluate the performance of these technologies in terms of Packet Error Rate.
Article
Full-text available
By 2020, more than 50 billion devices will be connected through radio communications. In conjunction with the rapid growth of the Internet of Things (IoT) market, low power wide area networks (LPWAN) have become a popular low-rate long-range radio communication technology. Sigfox, LoRa, and NB-IoT are the three leading LPWAN technologies that compete for large-scale IoT deployment. This paper provides a comprehensive and comparative study of these technologies, which serve as efficient solutions to connect smart, autonomous, and heterogeneous devices. We show that Sigfox and LoRa are advantageous in terms of battery lifetime, capacity, and cost. Meanwhile, NB-IoT offers benefits in terms of latency and quality of service. In addition, we analyze the IoT success factors of these LPWAN technologies, and we consider application scenarios and explain which technology is the best fit for each of these scenarios.
Article
Full-text available
IoT is rapidly expanding to all productive sectors; however, as thousands of devices are envisioned to be deployed, a careful study of the radio spectrum (and its interferences) must be conducted to predict and improve network performance. In this sense, radio-channel characterization is essential; in particular, path-loss characterization makes it possible to mathematically predict the propagation behavior and interferences of IoT devices with precision. In this work, we show that, by means of a set of mathematical corrections and measuring procedures, we can soundly employ the RSSI indicator, available in most IoT motes, to derive accurate path-loss models of industrial infrastructures. Experimental results reveal that the proposed scheme substantially improves the characterization accuracy of a real Smart Grid environment. This, in turn, is proven to have a measurable positive impact on the planning stage of an IoT network, whose lifetime and cost analysis can be better anticipated.
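Deriving path loss from RSSI rests on the basic link-budget identity PL = Ptx + Gtx + Grx − RSSI (before the paper's additional corrections); a minimal sketch with illustrative numbers:

```python
def path_loss_db(tx_power_dbm, rssi_dbm, tx_gain_dbi=0.0, rx_gain_dbi=0.0):
    """Link-budget identity: whatever was transmitted but not received
    (after antenna gains) is attributed to path loss."""
    return tx_power_dbm + tx_gain_dbi + rx_gain_dbi - rssi_dbm

# Example: 14 dBm transmitter, -97 dBm received -> 111 dB of path loss
print(path_loss_db(14.0, -97.0), "dB")
```

The corrections the paper proposes address the fact that mote RSSI readings are coarse and biased, so this identity is a starting point rather than a calibrated measurement.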
Article
Full-text available
In the recent period, fast ICT expansion and the rapid appearance of new technologies have raised the importance of fast and accurate planning and deployment of emerging communication technologies, especially wireless ones. This paper analyzes the possible usage of the Lee propagation model for the planning, design, and management of networks based on LoRa 868 MHz technology. LoRa is a wireless technology that can be deployed in various Internet of Things and Smart City scenarios in urban areas. The analyses are based on a comparison of field measurements with model calculations. Besides analyzing the usability of the Lee propagation model, possible optimization of the model is discussed as well. The research results can be used for accurate design, planning, and preparation of high-performance wireless resource management for various Internet of Things and Smart City applications in urban areas based on LoRa or similar wireless technologies. The equipment used for the measurements is based on open-source hardware.
Article
Full-text available
By 2020, more than twenty-five billion devices are expected to be connected through wireless communications. Given the rapid growth of the Internet of Things (IoT) market, Low Power Wide Area (LPWA) technologies have become popular. Among the various LPWA technologies, NarrowBand (NB)-IoT and Long Range (LoRa) are the two leading ones. In this paper, we provide a comprehensive survey on NB-IoT and LoRa as efficient solutions for connecting devices. It is shown that unlicensed LoRa has the advantage in terms of battery lifetime, capacity, and cost, while licensed NB-IoT offers benefits in terms of QoS, latency, reliability, and range.
Conference Paper
Full-text available
In addition to long battery life and low cost, coverage is one of the most critical performance metrics for low-power wide-area networks (LPWAN). In this work we study the coverage of the recently developed LoRa LPWAN technology via real-life measurements. The experiments were conducted in the city of Oulu, Finland, using commercially available equipment. The measurements were executed for the cases when a node located on ground (attached to the roof rack of a car) or on water (attached to the radio mast of a boat) was reporting its data to a base station. For a node operating in the 868 MHz ISM band using 14 dBm transmit power and the maximum spreading factor, we obtained a maximum communication range of over 15 km on ground and close to 30 km on water. Besides the actual measurements, we also present a channel attenuation model derived from the measurement data. The model can be used to estimate the path loss in the 868 MHz ISM band in an area similar to Oulu, Finland.
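A channel attenuation model of the form PL(d) = B + 10·n·log10(d) can be inverted to estimate the maximum range for a given link budget; the coefficients below are illustrative, not necessarily the paper's fitted values:

```python
def max_range_km(link_budget_db, b=128.95, n=2.32):
    """Invert PL(d_km) = b + 10*n*log10(d_km) at the maximum tolerable loss.
    b and n are illustrative log-distance coefficients for the 868 MHz band."""
    return 10 ** ((link_budget_db - b) / (10 * n))

# 14 dBm TX with roughly -137 dBm receiver sensitivity -> ~151 dB budget
print(round(max_range_km(151.0), 1), "km")
```

Ranges over water come out longer than this kind of estimate because the near-flat surface behaves more like free space, which matches the 15 km vs. 30 km gap the measurements report.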
Article
In this paper, we evaluated several network routing energy models for a smart-farm application, considering factors such as mobility, traffic size, and node count, using wireless ZigBee technology. The energy models considered are the generic, MICA, and ZigBee-compliant MICAz models. Wireless sensor network deployments under several scenarios are considered, taking into account commercial farm specifications with varyingly complex network deployment circumstances, to further understand the energy constraints and requirements of the smart-farm application. Several performance indicators, such as packet delivery ratio, throughput, jitter, and energy consumption, are evaluated and analysed. The simulation results show that both throughput and packet delivery ratio increase as node density is increased, indicating that smart-farm networks with higher node density have a superior Quality of Service (QoS) than networks with sparsely deployed nodes. It is also revealed that traffic from mobile nodes causes an increase in energy consumption, overall network throughput, average end-to-end delay, and average jitter, compared to static-node traffic. Based on the results obtained, the generic radio energy model consumed the highest total energy, while the MICAz energy model offers the least consumption, having the lowest 'idle' and 'receive' mode consumption. The MICAz model also has the lowest total consumed energy compared with the other energy models, suggesting that it is the most suitable energy model to adopt for future smart-farm deployments.
Article
Wireless Sensor Networks (WSNs) are one of the most important technical fields and have gained wide interest from various developers and leading industrial companies for use in numerous applications, especially monitoring of electric power consumption as well as military and medical applications. The quick developments in these networks promise to revolutionize the way people live and thus help to overcome management and control issues as well as reduce service costs. Although one of the essential features of a WSN is its capability to work over a wide area, communication in such an area can be noisy and presents a challenge for the robustness of such systems. In this review paper, the concepts of WSNs are presented and discussed, as well as the most popular data-transmission technologies used in WSNs. The paper focuses on power-monitoring applications in the building environment, summarizes recent studies in this field and the enhancements achieved, and presents the most important challenges faced in such applications. Finally, some conclusions and suggestions are drawn at the end of the paper.
Article
In recent years, we have witnessed the rapid development of LoRa technology, together with extensive studies trying to understand its performance in various application settings. In contrast to measurements performed in large outdoor areas, only a limited number of attempts have been made to understand the characterization and performance of LoRa technology in indoor environments. In this paper, we present a comprehensive study of LoRa technology in multi-floor buildings. Specifically, we investigate the large-scale fading characteristic, temporal fading characteristic, coverage, and energy consumption of LoRa technology in four different types of buildings. Moreover, we find that the energy consumption under different parameter settings can vary by up to 145 times. These results indicate the importance of parameter selection and of enabling the LoRa adaptive data rate feature in energy-limited applications. We hope the results in this paper can help both academia and industry understand the performance of LoRa technology in multi-floor buildings to facilitate the development of practical indoor applications.
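Much of the parameter-dependent energy spread comes from time-on-air, which grows steeply with the spreading factor. A sketch of the Semtech LoRa time-on-air formula (explicit header assumed; the 125 kHz bandwidth and 20-byte payload are example values):

```python
import math

def lora_time_on_air(payload_bytes, sf, bw_hz=125_000, cr=1,
                     preamble_syms=8, crc=True):
    """Semtech LoRa time-on-air in seconds (explicit header, SX127x-style
    formula). cr=1 means coding rate 4/5."""
    t_sym = (2 ** sf) / bw_hz
    de = 1 if t_sym > 0.016 else 0            # low-data-rate optimization
    num = 8 * payload_bytes - 4 * sf + 28 + (16 if crc else 0)
    n_payload = 8 + max(math.ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble_syms + 4.25 + n_payload) * t_sym

for sf in (7, 9, 12):
    print(f"SF{sf}: {lora_time_on_air(20, sf) * 1000:.1f} ms")
```

Since transmit energy scales roughly with time-on-air at a fixed power, moving from SF7 to SF12 alone multiplies energy per packet by more than 20x; combined with transmit-power and duty-cycle settings, spreads of the magnitude reported above become plausible.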
Article
Wireless sensor networks (WSNs) have received significant attention in the last few years in the agriculture field. Among the major challenges for sensor-node deployment in agriculture is path loss in the presence of dense grass or tall trees, which degrades communication link quality due to absorption, scattering, and attenuation through the crop's foliage or trees. In this study, two new path-loss models were formulated based on the MATLAB curve-fitting tool for a ZigBee WSN in a farm field. The path loss between the router node (mounted on a drone) and the coordinator node was modeled and derived from received signal strength indicator (RSSI) measurements combined with the particle swarm optimization (PSO) algorithm. Two path-loss models were formulated based on exponential (EXP) and polynomial (POLY) functions. Both functions were combined with PSO, namely the hybrid EXP-PSO and POLY-PSO algorithms, to find the optimal function coefficients that result in accurate path-loss models. The results show that the hybrid EXP-PSO and POLY-PSO models noticeably improved the coefficient of determination (R²) of the regression line, with mean absolute errors (MAE) of 1.6 and 2.7 dBm for the EXP-PSO and POLY-PSO algorithms, respectively. The R² achieved in this study outperformed previous state-of-the-art models. An accurate path-loss model is essential in smart-agriculture applications to determine the behavior of propagated signals and to deploy WSN nodes in positions that ensure data communication without unnecessary packet loss between nodes.
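A minimal particle swarm optimizer fitting path-loss-model coefficients by minimizing MAE can illustrate the approach; the synthetic RSSI data, the log-distance model form, and all PSO hyperparameters below are assumptions, not the paper's EXP/POLY setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic RSSI-vs-distance data (illustrative, not the paper's farm data)
d = np.linspace(1, 60, 30)
rssi = -45 - 18 * np.log10(d) + rng.normal(0, 0.8, d.size)

def mae(params):
    """Mean absolute error of a two-coefficient log-distance RSSI model."""
    a, b = params
    return np.mean(np.abs(rssi - (a + b * np.log10(d))))

def pso(obj, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best particle swarm optimizer."""
    lo, hi = np.array(bounds, dtype=float).T
    pos = rng.uniform(lo, hi, (n_particles, len(bounds)))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([obj(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([obj(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, obj(gbest)

best, err = pso(mae, bounds=[(-80.0, 0.0), (-40.0, 0.0)])
print("fitted coefficients:", best.round(1), " MAE:", round(err, 2))
```

Swapping `mae` for the residual of an exponential or polynomial model reproduces the EXP-PSO / POLY-PSO structure: PSO only needs an objective to evaluate, not gradients.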
Conference Paper
LoRa has emerged as a prominent technology for the Internet of Things (IoT), with LoRa Wide Area Network (LoRaWAN) emerging as a suitable connection solution for smart things. The choice of the best location for the installation of gateways, as well as a robust network server configuration, are key to the deployment of a LoRaWAN. In this paper, we present an evaluation of Received Signal Strength Indication (RSSI) values collected from the real-life LoRaWAN deployed in Skelleftea, Sweden, when compared with the values calculated by a Radio Frequency (RF) planning tool for the Irregular Terrain Model (ITM), Irregular Terrain with Obstructions Model (ITWOM) and Okumura-Hata propagation models. Five sensors are configured and deployed along a wooden bridge, with different Spreading Factors (SFs), such as SF 7, 10 and 12. Our results show that the RSSI values calculated using the RF planning tool for ITWOM are closest to the values obtained from the real-life LoRaWAN. Moreover, we also show evidence that the choice of a propagation model in an RF planning tool has to be made with care, mainly due to the terrain conditions of the area where the network and the sensors are deployed.
Article
Internet of Things (IoT) monitoring and tracking devices could be used to track and monitor wildlife (for example mountain lion) in the mountainous and rocky environments. Similarly, these devices would provide connectivity through device-to-device and device-to-machine communications during search and rescue (SAR), and military operations in the same environments. They may also monitor the locations, health conditions, general safety of the hikers during hiking recreations. However, accurate and reliable communication models that will ensure efficient and functional deployment of the devices are not available. This study uses actual IoT devices in 900 MHz and 2.4 GHz bands to take measurements in the rocky terrains and propose reliable communication models. The proposed models are compared with theoretical models which deviate by 8 dB to 38 dB. Also, result shows that mountains and rocks cause about 8 dB signal loss on the average. The practical propagation data and models could be used to improve the operation of SAR team, scientific communities and other applications of IoT.
Article
Recently, LoRaWAN has emerged as a promising technology for the Internet of Things (IoT), owing to its ability to support low-power and long-range communications. However, real-world deployment and network optimization require accurate path-loss (PL) modeling, so as to estimate network coverage, performance, and profitability. For that reason, in this work, the LoRaWAN radio channel is investigated in the 868 MHz band. Extensive measurement campaigns were carried out in both indoor and outdoor environments at urban and rural locations in Lebanon (Saint Joseph University of Beirut campus, Beirut city, and Bekaa valley). Based on the empirical results, PL models are developed for LoRaWAN communications and compared with widely used empirical models. Moreover, the performance and coverage of a LoRaWAN deployment are evaluated based on real measurements. The results show that the proposed PL models are accurate and simple to apply in Lebanon and other similar locations. Coverage ranges of up to 8 km and 45 km were obtained in urban and rural areas, respectively. This reveals the reliability of this promising technology for long-range IoT communications.
Article
NarrowBand-IoT has just joined the LPWAN community. Unlike most of its competitors, NB-IoT did not emerge from a blank slate. Indeed, it is closely linked to LTE, from which it inherits many of the features that undoubtedly determine its behavior. In this paper, we empirically explore the boundaries of this technology, analyzing from a user's point of view critical characteristics such as energy consumption, reliability, and delays. The results show that its energy performance is comparable to, and in some cases even outperforms, that of an LPWAN reference technology like LoRa, with the added benefit of guaranteed delivery. However, the high variability observed in both energy expenditure and network delays calls into question its suitability for some applications, especially those subject to service-level agreements.
Conference Paper
Although the Narrowband Internet of Things (NB-IoT) is tightly connected and integrated with cellular networks such as Long Term Evolution (LTE), it remains an independent radio interface technology. Therefore, it is important to understand its specific requirements by providing a systematic mathematical model of its physical (PHY) layer. This article begins by systematically and numerically describing the PHY layer of the NB-IoT technology, its applications, and the context of its deployment. The article then presents a mathematical model of the NB-IoT operation modes, as this is key to modelling its PHY layer. This is followed by a clear and concise mathematical model of the NB-IoT downlink (DL), elucidating its fundamental differences from that of LTE. The modelling study goes further by providing a comprehensive model and analysis of the data rate versus energy efficiency dilemma, an identified problem for the PHY design of NB-IoT. This analysis is performed theoretically and also tested by means of computer simulation. The obtained results demonstrate that the contribution of NB-IoT to LTE energy consumption is less significant in the in-band deployment than in the guard-band and standalone deployments. However, the in-band deployment is expected to reduce the LTE channel capacity more than the other deployment modes. In terms of network scalability, the results further show that the data rate and energy efficiency of the network are adversely affected by an increase in the number of devices within a cell. These modelled network performances of NB-IoT sufficiently demonstrate that, in order to enable a licensed Massive IoT network, there is a need to develop novel PHY layer techniques (modulation, channel coding, etc.) capable of efficiently scaling the NB-IoT network while maintaining an acceptable data rate and low energy consumption.
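The data-rate-versus-energy-efficiency dilemma analyzed in this abstract can be illustrated with a trivial energy-per-bit calculation. The sketch below uses assumed NB-IoT-like numbers (23 dBm ≈ 0.2 W transmit power and example data rates), not figures from the paper.

```python
def energy_per_bit(tx_power_w, data_rate_bps):
    """Delivery energy cost E_b = P_tx / R (joules per bit).

    At a fixed transmit power, halving the data rate doubles the
    energy spent per delivered bit -- the core of the trade-off:
    repetitions that extend coverage lower the effective rate and
    therefore raise the per-bit energy cost.
    """
    return tx_power_w / data_rate_bps

# Assumed illustrative values: 0.2 W transmit power at two rates.
e_fast = energy_per_bit(0.2, 25_000)  # 25 kbps
e_slow = energy_per_bit(0.2, 12_500)  # 12.5 kbps (e.g. more repetitions)
print(e_fast, e_slow)  # the slower link costs twice the energy per bit
```

This is of course only the transmit-side radiated term; a full PHY-layer model of the kind the paper builds would also account for circuit power, repetitions, and protocol overhead.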
Article
This paper aims to determine the distance between a mobile sensor node (i.e., bicycle) and an anchor node (i.e., coach) in outdoor and indoor environments. Two approaches were considered to estimate this distance. The first approach was based on the traditional channel propagation model using the log-normal shadowing model (LNSM), while the second was based on a proposed hybrid particle swarm optimization–artificial neural network (PSO–ANN) algorithm to improve the distance estimation accuracy for the mobile node. The first method estimated the distance from the LNSM and the measured received signal strength indicator (RSSI) of the anchor node, which used the ZigBee wireless protocol. The LNSM parameters were derived from RSSI measurements in both outdoor and indoor environments. A feed-forward neural network and the Levenberg–Marquardt training algorithm were used to estimate the distance between the mobile node and the coach. The hybrid PSO–ANN algorithm achieved significantly better distance estimation accuracy than the traditional LNSM method without requiring additional components, reaching a mean absolute error of 0.022 m and 0.208 m in outdoor and indoor environments, respectively. The effect of anchor node density on localization accuracy was also investigated in the indoor environment.
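The LNSM-based baseline in this abstract amounts to inverting the log-normal shadowing model to map a measured RSSI to a distance. A minimal sketch follows; the reference RSSI `rssi_d0` and exponent `n` are assumed placeholder values (in practice they are fitted from calibration measurements, as the paper describes), and the zero-mean shadowing term is omitted.

```python
def lnsm_distance(rssi, rssi_d0=-45.0, n=2.2, d0=1.0):
    """Invert the mean term of the log-normal shadowing model:

        RSSI(d) = RSSI(d0) - 10 * n * log10(d / d0)
    =>  d = d0 * 10 ** ((RSSI(d0) - rssi) / (10 * n))

    rssi    : measured RSSI in dBm
    rssi_d0 : RSSI at reference distance d0 (assumed calibration value)
    n       : path-loss exponent (assumed; environment-dependent)
    """
    return d0 * 10 ** ((rssi_d0 - rssi) / (10.0 * n))

# A node measured at the reference RSSI sits at the reference distance;
# 22 dB of extra attenuation with n = 2.2 corresponds to one decade of range.
print(lnsm_distance(-45.0))  # 1.0 m
print(lnsm_distance(-67.0))  # ~10.0 m
```

The shadowing term this sketch drops is exactly the error source that motivates the paper's PSO–ANN refinement: random dB-scale fluctuations translate into multiplicative distance errors when the model is inverted.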
NB-IoT Deployment guide to basic feature set requirements
GSMA, "NB-IoT Deployment Guide to Basic Feature Set Requirements," White Paper, Release 3, Jun. 2019.
3GPP TS 23.682 (clause 4.5.4): Architecture enhancements to facilitate communications with packet data networks and applications
3GPP, "Architecture enhancements to facilitate communications with packet data networks and applications," 3GPP TS 23.682, clause 4.5.4, 2016.
Haider A.H. Alobaidy received the M.Sc. degree in Electrical Engineering/ Electronics & Communication from the Faculty of Engineering, Al-Mustansiriyah University, Iraq, in 2016. He is currently a Ph.D. student in the Department of Electrical, Electronic and Systems Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (UKM). His research interest includes wireless sensor networks, wireless communication, IoT, and channel propagation modeling and estimation. He is a Student Member of IEEE.