Source publication
The technological landscape of intelligent transport systems (ITS) has been radically transformed by the emergence of big data streams generated by the Internet of Things (IoT), smart sensors, surveillance feeds, and social media, as well as by growing infrastructure needs. It is timely and pertinent that ITS harness the potential of an artificial int...
Contexts in source publication
Context 1
... the data transformation process, feature vectors that are mostly unstructured, with unlabeled target variables, are passed on to L2. In L2, a novel online incremental machine learning algorithm is proposed for real-time concept drift detection and adaptation (Fig. 2). The underlying base algorithm has been applied to examine context awareness in the motor traffic dataset of the city of Aarhus, Denmark [20] and to study how changes in driver behavior can affect coordination between autonomous and human-driven vehicles [21]; the algorithm has been extended in this work for recurrent and non-recurrent ...
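To make the idea of online drift detection and adaptation more concrete, the following is a minimal, self-contained Python sketch of a sliding-window drift detector. It is only an illustration of the general concept: the window size, threshold, and simulated vehicle-count stream are assumptions, and this is not the GSOM-based incremental algorithm the paper actually proposes.

```python
# Toy sliding-window drift detector (illustration only, not the paper's algorithm).
# A drift is flagged when the mean of the most recent window of a traffic feature
# deviates from a reference window by more than `threshold` standard deviations.
from collections import deque
import statistics

class WindowedDriftDetector:
    def __init__(self, window_size=50, threshold=3.0):
        self.reference = deque(maxlen=window_size)   # stable "concept" window
        self.current = deque(maxlen=window_size)     # most recent observations
        self.threshold = threshold

    def update(self, value):
        """Feed one streaming observation; return True if drift is detected."""
        if len(self.reference) < self.reference.maxlen:
            self.reference.append(value)
            return False
        self.current.append(value)
        if len(self.current) < self.current.maxlen:
            return False
        ref_mean = statistics.mean(self.reference)
        ref_std = statistics.pstdev(self.reference) or 1e-9
        cur_mean = statistics.mean(self.current)
        if abs(cur_mean - ref_mean) > self.threshold * ref_std:
            # Adapt: the current window becomes the new reference concept.
            self.reference = deque(self.current, maxlen=self.reference.maxlen)
            self.current.clear()
            return True
        return False

# Usage: stream simulated vehicle counts with an abrupt (non-recurrent) shift at t = 200.
detector = WindowedDriftDetector()
for t in range(400):
    reading = 20 if t < 200 else 45
    if detector.update(reading):
        print(f"Drift detected at t={t}")
```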
Citations
... RKD might be better suited for tinyML due to its efficiency in data transfer. The choice depends on the tinyML application's requirements and constraints [16]. ...
Edge-based video surveillance systems encounter significant obstacles in object detection due to the limited computational power and energy efficiency of edge devices, which are required to deliver real-time processing capabilities. Traditional object detection models are too resource-hungry for these environments, making optimization techniques essential. This study explores the implementation of quantized transfer learning using SSD MobileNet V2 with 8-bit quantization to improve object detection performance on resource-constrained edge devices. Experimental results indicate that the Raspberry Pi 5 and Nvidia Jetson Orin Nano outperform other devices, achieving total latencies of 5 ms and 85 ms, respectively, confirming their suitability for real-time applications. The quantized int8 model achieves an accuracy of 80.65% while substantially lowering memory consumption and latency compared to the unoptimized int32 model. Furthermore, the model performs well on a masked-unmasked dataset, with an F1 score of 0.92 for masked detection. These findings underscore the potential of quantization to enhance both inference speed and resource efficiency in edge-based surveillance systems. Future research will investigate advanced hybrid quantization strategies and architectural enhancements to balance efficiency and accuracy, alongside scalability across a broader range of edge devices and datasets.
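As an illustration of the general technique the study applies, the sketch below shows full-integer (int8) post-training quantization with TensorFlow Lite. The saved-model path and the random representative dataset are placeholders, not artifacts of the cited work; a real calibration set would use sample surveillance frames, and a SavedModel of the detector is assumed to exist at the given path.

```python
# Sketch of full-integer (int8) post-training quantization with TensorFlow Lite.
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Yield calibration batches shaped like the detector's input (here 300x300 RGB).
    # Real calibration would use sample frames from the surveillance feed.
    for _ in range(100):
        yield [np.random.rand(1, 300, 300, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("ssd_mobilenet_v2_saved_model")  # placeholder path
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full int8 quantization of weights and activations.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8

tflite_int8_model = converter.convert()
with open("ssd_mobilenet_v2_int8.tflite", "wb") as f:
    f.write(tflite_int8_model)
```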
... On the other hand, in [7], we did not take air absorption losses or diffraction into account when assessing the impact of a dust storm. Many articles have tried to address this for 5G [16][17][18][19][20][21][22][23][24][25]. ...
... A version of the Mie model for propagating millimeter waves was presented in [20] to simulate the impact of a particle dust storm and examine its attenuation. Over the same period, the effects of precipitation and diffraction on wireless PPT communication systems were studied. ...
... Metrics such as root mean square error (RMSE), mean absolute error (MAE), and R-squared (R²) are used to evaluate the performance of the deep learning model used [18][19][20][21][22][23]. ...
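For reference, a minimal sketch of how these three regression metrics are typically computed is shown below; the arrays are placeholder values, not data from the cited studies.

```python
# Computing RMSE, MAE, and R-squared with scikit-learn on placeholder arrays.
import numpy as np
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

y_true = np.array([12.0, 15.0, 9.0, 20.0])   # observed traffic values (example)
y_pred = np.array([11.5, 16.0, 8.0, 19.0])   # model predictions (example)

rmse = np.sqrt(mean_squared_error(y_true, y_pred))   # penalizes large errors
mae = mean_absolute_error(y_true, y_pred)            # average absolute error
r2 = r2_score(y_true, y_pred)                        # proportion of variance explained

print(f"RMSE={rmse:.3f}, MAE={mae:.3f}, R^2={r2:.3f}")
```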
... These DL-based solutions are largely hampered by scale-sensitivity difficulties. Nallaperuma et al. [49] captured concept drifts and classified them as recurrent or non-recurrent traffic occurrences using incremental and deep reinforcement learning approaches. Reza et al. [50] reviewed various DL models for urban traffic management and discussed the effects, shortcomings, and limitations of the various datasets available in image, video, and text formats. ...
Rapid population growth, insecure lifestyles, wastage of natural resources, undisciplined human behavior, urgency in the medical field, the security of patient information, agriculture-related problems, and automation requirements in industries are among the reasons for the invention of new technologies. Smart cities aim to address these challenges through the integration of technology, data, and innovative practices. Building a smart city involves integrating advanced technologies and data-driven solutions to enhance urban living, improve resource efficiency, and create sustainable environments. This review presents five of the most critical technologies for smart and/or safe cities, addressing pertinent topics such as intelligent traffic management systems, information and communications technology, blockchain technology, re-identification, and the Internet of Things. The challenges, observations, and remarks for each technology in solving problems are discussed, and the dependency effects on the technologies' performance are also explored. In particular, deep learning models for various applications are analyzed. The performance of different models, their dependence on dataset size, type, and hyper-parameters, and the non-availability of labels or ground truth are discussed.
... Unsupervised learning has been applied in ITS, as shown in the studies below. In Ref. [86], data, models, and algorithms for smart transport planning are presented. The authors examined how clustering analysis can be applied to trip distribution, trip generation, and traffic zone division. ...
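As a rough illustration of that kind of clustering analysis, the sketch below groups synthetic trip-origin coordinates into traffic zones with k-means; the coordinates and the number of zones are assumptions, not details from Ref. [86].

```python
# Grouping synthetic trip origins into traffic zones with k-means (illustration only).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic trip origins (normalized coordinates) scattered around three hotspots.
origins = np.vstack([
    rng.normal(loc=(0.0, 0.0), scale=0.05, size=(200, 2)),
    rng.normal(loc=(0.3, 0.1), scale=0.05, size=(200, 2)),
    rng.normal(loc=(0.1, 0.4), scale=0.05, size=(200, 2)),
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(origins)
zone_labels = kmeans.labels_            # traffic zone assigned to each trip origin
zone_centres = kmeans.cluster_centers_  # representative centre of each zone
print(zone_centres)
```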
Intelligent Transport Systems (ITS) are crucial for safety, efficiency, and reduced congestion in transportation. They require efficient, secure, high-speed communication. Radio Frequency (RF) technologies like Fifth Generation (5G), Beyond 5G (B5G), and Sixth Generation (6G) are promising, but spectrum scarcity mandates coexistence with Optical Wireless Communication (OWC) networks, which offer high data rates and security, forming a strong foundation for hybrid RF/OWC applications in ITS. In this paper, we examine the application of Machine Learning (ML) to enhance data communications within OWC systems in ITS. We begin with an in-depth examination of the data communication requirements and the associated challenges within the ITS domain. We then explain the rationale for converging heterogeneous RF technologies with OWC for data communications in ITS scenarios. Our investigation subsequently turns to the role played by ML in optimizing data communications via OWC within ITS. To provide a comprehensive perspective, we systematically evaluate and compare a spectrum of ML methodologies employed in OWC ITS data communications. Finally, we offer a set of recommendations and highlight promising avenues for future research at this critical intersection of ML, OWC, and ITS data communications.
... behavior. Nallaperuma et al. [36] introduced a smart traffic management platform (STMP) to enhance traffic control decisions. The STMP used unsupervised online incremental ML to detect drifts in big data streams. ...
With the rapid increase in the number of vehicles on the road, the necessity for traffic management systems has become apparent. To control traffic effectively, a number of technical solutions have been proposed to address traffic issues, such as eliminating traffic congestion and identifying the shortest routes. Researchers are motivated to use various data-driven solutions that assist decision-makers in making timely decisions due to the enormous amount of traffic data gathered by sensors, traffic signals, and cameras. The primary objective of this research is to analyze current traffic management systems from several technical perspectives. According to the technology employed, this study reviews recent traffic management system methodologies and classifies them into five main groups: machine learning-based, fuzzy logic-based, statistically-based, graph-based, and hybrid approaches. Each group is presented together with a thorough overview of its scope, main challenges, analysis type, and dataset. Researchers and practitioners are expected to use this study as a guide to develop new technology-based traffic management systems, as well as to propose new contributions or enhance current ones.
... In practical traffic engineering, STTD is one of the main ingredients required to provide vital information for the timely and effective deployment of large-scale traffic control strategies (Tsitsokas et al., 2023; Hu and Ma, 2024), data-driven traffic demand management (Nallaperuma et al., 2019), and elaborate traffic optimization routines in modern intelligent transportation systems (Zhang et al., 2011). Meanwhile, the ever-increasing amount of STTD in transportation systems has left traffic agencies in need of a generalized method to analyze the continuously collected data of various types. ...
Spatiotemporal Traffic Data (STTD) measures the complex dynamical behaviors of the multiscale transportation system. Existing methods aim to reconstruct STTD using low-dimensional models. However, they are limited to data-specific dimensions or source-dependent patterns, restricting them from unifying representations. Here, we present a novel paradigm to address the STTD learning problem by parameterizing STTD as an implicit neural representation. To discern the underlying dynamics in low-dimensional regimes, coordinate-based neural networks that can encode high-frequency structures are employed to directly map coordinates to traffic variables. To unravel the entangled spatial-temporal interactions, the variability is decomposed into separate processes. We further enable modeling in irregular spaces such as sensor graphs using spectral embedding. Through continuous representations, our approach enables the modeling of a variety of STTD with a unified input, thereby serving as a generalized learner of the underlying traffic dynamics. It is also shown that it can learn implicit low-rank priors and smoothness regularization from the data, making it versatile for learning different dominating data patterns. We validate its effectiveness through extensive experiments in real-world scenarios, showcasing applications from corridor to network scales. Empirical results not only indicate that our model has significant superiority over conventional low-rank models, but also highlight that the versatility of the approach extends to different data domains, output resolutions, and network topologies. Comprehensive model analyses provide further insight into the inductive bias of STTD. We anticipate that this pioneering modeling perspective could lay the foundation for universal representation of STTD in various real-world tasks.
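A schematic sketch of the core idea, a coordinate-based network mapping a normalized (x, y, t) coordinate directly to a traffic variable, is given below in PyTorch. The sine activations, layer sizes, and placeholder data are common choices for implicit neural representations and are assumptions here, not the paper's exact architecture.

```python
# Coordinate-based network: maps normalized (x, y, t) to a traffic variable (e.g. speed).
import torch
import torch.nn as nn

class CoordinateTrafficNet(nn.Module):
    def __init__(self, in_dim=3, hidden=128, depth=4):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(depth):
            layers.append(nn.Linear(d, hidden))
            d = hidden
        self.hidden_layers = nn.ModuleList(layers)
        self.out = nn.Linear(hidden, 1)   # predicted traffic variable

    def forward(self, coords):
        # coords: (N, 3) tensor of normalized (x, y, t) coordinates.
        h = coords
        for layer in self.hidden_layers:
            h = torch.sin(layer(h))       # periodic activation to capture high frequencies
        return self.out(h)

# Fit to observed sensor readings at known coordinates (placeholder data).
coords = torch.rand(1024, 3)              # normalized (x, y, t)
speeds = torch.rand(1024, 1) * 100.0      # observed speeds in km/h
model = CoordinateTrafficNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
for step in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(coords), speeds)
    loss.backward()
    opt.step()
# The trained network can be queried at arbitrary (x, y, t), giving a continuous
# representation of the traffic field.
```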
... GSOM demonstrates robustness in dynamic environments, and future work will focus on confirming this behavior for the proposed framework. Future studies will also focus on other related network traffic applications and on the development of online incremental machine learning algorithms to cope with the non-stationary distributions of the network data; for example, we can apply an online version of GSOM [61,62] to implement an online detection method with change detection ability. ...
Machine learning is regarded as an effective approach to network intrusion detection and has gained significant attention in recent studies. However, few intrusion detection methods have been successfully applied to detect anomalies in large-scale network traffic data, and the low explainability of complex algorithms has raised concerns about fairness and accountability. A further problem is that many intrusion detection systems need to work with distributed data sources in the cloud. In this paper, we propose an intrusion detection method based on distributed computing to learn latent representations from large-scale network data with lower computation time while improving intrusion detection accuracy. Our proposed classifier is based on a novel hierarchical algorithm that combines the adaptability and visualization ability of a self-structured unsupervised learning algorithm with the explainability of self-explainable supervised algorithms, enhancing the understanding of both the model and the data. The experimental results show that our proposed method is effective, efficient, and scalable in capturing network traffic patterns and detecting detailed network intrusion information, such as the type of attack, with high detection performance, and is well suited to cloud-computing environments.
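A heavily simplified sketch of this two-stage idea is shown below: an unsupervised stage groups flow records into latent clusters, and a shallow, self-explainable decision tree is fitted on top so the learned structure can be inspected. KMeans stands in for the paper's self-structuring unsupervised algorithm, and the flow features are synthetic placeholders.

```python
# Simplified two-stage pipeline: unsupervised clustering + explainable decision tree.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)
# Synthetic "flow features": duration, bytes out, bytes in, packet rate.
normal = rng.normal([10, 500, 800, 20], [3, 100, 150, 5], size=(500, 4))
attack = rng.normal([1, 5000, 50, 300], [0.5, 800, 20, 60], size=(100, 4))
X = np.vstack([normal, attack])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Explainable layer: a shallow tree reproduces the cluster assignments in terms of
# the original features, so analysts can read the resulting decision rules.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, clusters)
print(export_text(tree, feature_names=["duration", "bytes_out", "bytes_in", "pkt_rate"]))
```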
... Instead of focusing only on one road, researchers must take into account a variety of spatio-temporal characteristics of urban road networks. However, as the road network grows, ITS must manage ever larger volumes of data [9]. Compressing the gathered traffic flow data has therefore become an important and practical way to address the issues caused by enormous data volumes and constrained processing capacity [10,11]. ...
For smart cities, predicting traffic flow is crucial to reducing traffic jams and enhancing transportation efficiency. The smart city needs effective models, highly dependable networks, and data privacy for traffic flow prediction (Traff-FP). The majority of current research uses a central training mode and ignores the privacy issues raised by distributed traffic data. In this paper, an effective traffic flow prediction (ETraff-FP) framework is proposed to forecast traffic flow using actual historical traffic data. Initially, pre-processing is carried out using data normalization and missing-value handling. The three major components of the Traff-FP framework for each local Traff-FP model are the recurrent long short-term capture network (RLSCN), the federated gated graph attentive network (FGAN), and the semantic connection relationship capture network (SCRCN). The long-term spatial and temporal information in each location is captured by the RLSCN, which encompasses constituents such as fully connected (FC) layers, convolution, and bidirectional long short-term memory (BiLSTM) to collect short-term information. The FGAN, which incorporates a bi-directional gated recurrent unit (Bi-GRU), exchanges short-term spatio-temporal hidden information while training the local Traff-FP model using the elliptic curve Diffie-Hellman (ECDiff-H) algorithm. The hyperparameters of ETraff-FP are tuned using the extended remora optimization algorithm (EReOA). The ETraff-FP framework is trained and tested with the TaxiNYC and TaxiBJ datasets. For simulation, the Python platform is utilized and various evaluation metrics are analysed. The ETraff-FP framework achieved improvements with MSE of 8.98% and 10.57%, RMSE of 8.62% and 18.65%, MAE of 2.11% and 10.57%, R2-score of 0.959% and 0.913%, and MAPE of 21.12% and 24.89% against the existing methods using the TaxiNYC and TaxiBJ datasets. Overall, the proposed work not only advances the state of the art in traffic flow prediction but also demonstrates the value of enabling effective and efficient traffic management systems in urban and smart city environments.
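As a minimal illustration of the BiLSTM building block used in the RLSCN component, the Keras sketch below fits a bidirectional LSTM regressor to a synthetic traffic series; the window length, layer sizes, and data are assumptions, not the ETraff-FP implementation.

```python
# Minimal BiLSTM regressor for next-step traffic flow on a synthetic series.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def make_windows(series, window=12):
    # Turn a 1-D traffic series into (past window -> next value) samples.
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., None], y

t = np.arange(2000)
series = 50 + 20 * np.sin(2 * np.pi * t / 288) + np.random.default_rng(1).normal(0, 2, t.size)
X, y = make_windows(series)

model = models.Sequential([
    layers.Input(shape=(12, 1)),
    layers.Bidirectional(layers.LSTM(32)),   # captures short-term temporal patterns
    layers.Dense(16, activation="relu"),
    layers.Dense(1),                          # predicted next-step traffic flow
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=64, verbose=0)
print(model.predict(X[:1], verbose=0))
```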
... Predictive modeling and data analytics have the potential to offer numerous benefits in urban planning decisions. For instance, in traffic management, predictive modeling can aid in forecasting traffic patterns and pinpointing congested areas, which informs decisions on road construction, signal timing, and interventions to enhance traffic flow [1][2][3][4][5]. In crime prevention, data analytics can aid police departments in identifying high-risk areas for crime and directing resources accordingly, while predictive modeling can help forecast the probability of crimes occurring in specific areas, facilitating targeted prevention efforts [6][7][8][9]. ...
In the realm of urban planning and infrastructure development, the fusion of expansive data and cutting-edge analytics has ushered in an era where cities can craft their trajectories based on evidence and foresight. This data-driven approach empowers municipalities to optimize resource allocation, enhance operational efficiency, and ultimately elevate the quality of services provided to residents. However, the pursuit of data accuracy, a linchpin in this paradigm, is not without its hurdles, including the challenges of cost, time constraints, and varying data availability. Despite these obstacles, the significance of accurate data cannot be overstated. It serves as the cornerstone for informed decision-making, enabling city governments to pinpoint and rectify disparities, channel resources with precision, and curtail inefficiencies. This paper undertakes a comprehensive examination of the cost-effectiveness of data accuracy within the framework of city development strategy, employing a meticulous cost-benefit analysis. By delineating the nuanced relationship between accuracy and cost, the research aims to identify the optimal threshold for data accuracy. Armed with this knowledge, city governments can make judicious choices, harnessing the transformative potential of accurate data to sculpt more equitable, sustainable, and livable urban landscapes. The findings of this study promise to guide cities toward a future where precision in data informs policy, fostering cities that are not only resilient but responsive to the diverse needs of their inhabitants.
... Various studies have begun investigating how data can still be compromised even with many cyber-security measures in place [8]. One such concern is how the Internet is gradually replacing normal social activities, as users now engage with web content as a tool to compensate for their loneliness and social seclusion [9], [10]. ...
The frontiers of the digital revolution have rippled across society today, with various web content shared online for users as they seek to promote monetization and asset exchange, and with clients constantly seeking improved alternatives at lower costs to meet their value demands. From item upgrades to their replacement, businesses are poised with retention strategies to help curb the challenge of customer attrition. The advent of smartphones has brought mobility, ease of accessibility, and portability, which in turn have continued to ease their rise in adoption while exposing user devices to vulnerability, as they are quite susceptible to phishing. With some users classified as more susceptible than others due to online presence and personality traits, studies have sought to reveal the lures/cues exploited by adversaries to enhance phishing success and to classify web content as genuine or malicious. Our study explores the tree-based Random Forest to identify phishing cues via sentiment analysis on phishing website datasets scraped from user accounts on social network sites. The dataset is scraped via Python Google Scrapper and divided into train/test subsets to classify content as genuine or malicious, with data balancing and feature selection techniques. With Random Forest as the machine learning method of choice, the results show that the ensemble yields a prediction accuracy of 97 percent with an F1-score of 98.19%, correctly classifying 2089 instances and incorrectly classifying 85 instances in the test dataset.
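A compact sketch of the general pipeline described here, vectorizing text, splitting into train/test subsets, fitting a Random Forest, and reporting accuracy and F1, is given below; the tiny toy corpus and labels are placeholders, not the scraped dataset.

```python
# Text classification pipeline: TF-IDF features -> train/test split -> Random Forest.
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

texts = [
    "verify your account now to avoid suspension",
    "limited offer click this link to claim your prize",
    "your invoice for last month is attached",
    "team meeting moved to 3pm tomorrow",
] * 50                                   # repeated so the split has enough samples
labels = [1, 1, 0, 0] * 50               # 1 = malicious/phishing, 0 = genuine

X = TfidfVectorizer().fit_transform(texts)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.3, random_state=0, stratify=labels)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred), "F1:", f1_score(y_test, pred))
```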