Article

6G Wireless Networks: Vision, Requirements, Architecture, and Key Technologies

Abstract

A key enabler for the intelligent information society of 2030, 6G networks are expected to provide performance superior to 5G and satisfy emerging services and applications. In this article, we present our vision of what 6G will be and describe usage scenarios and requirements for multi-terabit-per-second (Tb/s) and intelligent 6G networks. We present a large-dimensional and autonomous network architecture that integrates space, air, ground, and underwater networks to provide ubiquitous and unlimited wireless connectivity. We also discuss artificial intelligence (AI) and machine learning [1], [2] for autonomous networks and innovative air-interface design. Finally, we identify several promising technologies for the 6G ecosystem, including terahertz (THz) communications, very-large-scale antenna arrays [i.e., supermassive (SM) multiple-input, multiple-output (MIMO)], large intelligent surfaces (LISs) and holographic beamforming (HBF), orbital angular momentum (OAM) multiplexing, laser and visible-light communications (VLC), blockchain-based spectrum sharing, quantum communications and computing, molecular communications, and the Internet of Nano-Things.

... Future wireless communication systems must ensure seamless green connectivity among numerous low-power devices. For this, it is essential to mitigate electronic waste resulting from battery replacements and reduce disruptions caused by battery depletion [1]- [3]. Energy harvesting (EH) technologies are fundamental enablers for this by providing wireless charging capability and promoting sustainable Internet of Things [4], [5]. ...
... In contrast, this paper investigates BD-RIS-assisted WPT under non-linear EH models, a novel direction not yet explored. Our main contributions are: ...
Preprint
Full-text available
Radio frequency (RF) wireless power transfer (WPT) is a promising technology to seamlessly charge low-power devices, but its low end-to-end power transfer efficiency remains a critical challenge. To address the latter, low-cost transmit/radiating architectures, e.g., based on reconfigurable intelligent surfaces (RISs), have shown great potential. Beyond diagonal (BD) RIS is a novel branch of RIS offering enhanced performance over traditional diagonal RIS (D-RIS) in wireless communications, but its potential gains in RF-WPT remain unexplored. Motivated by this, we analyze a BD-RIS-assisted single-antenna RF-WPT system to charge a single rectifier, and formulate a joint beamforming and multi-carrier waveform optimization problem aiming to maximize the harvested power. We propose two solutions relying on semi-definite programming for fully connected BD-RIS and an efficient low-complexity iterative method relying on successive convex approximation. Numerical results show that the proposed algorithms converge to a local optimum and that adding transmit sub-carriers or RIS elements improves the harvesting performance. We show that the transmit power budget impacts the relative power allocation among different sub-carriers depending on the rectifier's operating regime, while BD-RIS shapes the cascade channel differently for frequency-selective and flat scenarios. Finally, we verify by simulation that BD-RIS and D-RIS achieve the same performance under pure far-field line-of-sight conditions (in the absence of mutual coupling). Meanwhile, BD-RIS outperforms D-RIS as the non-line-of-sight components of the channel become dominant.
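Since the study above hinges on a non-linear energy-harvesting model rather than the usual linear one, a brief illustration may help: a widely used choice is the logistic (sigmoidal) EH model, sketched below in Python. The circuit parameters (saturation power m_sat, a, b) are illustrative assumptions and are not taken from the cited preprint.

import numpy as np

def harvested_power_nonlinear(p_rf_w, m_sat=0.024, a=150.0, b=0.014):
    """Logistic non-linear EH model: output saturates at m_sat watts for large RF input power."""
    omega = 1.0 / (1.0 + np.exp(a * b))                # constant ensuring zero output at zero input
    psi = m_sat / (1.0 + np.exp(-a * (p_rf_w - b)))    # sigmoidal transfer function of the rectifier
    return (psi - m_sat * omega) / (1.0 - omega)

for p_in_mw in (1.0, 5.0, 10.0, 20.0, 50.0):           # illustrative RF input powers
    p_out = harvested_power_nonlinear(p_in_mw * 1e-3)
    print(f"RF input {p_in_mw:5.1f} mW -> harvested ~ {p_out * 1e3:6.2f} mW")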
... These early developments may pave the way for widespread adoption. Therefore, compared to 5G, 6G will deliver higher data rates (up to 1 Tbps), ultra-low latency (less than 1 ms), and greater connection density [9,10]. A unique aspect of 6G is the potential integration of mobile base stations, such as satellites, which presents new challenges in routing and priority management [11,12]. ...
... A unique aspect of 6G is the potential integration of mobile base stations, such as satellites, which presents new challenges in routing and priority management [11,12]. While this study focuses on current SDN environments, the proposed methods provide valuable insights for future architectures like 6G [9,13]. Specifically, SDN and OpenFlow are anticipated to play a crucial role in network slicing and dynamic resource allocation [14,15]. ...
Article
Full-text available
This paper tackles key challenges in Software-Defined Networking (SDN) by proposing a novel approach for optimizing resource allocation and dynamic priority assignment using OpenFlow's priority field. The proposed Lagrangian relaxation (LR)-based algorithms significantly reduce network delay, achieving performance management with dynamic priority levels while demonstrating adaptability and efficiency in a sliced network. The algorithms' effectiveness was validated through computational experiments, highlighting the strong potential for QoS management across diverse industries. Compared to the Same Priority baseline, the proposed methods (RPA, AP–1, and AP–2) exhibited notable performance improvements, particularly under strict delay constraints. For future applications, the study recommends expanding the algorithm to handle larger networks and integrating it with artificial intelligence technologies for proactive resource optimization. Additionally, the proposed methods lay a solid foundation for addressing the unique demands of 6G networks, particularly in areas such as base station mobility (Low-Earth Orbit, LEO), ultra-low latency, and multi-path transmission strategies.
... Next-generation wireless communication systems are expected to connect not only mobile devices but also various Internet of Things (IoT) devices to meet emerging applications such as Online Gaming, Extended Reality (XR), Metaverses, Healthcare or Telemetry [1]. Beyond the demands of high throughput and improved spectral efficiency in response to these applications, the sixth generation (6G) also requires wireless infrastructure to have advanced ultra-reliability and low-latency communication (URLLC) features [2], where the communication reliability must reach 99.99% and the transmission latency must be kept within one millisecond. Besides, the openness and broadcast nature of the wireless transmission medium, particularly the interactions of diverse IoT domains that are contingent on short-packet delivery, also introduce more and more security flaws that potential attackers can exploit to compromise wireless information transmission [3]. ...
... In response to the stringent requirements of URLLC, a promising technique for achieving physical transmission with reduced latency is Short-Packet Communication (SPC) [2,19]. The popularity of SPC is built on the foundation of a new theory, namely the channel coding rate in the finite blocklength regime, developed by Polyanskiy in 2010 [20]. ...
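For readers unfamiliar with Polyanskiy's finite-blocklength result cited above, the normal approximation for the AWGN channel is R(n, eps) ≈ C - sqrt(V/n) Q^{-1}(eps) + log2(n)/(2n), with capacity C and channel dispersion V. The short Python sketch below evaluates it; the SNR, blocklengths, and error probability are illustrative values only.

import numpy as np
from scipy.stats import norm

def awgn_normal_approx_rate(snr_linear, n, eps):
    """Normal approximation of the maximal coding rate (bits/channel use)
    for the complex AWGN channel at blocklength n and block error probability eps."""
    C = np.log2(1.0 + snr_linear)                                        # Shannon capacity
    V = (snr_linear * (snr_linear + 2.0)
         / (2.0 * (snr_linear + 1.0) ** 2)) * np.log2(np.e) ** 2         # channel dispersion
    return C - np.sqrt(V / n) * norm.isf(eps) + np.log2(n) / (2.0 * n)   # norm.isf = Q^{-1}

snr_db = 10.0
for n in (128, 256, 1024, 10**5):
    r = awgn_normal_approx_rate(10 ** (snr_db / 10.0), n, eps=1e-5)
    print(f"n={n:>6}: rate ~ {r:.3f} bit/use (capacity {np.log2(1 + 10 ** (snr_db / 10)):.3f})")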
Article
Full-text available
Future wireless communications are expected to serve a wide range of emerging applications, such as Online Gaming, Extended Reality (XR), Metaverses, Healthcare or Telemetry, where communication from diverse connected Internet of Things (IoT) devices requires not only stringent conditions such as ultra-reliability and low-latency communication (URLLC) together with high bandwidth but also raises concerns about content security as well as copyright protection. To deal with URLLC demands, Short-Packet Communication (SPC) has recently been considered a vital solution. Meanwhile, to meet high spectrum utilization, Non-Orthogonal Multiple Access (NOMA) has emerged as a potential technology in the last decade, for its ability to serve multi-user communication simultaneously by exploiting the power domain rather than the time or frequency domains. Especially, incorporating Reconfigurable Intelligent Surfaces (RIS) with NOMA/SPC-based systems can further boost the system's spectral efficiency as well as enhance communication coverage. However, NOMA-based systems hugely demand a reliable user-pairing process, which imposes challenges in ensuring secure short-packet delivery for emerging IoT applications. Hence, this paper studies downlink RIS-assisted short-packet NOMA systems with a focus on improving the secure performance of the pairing process with untrusted users. Our study contributes a new strategy for arbitrarily pairing users by designing a joint power allocation policy and RIS phase shifter, where untrustworthy users are allocated higher power levels while trustworthy users are configured with a sub-optimal phase-shift criterion at the RIS to maximize its cascaded channel gain. Besides, we also derive closed-form expressions for the average block-error rate (BLER) to analyze the performance of trustworthy users as well as the average secure BLER to quantify the secure performance when untrustworthy users wiretap trustworthy users' information using successive interference cancellation mechanisms. Moreover, we further develop asymptotic expressions for both cases to measure the diversity gain and identify key parameters. Subsequently, Monte Carlo simulations are provided as a benchmark to corroborate the theoretical findings. This work can be used as a copyright protection technique for digital content such as games.
... In the 6G era, network users will benefit from higher speeds, comprehensive coverage, extremely high network reliability, ultra-low latency, and high energy efficiency. 6G is envisioned as an autonomous ecosystem with advanced intelligence and perception that will incorporate many emerging technologies to support new applications such as holographic communications and tactile feedback [2], [4]. In addition, compared with previous generations of wireless communications technologies, 6G is more human-centric and will have the characteristics of high security, privacy, affordability, and full customization, further improving the user experience [5]. ...
... In the 6G era, traditional mobile communications will remain mainstream [5], with 6G application scenarios enhancing these communications through improved security, privacy protection, and expanded, cost-effective network coverage. Addressing these issues is crucial for successfully implementing 6G applications [4]. Similarly, new applications and services such as the tactile Internet [42], real-time remote surgery [43], the brain-computer interface [44], ultra-precise industrial control [45], and fully automatic intelligent networks are expected to be realized with the support of 6G. ...
Article
Full-text available
The unprecedented advancement of Artificial Intelligence (AI) has positioned Explainable AI (XAI) as a critical enabler in addressing the complexities of next-generation wireless communications. With the evolution of 6G networks, characterized by ultra-low latency, massive data rates, and intricate network structures, the need for transparency, interpretability, and fairness in AI-driven decision-making has become more urgent than ever. This survey provides a comprehensive review of the current state and future potential of XAI in communications, with a focus on network slicing, a fundamental technology for resource management in 6G. By systematically categorizing XAI methodologies, ranging from model-agnostic to model-specific approaches and from pre-model to post-model strategies, this paper identifies their unique advantages, limitations, and applications in wireless communications. Moreover, the survey emphasizes the role of XAI in network slicing for vehicular networks, highlighting its ability to enhance transparency and reliability in scenarios requiring real-time decision-making and high-stakes operational environments. Real-world use cases are examined to illustrate how XAI-driven systems can improve resource allocation, facilitate fault diagnosis, and meet regulatory requirements for ethical AI deployment. By addressing the inherent challenges of applying XAI in complex, dynamic networks, this survey offers critical insights into the convergence of XAI and 6G technologies. Future research directions, including scalability, real-time applicability, and interdisciplinary integration, are discussed, establishing a foundation for advancing transparent and trustworthy AI in 6G communications systems.
... The upcoming sixth-generation (6G) communication network is anticipated to deliver ultra-fast data transmission and seamless connectivity [1]. To fulfill the stringent quality of service (QoS) requirements of 6G, various cutting-edge technologies, such as intelligent reflecting surfaces (IRSs) [2], [3], fluid antenna systems [4], and movable antennas [5], have demonstrated their effectiveness in enhancing system capacity by proactively modifying channel environments. ...
Preprint
This paper investigates the resource allocation design for a pinching antenna (PA)-assisted multiuser multiple-input single-output (MISO) non-orthogonal multiple access (NOMA) system featuring multiple dielectric waveguides. To enhance model accuracy, we propose a novel frequency-dependent power attenuation model for dielectric waveguides in PA-assisted systems. By jointly optimizing the precoder vector and the PA placement, we aim to maximize the system's sum-rate while accounting for the power attenuation across dielectric waveguides. The design is formulated as a non-convex optimization problem. To effectively address the problem at hand, we introduce an alternating optimization-based algorithm to obtain a suboptimal solution in polynomial time. Our results demonstrate that the proposed PA-assisted system not only significantly outperforms the conventional system but also surpasses a naive PA-assisted system that disregards power attenuation. The performance gain compared to the naive PA-assisted system becomes more pronounced at high carrier frequencies, emphasizing the importance of considering power attenuation in system design.
... A diverse array of emerging wireless services for sixth-generation (6G) wireless systems, such as unmanned vehicles, V2X, and the Industrial Internet of Things (IIoT), require wireless networks to provide both target sensing and communication promptly [1]. Simultaneously, communication and sensing systems are increasingly converging in their frequency bands in response to the rising demand for fast communication and high-resolution sensing [2], which has led to competition for spectrum. ...
Preprint
The recently proposed multi-chirp waveform, affine frequency division multiplexing (AFDM), is regarded as a prospective candidate for integrated sensing and communication (ISAC) due to its robust performance in high-mobility scenarios and full diversity achievement in doubly dispersive channels. However, the insufficient Doppler resolution caused by limited transmission duration can reduce the accuracy of parameter estimation. In this paper, we propose a new off-grid target parameter estimation scheme to jointly estimate the range and velocity of the targets for AFDM-ISAC system, where the off-grid Doppler components are incorporated to enhance estimation accuracy. Specifically, we form the sensing model as an off-grid sparse signal recovery problem relying on the virtual delay and Doppler grids defined in the discrete affine Fourier (DAF) domain, where the off-grid components are regarded as hyper-parameters for estimation. We also employ the expectation-maximization (EM) technique via a sparse Bayesian learning (SBL) framework to update hyper-parameters iteratively. Simulation results indicate that our proposed off-grid algorithm outperforms existing algorithms in sensing performance and is highly robust to the AFDM-ISAC high-mobility scenario.
... In recent years, massive research efforts have been directed towards reconfigurable intelligent surface (RIS) technology in both academia and industry. These efforts were motivated by the fact that the RIS is considered a revolutionary technology for beyond fifth-generation (B5G) and sixth generation (6G) wireless communications networks due to its ability to build controllable radio environments and enhance the quality of communication in a cost-effective way [1]- [3]. Metamaterials are commonly used to build the RISs [4] as a reflective array that is made up of a large number of passive elements. ...
Preprint
Full-text available
Integrating BD-RIS into wireless communications systems has attracted significant interest due to its transformative potential in enhancing system performance. This survey provides a comprehensive analysis of BD-RIS technology, examining its modeling, structural characteristics, and network integration while highlighting its advantages over traditional diagonal RIS. Specifically, we review various BD-RIS modeling approaches, including multiport network theory, graph theory, and matrix theory, and emphasize their application in diverse wireless scenarios. The survey also covers BD-RIS's structural diversity, including different scattering matrix types, transmission modes, intercell architectures, and circuit topologies, showing their flexibility in improving network performance. We delve into the potential applications of BD-RIS, such as enhancing wireless coverage, improving physical layer security (PLS), enabling multi-cell interference cancellation, enabling precise sensing and localization, and optimizing channel manipulation. Further, we explore BD-RIS architectural development, providing insights into new configurations from the channel estimation, optimization, performance analysis, and circuit complexity perspectives. Additionally, we investigate the integration of BD-RIS with emerging wireless technologies, such as millimeter-wave and terahertz communications, integrated sensing and communications, mobile edge computing, and other cutting-edge technologies. These integrations are pivotal in advancing the capabilities and efficiency of future wireless networks. Finally, the survey identifies key challenges, including channel state information estimation, interference modeling, and phase-shift designs, and outlines future research directions. The survey aims to provide valuable insights into BD-RIS's potential in shaping the future of wireless communications systems.
... The rapid evolution of 6G networks aims to deliver seamless connectivity, ultra-reliable low-latency communication (URLLC), and enhanced mobile broadband (eMBB), and to support massive machine-type communications (mMTC) [1,2]. A crucial aspect of 6G is the integration of low-power Internet of Things (IoT) devices, which will be widely used in applications such as smart cities and industrial automation [3][4][5]. However, these devices face significant challenges due to limited energy resources. ...
Article
Full-text available
Wireless powered communication networks (WPCNs) provide a sustainable solution for energy-constrained IoT devices by enabling wireless energy transfer (WET) in the downlink and wireless information transmission (WIT) in the uplink. However, their performance is often limited by interference in uplink communication and inefficient resource allocation. To address these challenges, we propose an RSMA-aided WPCN framework, which optimizes rate-splitting factors, power allocation, and time division to enhance spectral efficiency and user fairness. To solve this non-convex joint optimization problem, we employ the simultaneous perturbation stochastic approximation (SPSA) algorithm, a gradient-free method that efficiently estimates optimal parameters with minimal function evaluations. Compared to conventional optimization techniques, SPSA provides a scalable and computationally efficient approach for real-time resource allocation in RSMA-aided WPCNs. Our simulation results demonstrate that the proposed RSMA-aided framework improves sum throughput by 12.5% and enhances fairness by 15–20% compared to conventional multiple-access schemes. These findings establish RSMA as a key enabler for next-generation WPCNs, offering a scalable, interference-resilient, and energy-efficient solution for future wireless networks.
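The SPSA algorithm used above estimates a full gradient from just two noisy objective evaluations per iteration, which is what makes it attractive for the non-convex joint optimization described. A minimal Python sketch of the core update is given below; the objective, gain constants, and iteration count are generic illustrations, not the paper's settings.

import numpy as np

def spsa_minimize(f, theta0, iters=200, a=0.1, c=0.1, alpha=0.602, gamma=0.101, seed=0):
    """Minimize f(theta) with simultaneous perturbation stochastic approximation."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(1, iters + 1):
        ak = a / k ** alpha                                   # decaying step size
        ck = c / k ** gamma                                   # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=theta.shape)     # Rademacher perturbation direction
        g_hat = (f(theta + ck * delta) - f(theta - ck * delta)) / (2.0 * ck * delta)
        theta -= ak * g_hat                                   # gradient-descent-style update
    return theta

# Toy usage: recover the minimizer of a quadratic from noisy evaluations only.
f = lambda x: np.sum((x - 3.0) ** 2) + 0.01 * np.random.randn()
print(spsa_minimize(f, np.zeros(4)))                          # approaches [3, 3, 3, 3]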
... These emerging networks aim to provide multidimensional, intelligent, and green communication features, fostering ubiquitous connectivity among a vast array of devices [3]. However, the increasing computational demands of real-time applications like autonomous driving, Metaverse services [4], [5], and telemedicine, coupled with the limited computational capabilities and battery capacities of local devices, present significant challenges. ...
Preprint
Full-text available
Mobile edge computing (MEC)-enabled air-ground networks are a key component of 6G, employing aerial base stations (ABSs) such as unmanned aerial vehicles (UAVs) and high-altitude platform stations (HAPS) to provide dynamic services to ground IoT devices (IoTDs). These IoTDs support real-time applications (e.g., multimedia and Metaverse services) that demand high computational resources and strict quality of service (QoS) guarantees in terms of latency and task queue management. Given their limited energy and processing capabilities, IoTDs rely on UAVs and HAPS to offload tasks for distributed processing, forming a multi-tier MEC system. This paper tackles the overall energy minimization problem in MEC-enabled air-ground integrated networks (MAGIN) by jointly optimizing UAV trajectories, computing resource allocation, and queue-aware task offloading decisions. The optimization is challenging due to the nonconvex, nonlinear nature of this hierarchical system, which renders traditional methods ineffective. We reformulate the problem as a multi-agent Markov decision process (MDP) with continuous action spaces and heterogeneous agents, and propose a novel variant of multi-agent proximal policy optimization with a Beta distribution (MAPPO-BD) to solve it. Extensive simulations show that MAPPO-BD outperforms baseline schemes, achieving superior energy savings and efficient resource management in MAGIN while meeting queue delay and edge computing constraints.
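On the Beta-distribution policy mentioned above (MAPPO-BD): a common motivation for Beta policies in continuous-control PPO variants is that their bounded support maps naturally onto bounded physical actions (e.g., UAV velocity limits), avoiding the clipping of Gaussian samples. A tiny illustrative Python sketch, with made-up shape parameters and action bounds standing in for the actor network's outputs:

import numpy as np

rng = np.random.default_rng(0)

def beta_policy_action(alpha, beta, low, high):
    """Sample a bounded continuous action from a Beta(alpha, beta) policy head."""
    u = rng.beta(alpha, beta)            # support (0, 1), so no clipping is needed
    return low + (high - low) * u        # affine map to the physical action range

# e.g. a UAV velocity component constrained to [-20, 20] m/s
alpha, beta = 2.3, 1.7                   # in MAPPO-BD these would come from the actor network
print(beta_policy_action(alpha, beta, -20.0, 20.0))

In PPO-style training, the (alpha, beta) pair would be produced per action dimension by the actor and the Beta log-probability would enter the clipped surrogate loss.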
... Several comprehensive overviews of 6G have recently presented its specifications, requirements, and enabling technologies [4]. 6G is expected to explore novel frequency bands to accommodate the escalating demand for rapid, dependable, and low-latency communication. ...
Conference Paper
This article presents a compact ring-shaped dual-band monopole antenna with wide bandwidth, operating at the 18.2 and 27 GHz frequencies. The design integrates radiating element and ground plane modifications, enabling dual-band operation throughout the millimeter-wave spectrum. The proposed microstrip patch antenna is designed on a Rogers RT Duroid 5880 substrate, with a dielectric constant of 2.2 and a thickness of 0.787 mm. The proposed antenna demonstrates enhanced performance in reflection coefficient (< -10 dB), wide bandwidth, good impedance matching, high return loss reaching -31.42 dB, radiation patterns, and constant gain. The lower band resonates at 18.2 GHz within the frequency range of 14.74 - 21.77 GHz, achieving a peak gain of 2.3 dBi. The resonance of the upper band occurs at 27 GHz within the frequency range of 25.03 - 28.87 GHz, producing a peak gain of 4.6 dBi. These characteristics highlight the antenna's suitability for 5G/6G millimeter-wave cellular communications and lower 6G band applications.
... The next generation of mobile networks is expected to advance ubiquitous wireless connectivity, which is crucial for a fully connected world [1]. The sixth generation (6G) of mobile networks will enable the Internet of everything (IoE), extending connections beyond machine-to-machine (M2M) communication and allowing seamless and reliable communication among devices, individuals, and organizations [2]. ...
Preprint
Full-text available
Sixth-generation (6G) mobile networks are expected to revolutionize wireless connectivity, paving the way for the Internet of Everything (IoE) and enabling seamless communication between devices, people, and organizations. Advanced techniques, such as non-orthogonal multiple access (NOMA), massive multiple-input multiple-output (MIMO), and millimeter wave (mmWave), are proposed to deliver high data rates and increase the capacity and spectral efficiency of ultra-dense device networks. However, in scenarios like enhanced remote area communications (eRAC), the use of sub-gigahertz frequencies and TV white spaces (TVWS) is essential to provide Internet services in rural and remote areas, making these advanced techniques impractical. In this context, this paper proposes a new multiple access approach based on the faster-than-Nyquist generalized frequency division multiplexing (FTN-GFDM) waveform to improve user throughput and network capacity in remote areas. Resources in the FTN-GFDM time-frequency grid are allocated with the goal of avoiding inter-user interference (IUI) caused by signal overlap in the time or frequency domains, enabling equalization and detection of user signals without loss of bit error rate (BER) performance. This paper explores resource allocation strategies and uplink and downlink scenarios for faster-than-Nyquist generalized frequency division multiple access (FTN-GFDMA) schemes, evaluating the effects of interference on BER performance and detection complexity.
... However, due to the non-orthogonality of the signals, the receiving end needs to use more complex signal processing techniques to distinguish and recover the information of each user [43]. In heterogeneous semantic and bit communication systems, because of limited computational resources and prior knowledge, it is impossible to restore semantic information for the bit user [44]. Therefore, the bit user treats the received superimposed signal as interference and decodes the bit signal directly. ...
Article
Full-text available
Considering the coexistence of semantic and bit transmissions in future networks, transmission policy design is crucial for heterogeneous semantic and bit communications to improve transmission efficiency. In this paper, we investigate downlink semantic and bit data transmissions from the access point (AP) to several semantic users and a bit user, and we propose a TDMA–NOMA-based transmission scheme to efficiently utilize wireless communication resources. The transmission time and power resource allocation problem is formulated with the aim of maximizing the throughput of the bit user while guaranteeing the semantic demands of the semantic users are met. Due to the time-varying channel conditions and mixed continuous–discrete variables, we propose a parameterized deep Q network (P-DQN)-based algorithm to solve the problem, where an actor network is employed to output continuous parameters, and a deep Q network is used to find the optimal discrete actions. The simulation results show that the proposed P-DQN-based algorithm converges faster than other learning methods. The simulations also demonstrate that the proposed TDMA–NOMA-based transmission scheme can improve the average bit throughput by up to 20% while meeting the semantic demands compared to other multiple access schemes.
... The CityPulse project [10] stands out as a significant endeavor in this field, designed to establish a comprehensive data analytics infrastructure for smart cities. This infrastructure facilitates the creation of smart city services through a distributed architecture on 5G and 6G networks that allows for semantic discovery, data analytics, and the interpretation of extensive (near-)real-time data from the Internet of Things and social media streams [11]. The integration of ITS within this framework significantly enhances traffic management and urban planning by utilizing data-driven approaches to optimize traffic flow and reduce congestion. ...
Article
Full-text available
The efficient management and prediction of urban traffic flow are paramount in the age of beyond 5G smart cities and advanced transportation systems. Traditional methods often fail to handle the nonlinear and dynamic nature of traffic data, necessitating more advanced solutions. This paper introduces NeuroSync, a novel neural network architecture designed to leverage the strengths of spiking neuron layers and gated recurrent units (GRUs) combined with temporal pattern attention mechanisms to effectively forecast traffic patterns. The architecture is specifically tailored to address the complexities inherent in nonstationary urban traffic datasets, capturing both spatial and temporal relationships within the data. NeuroSync not only outperforms traditional forecasting models such as ARIMA and exponential smoothing but also shows significant improvement over contemporary neural network approaches like LSTM, CNN, Seq2Seq, RNN, GRU, Transformer, and Autoencoder in terms of mean squared error (MSE) and mean absolute error (MAE). The model's efficacy is demonstrated through extensive experiments with real‐world traffic data, underscoring its potential to enhance urban mobility management and support the infrastructure of intelligent transportation systems.
... New service classes, such as ultra-HD video and immersive virtual reality, are driving the demand for even higher spectral efficiency and the exploration of extensive frequency bands [1]. The carrier frequencies below 6 GHz are almost fully occupied by current applications, such as AM, FM, RADAR, 3G, 4G, and so on, and the spectrum for wireless communication is increasingly scarce [2]. Ultra-wideband technology (UWB) [3], Cognitive Radio (CR), and other solutions have been proposed by various researchers, but these are suitable only for short range and can pose security threats [4]. In the realm of wired communication, the endeavor to deliver high-speed Internet involves substituting the existing copper cables within infrastructure with high-speed optical fiber cables. ...
Article
Full-text available
Millimeter-wave and terahertz (THz) frequency bands are being explored in wired and wireless communication systems due to ongoing demands for increased data rates. Recently, it has been proposed that twisted-pair cables, already part of existing infrastructure, could be utilized for terabit-per-second data transmission by exploiting wireless THz radiation between the copper wires. THz radiation can be wirelessly transferred through the dielectric gap and the air gap in between the copper wires. The air gap and the dielectric material between the copper wires in a CAT6 (Category 6) cable can be considered a circular hollow-core waveguide, providing a suitable medium for the propagation of THz waves. Therefore, this work aims to estimate the data rate per distance by experimentally analyzing the copper circular waveguide. Furthermore, the impact of a waveguide radius is also examined. Waveguide propagation characteristics were experimentally analyzed using THz time-domain spectroscopy as well as on a simulation basis using CST (Computer Simulation Technology) Microwave Studio 2022. It was found that the radius of the waveguide has a significant effect on the transmission characteristics of the waveguide and the channel capacity for a longer range. The proposed waveguides achieved a maximum data rate in Tbps (terabits per second) for a few meters depending upon the diameter of the waveguide. This study investigates the propagation of THz waves through narrower spaces and explores the initial steps toward realizing the concept of a TDSL (Terabit Digital Subscriber Line) and future high-frequency communication systems.
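The hollow circular-waveguide view of the cable described above has a simple lowest-mode cutoff: for the dominant TE11 mode of an air-filled guide, f_c = p'_11 c / (2 pi a), with p'_11 ≈ 1.8412 and radius a (a dielectric filling lowers this by its refractive index). The Python sketch below, with radii chosen purely for illustration, shows how shrinking the gap pushes the usable band towards THz frequencies.

import numpy as np

C0 = 299_792_458.0          # speed of light in vacuum, m/s
P11_PRIME = 1.8412          # first root of J1'(x), dominant TE11 mode

def te11_cutoff_hz(radius_m, n_medium=1.0):
    """Cutoff frequency of the TE11 mode of a circular waveguide of given radius."""
    return P11_PRIME * C0 / (2.0 * np.pi * radius_m * n_medium)

for r_mm in (1.0, 0.5, 0.25, 0.1):        # illustrative radii, not measured cable geometry
    fc = te11_cutoff_hz(r_mm * 1e-3)
    print(f"radius {r_mm:4.2f} mm -> TE11 cutoff ~ {fc / 1e9:7.1f} GHz")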
... In the advent of the 6G era, visible light communication (VLC) emerges as a pivotal technology, promising high-speed, efficient, and interference-resistant wireless communication [1][2][3]. Its applicability spans across the Internet of Things [4], smart industries [5], and indoor navigation [6], marking its significance in the future of communication technologies. ...
Article
Full-text available
Given the burgeoning necessity for high-speed, efficient, and secure wireless communication in 6G, visible light communication (VLC) has emerged as a fervent subject of discourse within academic and industrial circles alike. Among these considerations, it is imperative to construct scalable multi-user VLC systems, meticulously addressing pivotal issues such as power dissipation, alignment errors, and the safeguarding of user privacy. However, traditional methods like multiplexing holography (MPH) and multiple focal (MF) phase plates have shown limitations in meeting these diverse requirements. Here, we propose a novel spatial multiplexing holography (SMH) theory, a comprehensive solution that overcomes existing hurdles by enabling precise power allocation, self-designed power coverage, and secure communication through orbital angular momentum (OAM). The transformative potential of SMH is demonstrated through simulations and experimental studies, showcasing its effectiveness in power distribution within systems of VR glasses users, computer users, and smartphone users; enhancing power coverage with an 11.6 dB improvement at coverage edges; and securing data transmission, evidenced by error-free 1080P video playback under correct OAM keys. Our findings illustrate the superior performance of SMH in facilitating seamless multi-user communication, thereby establishing a new benchmark for future VLC systems in the 6G landscape.
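The OAM-based keying mentioned above relies on spiral phase profiles of the form exp(i l phi), where the integer topological charge l indexes mutually orthogonal modes. A minimal Python sketch (grid size and charges are arbitrary illustrations, not the SMH design) that builds two such masks and checks their near-orthogonality:

import numpy as np

def oam_phase_mask(n_pixels, l):
    """Spiral phase mask exp(i*l*phi) of topological charge l on an n x n grid."""
    x = np.linspace(-1.0, 1.0, n_pixels)
    xx, yy = np.meshgrid(x, x)
    phi = np.arctan2(yy, xx)                # azimuthal angle of each pixel
    return np.exp(1j * l * phi)

m1, m2 = oam_phase_mask(256, 1), oam_phase_mask(256, 3)
overlap = np.abs(np.vdot(m1, m2)) / m1.size    # normalized inner product of the two modes
print(f"|<l=1, l=3>| ~ {overlap:.3e}  (near zero -> modes are separable)")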
... FDMA finds numerous applications in wireless networks [40], [41], and it is anticipated that 6G networks will leverage FDMA technology [42]. Additionally, both Wi-Fi 6 and Wi-Fi 7 will depend on channel bonding [43], [44]. ...
Preprint
Full-text available
The optimal solution to an optimization problem depends on the problem's objective function, constraints, and size. While deep neural networks (DNNs) have proven effective in solving optimization problems, changes in the problem's size, objectives, or constraints often require adjustments to the DNN architecture to maintain effectiveness, or even retraining a new DNN from scratch. Given the dynamic nature of wireless networks, which involve multiple and diverse objectives that can have conflicting requirements and constraints, we propose a multi-task learning (MTL) framework to enable a single DNN to jointly solve a range of diverse optimization problems. In this framework, optimization problems with varying dimensionality values, objectives, and constraints are treated as distinct tasks. To jointly address these tasks, we propose a conditional computation-based MTL approach with routing. The multi-task DNN consists of two components, the base DNN (bDNN), which is the single DNN used to extract the solutions for all considered optimization problems, and the routing DNN (rDNN), which manages which nodes and layers of the bDNN to be used during the forward propagation of each task. The output of the rDNN is a binary vector which is multiplied with all bDNN's weights during the forward propagation, creating a unique computational path through the bDNN for each task. This setup allows the tasks to either share parameters or use independent ones, with the decision controlled by the rDNN. The proposed framework supports both supervised and unsupervised learning scenarios. Numerical results demonstrate the efficiency of the proposed MTL approach in solving diverse optimization problems. In contrast, benchmark DNNs lacking the rDNN mechanism were unable to achieve similar levels of performance, highlighting the effectiveness of the proposed architecture.
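The routing idea described above, an rDNN emitting a binary vector that gates which bDNN weights take part in each task's forward pass, can be illustrated in a few lines of NumPy. The layer sizes are arbitrary, and randomly drawn masks stand in for the rDNN output; none of this reflects the authors' actual architecture.

import numpy as np

rng = np.random.default_rng(1)
W1, W2 = rng.standard_normal((8, 16)), rng.standard_normal((16, 4))       # shared bDNN weights

def bdnn_forward(x, mask1, mask2):
    """Forward pass through the shared bDNN with per-task binary weight masks."""
    h = np.maximum(0.0, x @ (W1 * mask1))        # ReLU hidden layer with gated weights
    return h @ (W2 * mask2)                      # gated output layer

x = rng.standard_normal((5, 8))                  # a batch of 5 inputs
mask_task_a = (rng.random(W1.shape) < 0.7, rng.random(W2.shape) < 0.7)    # placeholder route for task A
mask_task_b = (rng.random(W1.shape) < 0.7, rng.random(W2.shape) < 0.7)    # placeholder route for task B

y_a = bdnn_forward(x, *mask_task_a)              # same bDNN, two different computational paths
y_b = bdnn_forward(x, *mask_task_b)
print(y_a.shape, y_b.shape)                      # (5, 4) (5, 4)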
... Artificial Intelligence (AI) and Machine Learning (ML) will be core components in 6G network system architecture design [20]. 6G networks are envisaged to manage themselves with very little reliance on human intervention, in stark contrast to previous generations, which relied on a lot of manual network management [21]. It will be feasible to conduct maintenance before failures occur, reduce idle resources, and allocate resources effectively and efficiently according to network dynamics to increase network performance. ...
Article
Full-text available
The rollout of sixth-generation (6G) wireless networks is likely to transform communication systems in terms of speed, latency, and connectivity. This paper investigates adaptive beamforming schemes for communications in the Terahertz range. Additionally, it investigates the possibility of resource management that minimizes energy consumption. The following algorithms are proposed in this work: an Adaptive Meta-Surface Assisted Beamforming (AMAB) algorithm and an Energy Adaptive Resource Allocation with Predictive Optimization (EARAPO) algorithm. The problem of communication in the THz range is mitigated by the AMAB approach, which varies beam angles and uses Reconfigurable Intelligent Surface (RIS) technology to improve signal delivery. This overcomes the drawback of the large attenuation and short range of THz frequencies, which are a hindrance to the effective high-rate data transmission needed in 6G applications. The EARAPO algorithm applies Machine Learning approaches to resource allocation and management of network demand by forecasting traffic trends. Such a predictive strategy supports enhanced resource utilization through the elasticity of network technology, which leads to a reduction of energy cost with zero impact on the quality of service (QoS). The enhanced AMAB algorithm with RIS consistently improves SINR across various numbers of multi-user equipment (UE), even in network-intensive scenarios. Similarly, the power consumption efficiency of EARAPO was also superior, while adaptation to power-hungry variants of both algorithms resulted in power consumption being moderated until later in the escalation of network requirements. From a future-oriented view, these algorithms are quite effective in enhancing the quality of signals, making better use of resources, and reducing energy usage in upcoming wireless networks. All these studies open up a perspective for the future of sustainable and high-performance wireless technologies.
... It is envisioned that 6th Generation (6G) will enable ubiquitous connectivity for a massive number of devices and provide low-latency and high-reliability communications services, such as the tactile Internet, remote surgery, and autonomous driving [1], [2]. These computation-intensive and latency-sensitive applications challenge both the computational capabilities of User Equipments (UEs) and the processing efficiency of Mobile Cloud Computing (MCC). ...
Preprint
Full-text available
In this paper, a Rate-Splitting Multiple Access (RSMA) scheme is proposed to assist a Mobile Edge Computing (MEC) system where local computation tasks from two users are offloaded to the MEC server, facilitated by uplink RSMA for processing. The efficiency of the MEC service is hence primarily influenced by the RSMA-aided task offloading phase and the subsequent task computation phase, where reliable and low-latency communication is required. For this practical consideration, short-packet communication in the Finite Blocklength (FBL) regime is introduced. In this context, we propose a novel uplink RSMA-aided MEC framework and derive the overall Successful Computation Probability (SCP) with FBL consideration. To maximize the SCP of our proposed RSMA-aided MEC, we strategically optimize: (1) the task offloading factor, which determines the number of tasks to be offloaded and processed by the MEC server; (2) the transmit power allocation between different RSMA streams; and (3) the task-splitting factor, which decides how many tasks are allocated to splitting streams, while adhering to FBL constraints. To address the strong coupling between these variables in the SCP expression, we apply the Alternating Optimization method, which formulates tractable subproblems to optimize each variable iteratively. The resultant non-convex subproblems are then tackled by Successive Convex Approximation. Numerical results demonstrate that applying uplink RSMA in the MEC system with FBL constraints can not only improve the SCP performance but also provide lower latency in comparison to conventional transmission schemes such as Non-orthogonal Multiple Access (NOMA).
... Later, this was upgraded to 256-1,024 antennas for 5G. 6G is projected to use over 10,000 antenna elements in Massive MIMO, exploiting spatial multiplexing with narrow beams, resulting in increased spectrum efficiency and lower propagation loss for high-frequency communications [8]. ...
Article
Full-text available
5G will only be able to meet some of the demands of the coming technological advances in 2030 and beyond. Compared to 5G networks, sixth-generation (6G) networks are expected to introduce novel use cases and performance metrics, such as global coverage, cost efficiency, increased radio spectrum, energy intelligence, and safety. The growing global demand for ultra-high spectral efficiencies, data rates, speeds, and bandwidths in next-generation wireless networks motivates researchers to investigate the peak capabilities of massive MIMO (multiple-input multiple-output) and the new filter bank multi-carrier (FBMC) technique. Lower out-of-band emissions are observed in the FBMC technique compared to orthogonal frequency division multiplexing (OFDM), an essential requirement of upcoming next-generation wireless systems. This paper compares the spectral efficiency for the Massive MIMO uplink in a single-cell scenario using linear detectors at the BS with perfect CSI. Arbitrarily large SNR values are obtained with a higher number of BS antennas in Massive MIMO, which helps to increase the data rate. This paper also demonstrates how linear detectors can help to reduce the symbol error rate (SER) in Massive MIMO, and that, for the same number of BS antennas and users, the ZF detector outperforms the MRC detector.
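For readers unfamiliar with the two linear detectors compared above, a compact NumPy sketch of uplink detection with MRC (W = H^H) versus ZF (W = (H^H H)^{-1} H^H) over an i.i.d. Rayleigh channel follows; the antenna count, user count, and SNR are illustrative, not the paper's simulation settings.

import numpy as np

rng = np.random.default_rng(0)
M, K, snr_db = 64, 8, 0.0                       # BS antennas, single-antenna users, per-user SNR
noise_var = 10 ** (-snr_db / 10.0)

H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
s = (2 * rng.integers(0, 2, K) - 1).astype(complex)            # BPSK symbols, one per user
n = np.sqrt(noise_var / 2) * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
y = H @ s + n                                                  # uplink received signal

s_mrc = H.conj().T @ y                                         # MRC: maximizes array gain, ignores interference
s_zf = np.linalg.solve(H.conj().T @ H, H.conj().T @ y)         # ZF: nulls inter-user interference
print("MRC decisions:", np.sign(s_mrc.real).astype(int))
print("ZF  decisions:", np.sign(s_zf.real).astype(int))
print("transmitted  :", s.real.astype(int))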
... Unified Data Collection Layer: The unified data collection layer serves as the foundation of our architecture. According to recent research in distributed learning for 6G-IoT networks, a centralized data collection infrastructure is crucial for managing the massive amount of data generated by 6G networks, which, as noted in [4], target a 100x improvement over 5G networks. Cross-application communication protocols support the convergence of various vertical industries, including healthcare, transportation, and industrial automation, as identified in the 6G vision. According to research on intelligent network systems, automated deployment workflows are essential for managing the complexity of modern network environments, particularly in scenarios involving multiple network slices and heterogeneous services [5]. ...
Article
Full-text available
This article presents a comprehensive network architecture for managing reusable AI-based applications in 6G networks, addressing the critical challenge of AI silos in current implementations. It introduces a unified approach to data collection, feature extraction, model management, and application integration across network domains. By implementing standardized workflows and shared resources, the architecture enables efficient end-to-end management while promoting reusability and scalability. The solution incorporates a unified data collection layer, shared feature repository, model management framework, and application integration layer, all designed to support the demanding requirements of next-generation networks. Through multiple use cases including RAN optimization, network security, and service quality management, the article demonstrates the architecture's effectiveness in real-world scenarios. The results show significant improvements in development efficiency, resource utilization, scalability, and maintenance operations. It contributes to the evolution of 6G networks by providing a structured approach to integrating AI capabilities while preventing the formation of isolated solutions.
... Emerging wireless technologies are being developed to meet the increasing demand for ultra-high data rates, massive device connectivity, exceptional reliability, and minimal latency, driving the evolution of sixth-generation (6G) and beyond communication systems [1][2][3][4]. Among the various candidates, rate-splitting multiple access (RSMA) has gained prominence as a promising 6G multiple access solution. ...
Preprint
Full-text available
This paper presents a novel rate-splitting sparse code multiple access (RS-SCMA) framework, where common messages are transmitted using quadrature phase-shift keying (QPSK) modulation, while private messages are sent using SCMA encoding. A key feature of RS-SCMA is its ability to achieve a tunable overloading factor by adjusting the splitting factor. This flexibility enables an optimal trade-off, ensuring the system maintains superior performance across varying levels of overloading factor. We present a detailed transceiver design and analyze the influence of rate-splitting on the overloading factor. Extensive simulation results, both with and without low-density parity-check (LDPC) codes, highlight RS-SCMA's potential as a strong candidate for next-generation multiple access technologies.
Preprint
Fluid antenna system (FAS) is an emerging technology that uses the new form of shape- and position-reconfigurable antennas to empower the physical layer for wireless communications. Prior studies on FAS were however limited to narrowband channels. Motivated by this, this paper addresses the integration of FAS in the fifth generation (5G) orthogonal frequency division multiplexing (OFDM) framework to address the challenges posed by wideband communications. We propose the framework of the wideband FAS OFDM system that includes a novel port selection matrix. Then we derive the achievable rate expression and design the adaptive modulation and coding (AMC) scheme based on the rate. Extensive link-level simulation results demonstrate striking improvements of FAS in the wideband channels, underscoring the potential of FAS in future wireless communications.
Article
With the development of satellite communication technology, satellite-terrestrial integrated networks (STINs), which integrate satellite networks and ground networks, can realize global seamless coverage of communication services. Confronting the intricacies of network dynamics, resource heterogeneity, and the unpredictability of user mobility, dynamic resource allocation within networks faces formidable challenges. Digital twin (DT), as a new technique, can mirror a physical network in a virtual network to monitor, analyze, and optimize the physical network. Nevertheless, in the process of constructing a DT model, the deployment location and resource allocation of DTs may adversely affect its performance. Therefore, we propose a STIN model, which alleviates the problem of insufficient single-layer deployment flexibility of the traditional edge network by deploying DTs at multi-layer nodes in a STIN. To address the challenge of deploying DTs in the network, we formulate a multi-layer DT deployment problem in the STIN to reduce system delay. Then we adopt a multi-agent reinforcement learning (MARL) scheme to explore the optimal strategy for the DT multi-layer deployment problem. The implemented scheme demonstrates a notable reduction in system delay, as evidenced by simulation outcomes.
Article
Full-text available
In the realm of telecommunications, the emergence of Fifth Generation (5G) technology and the prospective arrival of Sixth Generation (6G) networks represent a paradigm shift of considerable magnitude. These technological advancements have precipitated a profound transformation within the industry, fundamentally altering operational methodologies and technological architectures. A crucial technology in this evolution is Software-Defined Networking (SDN), which enhances programmability, scalability, and flexibility. However, integrating SDN with 5G and 6G networks presents both benefits and challenges, especially when combined with Artificial Intelligence (AI) and Machine Learning (ML). This review provides a comprehensive analysis of the current state of SDN in the context of 5G and 6G networks, highlighting the opportunities and challenges associated with integrating ML and AI. The review also discusses future research directions and potential solutions to the identified challenges. It examines the significant hurdles to SDN deployment in these advanced network environments and the unprecedented opportunities it presents. The paper offers an in-depth analysis of SDN's impact on the future of mobile networks, identifying key issues and potential innovations. With the capability to support faster cellular communications and a vast number of connected devices, 6G will revolutionize the software industry. 6G will enable a seamless edge-to-cloud architecture, which current cloud solutions cannot sustain due to the immense volume of data processing and transfer required. Existing software architectures, development methodologies, and orchestration and offloading systems are not yet equipped to manage such demands.
Article
Recently, the concepts of digital twin (DT), open radio access network (O-RAN), and the adoption of semantic communications (SC) have been labeled as keystone technologies towards the deployment of 6G networks. This article aims to provide a comprehensive vision of how a DT-enabled and SC-based O-RAN architecture can substantially contribute towards the achievement of the ultra-reliable low-latency communications (URLLC) requirements essential for the deployment of innovative 6G-oriented services. A brief overview of each component is first provided. Subsequently, potential use cases and services delivered through such a unified network framework are illustrated. Challenges and future research directions are also highlighted and discussed.
Article
Full-text available
Substantial improvements in ultra-reliable and low-latency communication (URLLC) capabilities, as well as the possibility of meeting the rising demand for high-capacity and high-speed connectivity, are expected to be achieved with the deployment of next-generation 6G wireless communication networks. This is thanks to the adoption of key technologies such as unmanned aerial vehicles (UAVs), reflective intelligent surfaces (RIS), and mobile edge computing (MEC), which hold the potential to enhance coverage, signal quality, and computational efficiency. However, the integration of these technologies presents new optimization challenges, particularly for ensuring network reliability and maintaining stringent latency requirements. The Digital Twin (DT) paradigm, coupled with artificial intelligence (AI) and deep reinforcement learning (DRL), is emerging as a promising solution, enabling real-time optimization by digitally replicating network devices to support informed decision-making. This paper reviews recent advances in DT-enabled URLLC frameworks, highlights critical challenges, and suggests future research directions for realizing the full potential of 6G networks in supporting next-generation services under URLLC requirements.
Article
In traditional mobile networks, trust between subscribers and their serving networks relies on a hardware root-of-trust: the Subscriber Identity Module (SIM). Conversely, trust between service and home networks is established via Trusted Third Parties (TTPs), known as Clearing Houses (CHs). The 6G environment will witness a substantial increase in subscriber numbers, driven by the mass deployment of the Internet of Everything (IoE) and improvements in network performance. Simultaneously, the performance capabilities required of TTPs to manage trustworthy operator-to-operator (O2O) interactions in 6G must align with the demands of the 6G ecosystem. This work focuses on enhancing CH intermediation capabilities to support O2O trustworthy interactions within the 6G context. Given the close connection between performance and trustworthiness, this paper explores these aspects by modeling interactions between communication parties using a Petri Net model. This model is applied to analyze the quantitative relationships among the non-functional requirements of future 6G communication scenarios, considering both traditional and blockchain-based approaches.
Article
Amid the global rollout of fifth-generation (5G) services, researchers in academia, industry, and national laboratories have been developing proposals for the sixth-generation (6G), whose materialization is fraught with many fundamental challenges. To alleviate these challenges, a deep learning (DL)-enabled semantic communication (SemCom) has emerged as a promising 6G technology enabler, which embodies a paradigm shift that can change the status quo viewpoint that wireless connectivity is an opaque data pipe carrying messages whose context-dependent meanings have been ignored. Since 6G is also critical for the materialization of major SemCom use cases, the paradigms of 6G for SemCom and SemCom for 6G call for a tighter integration of 6G and SemCom. For this purpose, this comprehensive article provides the fundamentals of semantic information, semantic representation, and semantic entropy; details the state-of-the-art SemCom research landscape; presents the major SemCom trends and use cases; discusses current SemCom theories; exposes the fundamental and major challenges of SemCom; and offers future research directions for SemCom. We hope this article stimulates many lines of research on SemCom theories, algorithms, and implementation.
Article
Full-text available
Compared to intensity modulation with direct detection, coherent detection offers superior receiver sensitivity, higher spectral efficiency, and better background noise suppression. However, research on coherent detection in underwater wireless optical communication (UWOC) systems is relatively limited. This paper investigates the average bit error rate (BER) performance of coherent UWOC systems employing different phase-shift keying schemes in the presence of phase errors. The underwater turbulent channel is characterized by the mixture exponential-generalized gamma distribution, taking into account the impact of pointing errors as well. Phase errors are modeled by a Gaussian distribution. Specifically, we first derive the average BER expressions for M -ary phase-shift keying (MPSK) under ideal carrier phase estimation. Then, we derive the exact average BER expressions for binary phase-shift keying and quadrature phase-shift keying, as well as tight approximations for MPSK, considering the influence of phase errors. Additionally, we provide asymptotic BER expressions in the high signal-to-noise ratio region and the BER floor expression. Finally, these expressions are validated through Monte Carlo simulations.
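The phase-error sensitivity analyzed above can be sanity-checked against the well-known conditional BER of coherent BPSK, P_b(theta) = Q(sqrt(2 gamma) cos theta), averaged over a Gaussian phase error. The Monte Carlo sketch below uses arbitrary SNR and phase-jitter values and deliberately omits the turbulence and pointing-error models of the cited paper.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
snr_db, sigma_phase, n_bits = 7.0, 0.2, 2_000_000
gamma = 10 ** (snr_db / 10.0)

bits = rng.integers(0, 2, n_bits)
s = 2 * bits - 1                                           # BPSK mapping: {0,1} -> {-1,+1}
theta = rng.normal(0.0, sigma_phase, n_bits)               # Gaussian carrier-phase error
noise = rng.normal(0.0, np.sqrt(1.0 / (2.0 * gamma)), n_bits)
r = s * np.cos(theta) + noise                              # coherent detection with residual phase error
ber_sim = np.mean((r > 0).astype(int) != bits)

ber_theory = np.mean(norm.sf(np.sqrt(2.0 * gamma) * np.cos(theta)))   # E_theta[Q(sqrt(2*gamma)*cos(theta))]
print(f"simulated BER ~ {ber_sim:.2e}, averaged Q-form ~ {ber_theory:.2e}")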
Preprint
This article presents a novel perspective to model and simulate reconfigurable intelligent surface (RIS)-assisted communication systems. Traditional methods in antenna design often rely on the array method for simulation, whereas communication system modeling tends to idealize antenna behavior. Neither approach sufficiently captures the detailed characteristics of RIS-assisted communication. To address this limitation, we propose a comprehensive simulation framework that jointly models RIS antenna design and the communication process. This framework simulates the entire communication pipeline, encompassing signal generation, modulation, propagation, RIS-based radiation, signal reception, alignment, demodulation, decision, and processing. Using a QPSK-modulated signal for validation, we analyze system performance and investigate the relationship between bit error rate (BER), aperture fill time, array size, and baseband symbol frequency. The results indicate that larger array sizes and higher baseband symbol frequencies exacerbate aperture fill time effects, leading to increased BER. Furthermore, we examine BER variation with respect to signal-to-noise ratio (SNR) and propose an optimal matching-based alignment algorithm, which significantly reduces BER compared to conventional pilot-based alignment methods. This work demonstrates the entire process of RIS communication and reveals the source of bit errors, providing valuable insights into the design and performance optimization of RIS-assisted communication systems.
Chapter
The continuous evolution of 5G and the transition towards 6G are changing the dynamics of agile management principles and opening new horizons of development for enterprises. This chapter examines how these advanced communication technologies increase flexibility to facilitate quick shifts in line with market trends and improve efficiency and innovation. We demonstrate how 5G/6G increase connectivity, reduce latency, and enhance bandwidth, enabling IoT integration, AI adoption, and real-time analytics on an adaptable platform. Through the discussion of various cases and industry examples, the reader is offered practical guidelines on the use of these technologies for gaining a competitive edge.
Article
Full-text available
Massive MIMO (multiple-input multiple-output) is no longer a “wild” or “promising” concept for future cellular networks—in 2018 it became a reality. Base stations (BSs) with 64 fully digital transceiver chains were commercially deployed in several countries, the key ingredients of Massive MIMO have made it into the 5G standard, the signal processing methods required to achieve unprecedented spectral efficiency have been developed, and the limitation due to pilot contamination has been resolved. Even the development of fully digital Massive MIMO arrays for mmWave frequencies—once viewed as prohibitively complicated and costly—is well underway. In a few years, Massive MIMO with fully digital transceivers will be a mainstream feature at both sub-6 GHz and mmWave frequencies. In this paper, we explain how the first chapter of the Massive MIMO research saga has come to an end, while the story has just begun. The coming wide-scale deployment of BSs with massive antenna arrays opens the door to a brand new world where spatial processing capabilities are omnipresent. In addition to mobile broadband services, the antennas can be used for other communication applications, such as low-power machine-type or ultra-reliable communications, as well as non-communication applications such as radar, sensing and positioning. We outline five new Massive MIMO related research directions: Extremely large aperture arrays, Holographic Massive MIMO, Six-dimensional positioning, Large-scale MIMO radar, and Intelligent Massive MIMO.
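A small numerical sketch (not taken from the paper) of two effects that underpin the spatial processing described above: with M antennas and i.i.d. Rayleigh fading, maximum-ratio combining yields an average SNR gain of roughly M, and the gain fluctuates less as M grows (channel hardening).

```python
# Illustrative array gain and channel hardening with maximum-ratio combining.
import numpy as np

rng = np.random.default_rng(2)
for M in (1, 16, 64, 256):
    # 10,000 independent Rayleigh channel draws with M antennas each
    h = (rng.normal(size=(10_000, M)) + 1j * rng.normal(size=(10_000, M))) / np.sqrt(2)
    mrc_gain = np.sum(np.abs(h) ** 2, axis=1)     # ||h||^2 per draw
    print(f"M={M:4d}  mean gain={mrc_gain.mean():7.1f}  "
          f"std/mean={mrc_gain.std() / mrc_gain.mean():.3f}")
```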
Article
Full-text available
Faster, ultra-reliable, low-power and secure communications have always been high on the wireless evolutionary agenda. However, the appetite for faster, more reliable, greener and more secure communications continues to grow. The state-of-the-art methods conceived for achieving the performance targets of the associated processes may be accompanied by an increase in computational complexity. Alternatively, a degraded performance may have to be accepted due to the lack of jointly optimized system components. In this survey we investigate the employment of quantum computing for solving problems in wireless communication systems. By exploiting the inherent parallelism of quantum computing, quantum algorithms may be invoked for approaching the optimal performance of classical wireless processes, despite their reduced number of cost-function evaluations. In this contribution we discuss the basics of quantum computing using linear algebra, before presenting the operation of the major quantum algorithms that have been proposed in the literature for improving wireless communications systems. Furthermore, we investigate a number of optimization problems encountered both in the physical and network layers of wireless communications, while comparing their classical and quantum-assisted solutions. Finally, we state a number of open problems in wireless communications that may benefit from quantum computing.
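A back-of-the-envelope illustration of the reduced number of cost-function evaluations mentioned above (not a result from the survey): exhaustive maximum-likelihood detection evaluates all N candidate symbol vectors, whereas a Grover-type quantum search needs on the order of (π/4)·√N oracle queries; the user/modulation combinations below are assumed examples.

```python
# Classical exhaustive search vs. Grover-type query count for ML detection.
import math

for n_users, bits_per_symbol in [(4, 2), (8, 2), (8, 4)]:
    N = 2 ** (n_users * bits_per_symbol)        # size of the search space
    grover = math.ceil(math.pi / 4 * math.sqrt(N))
    print(f"search space {N:>12,}  classical evals {N:>12,}  Grover ~{grover:>8,}")
```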
Article
Full-text available
While celebrating the 21st year since the very first IEEE 802.11 “legacy” 2 Mbit/s wireless Local Area Network standard, the latest Wi-Fi newborn is today reaching the finish line, topping the remarkable speed of 10 Gbit/s. IEEE 802.11ax was launched in May 2014 with the goal of enhancing throughput-per-area in high-density scenarios. The first 802.11ax draft versions, namely D1.0 and D2.0, were released at the end of 2016 and 2017. Focusing on the more mature version D3.0, in this tutorial paper we walk the reader through the major 802.11ax breakthroughs, including a brand new OFDMA-based random access approach as well as novel spatial frequency reuse techniques. In addition, this tutorial highlights selected significant improvements (including PHY enhancements, MU-MIMO extensions, power saving advances, and so on) which make this standard a very significant step forward with respect to its predecessor, 802.11ac.
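A simplified sketch of the contention behind OFDMA-based random access (not the exact 802.11ax UORA procedure with OFDMA back-off counters): each contending station uniformly picks one of R random-access resource units, and an RU carries a successful transmission only if exactly one station selected it; R and the station counts are assumed values.

```python
# Simplified multi-RU random access: count RUs chosen by exactly one station.
import numpy as np

rng = np.random.default_rng(3)
R = 8                                    # assumed random-access RUs per trigger frame
for K in (4, 8, 16, 32):                 # contending stations
    trials, success = 20_000, 0
    for _ in range(trials):
        picks = rng.integers(0, R, K)                # each station picks one RU
        counts = np.bincount(picks, minlength=R)
        success += np.count_nonzero(counts == 1)     # RUs with a single sender
    print(f"K={K:2d}  successful RUs per trigger frame: {success / trials:.2f}")
```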
Article
Full-text available
Network slicing has been identified as the backbone of the rapidly evolving 5G technology. However, as its consolidation and standardization progress, there is no literature that comprehensively discusses its key principles, enablers and research challenges. This paper elaborates network slicing from an end-to-end perspective, detailing its historical heritage, principal concepts, enabling technologies and solutions, as well as the current standardization efforts. In particular, it overviews the diverse use cases and network requirements of network slicing, the pre-slicing era (considering RAN sharing), and end-to-end orchestration and management encompassing the radio access, transport, and core networks. This paper also provides details of specific slicing solutions for each part of the 5G system. Finally, this paper identifies a number of open research challenges and provides recommendations towards potential solutions.
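To make the end-to-end perspective concrete, a hypothetical slice-template sketch; the class name, fields, and values are illustrative assumptions, not any 3GPP or NGMN data model: a slice bundles a service-level target with the segments (RAN, transport, core) it must be instantiated across.

```python
# Hypothetical slice catalogue: each template pairs an SLA with the segments it spans.
from dataclasses import dataclass, field

@dataclass
class SliceTemplate:
    name: str
    latency_ms: float          # end-to-end latency target
    throughput_mbps: float     # guaranteed bit rate per user
    reliability: float         # target packet delivery ratio
    segments: list = field(default_factory=lambda: ["ran", "transport", "core"])

CATALOG = [
    SliceTemplate("embb_video", latency_ms=20.0, throughput_mbps=100.0, reliability=0.999),
    SliceTemplate("urllc_factory", latency_ms=1.0, throughput_mbps=5.0, reliability=0.99999),
    SliceTemplate("mmtc_meters", latency_ms=1000.0, throughput_mbps=0.1, reliability=0.99),
]

for s in CATALOG:
    print(f"{s.name:14s} {s.latency_ms:7.1f} ms  {s.throughput_mbps:6.1f} Mb/s  {s.reliability}")
```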
Article
Full-text available
This article discusses the basic system architecture for terahertz (THz) wireless links with bandwidths of more than 50 GHz into optical networks. New design principles and breakthrough technologies are required in order to demonstrate Tb/s data rates at near-zero latency using the proposed system concept. Specifically, we present the concept of designing the baseband signal processing for both the optical and wireless link and using an end-to-end (E2E) error correction approach for the combined link. We provide two possible electro-optical baseband interface architectures, namely transparent optical-link and digital-link architectures, which are currently under investigation. THz wireless link requirements are given as well as the main principles and research directions for the development of a new generation of transceiver frontends, which will be capable of operating at ultra-high spectral efficiency by employing higher-order modulation schemes. Moreover, we discuss the need for developing a novel THz network information theory framework, which will take into account the channel characteristics and the nature of interference in the THz band. Finally, we highlight the role of pencil-beamforming (PBF), which is required in order to overcome the propagation losses, as well as the physical layer and medium access control challenges.
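A rough link-budget sketch of why pencil-beamforming is needed at these frequencies; the 300 GHz carrier, 0 dBm transmit power, and -60 dBm target receive power are assumed numbers, not figures from the article.

```python
# Free-space path loss 20*log10(4*pi*d*f/c) at 300 GHz and the combined
# Tx+Rx antenna gain needed to hit an assumed receive-power target.
import math

c = 3e8
f = 300e9                                    # assumed 300 GHz carrier
for d in (1, 10, 100):                       # link distance in metres
    fspl_db = 20 * math.log10(4 * math.pi * d * f / c)
    required_gain_db = fspl_db - 60          # Ptx = 0 dBm, target Prx = -60 dBm
    print(f"d={d:4d} m  FSPL={fspl_db:6.1f} dB  combined antenna gain needed ~{required_gain_db:5.1f} dB")
```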
Article
To address the question in the subtitle of this article, we start by discussing earlier mobile communication systems, beginning with the first analog wireless cellular standards, then moving on to second generation (2G) [or Global System for Mobile Communications (GSM)], passing third generation (3G) and fourth generation (4G), and proceeding to fifth generation (5G). First, we present each generation's key achievements in terms of user services, each generation's technology-related factors of success (called innovations) as well as its relation to regulation, and each generation's potential deficiencies.
Article
As a promising machine learning tool to handle the accurate pattern recognition from complex raw data, deep learning (DL) is becoming a powerful method to add intelligence to wireless networks with large-scale topology and complex radio conditions. DL uses many neural network layers to achieve a brain-like acute feature extraction from high-dimensional raw data. It can be used to find the network dynamics (such as hotspots, interference distribution, congestion points, traffic bottlenecks, spectrum availability, etc.) based on the analysis of a large amount of network parameters (such as delay, loss rate, link SNR, etc.). Therefore, DL can analyze extremely complex wireless networks with many nodes and dynamic link quality. This article performs a comprehensive survey of the applications of DL algorithms for different network layers, including physical layer modulation/coding, data link layer access control/resource allocation, and routing layer path search and traffic balancing. The use of DL to enhance other network functions, such as network security, sensing data compression, etc., is also discussed. Moreover, the challenging unsolved research issues in this field are discussed in detail, which represent the future research trends of DL-based wireless networks. This article can help the readers to deeply understand the state-of-the-art of the DL-based wireless network designs, and select interesting unsolved issues to pursue in their research.
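A toy sketch of the idea of mapping raw network measurements to a network-state indicator, assuming PyTorch is available; the data are synthetic, and the feature/label construction is an assumption for illustration rather than any scheme from the survey.

```python
# Small MLP mapping per-link measurements (delay, loss rate, SNR) to a
# congestion indicator, trained on synthetic data.
import torch
from torch import nn

torch.manual_seed(0)
x = torch.rand(2048, 3)                      # normalised delay, loss rate, SNR
# Synthetic label: "congested" when a weighted mix of the features is high.
y = ((0.6 * x[:, 0] + 0.4 * x[:, 1] - 0.3 * x[:, 2]) > 0.35).float().unsqueeze(1)

model = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCELoss()

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

acc = ((model(x) > 0.5).float() == y).float().mean().item()
print(f"training accuracy: {acc:.3f}")
```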
Article
Space-air-ground integrated network (SAGIN), as an integration of satellite systems, aerial networks and terrestrial communications, has emerged as a promising architecture and attracted intensive research interest in recent years. Besides bringing significant benefits for various practical services and applications, SAGIN also faces many unprecedented challenges due to its specific characteristics such as heterogeneity, self-organization, and time-variability. Compared to traditional ground or satellite networks, SAGIN is affected by the limited and unbalanced network resources in all three network segments, making it difficult to achieve the best performance for traffic delivery. Therefore, system integration, protocol optimization, and resource management and allocation in SAGIN are of great significance. To the best of our knowledge, we are the first to present the state of the art of SAGIN, since existing survey papers focus either on a single network segment (space or air) or on space-ground integration, neglecting the integration of all three network segments. In light of this, we present in this paper a comprehensive review of recent research works concerning SAGIN, from network design and resource allocation to performance analysis and optimization. After discussing several existing network architectures, we also point out some technology challenges and future directions.
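A hypothetical illustration of the resource-allocation tension across the three segments (not an algorithm from the survey): a greedy rule assigns each flow to the lowest-latency segment that still has spare capacity; the capacity, latency, and demand numbers are invented for the example.

```python
# Greedy per-flow segment selection across space, air and ground segments.
segments = {                      # assumed residual capacity (Mb/s) and latency (ms)
    "satellite": {"capacity": 200.0, "latency": 250.0},
    "aerial":    {"capacity": 100.0, "latency": 30.0},
    "ground":    {"capacity": 500.0, "latency": 10.0},
}
flows = [("video", 50.0), ("telemetry", 1.0), ("backup", 400.0), ("voip", 2.0)]

for name, demand in flows:
    feasible = {k: v for k, v in segments.items() if v["capacity"] >= demand}
    if not feasible:
        print(f"{name:10s} -> blocked (no segment has {demand} Mb/s spare)")
        continue
    best = min(feasible, key=lambda k: feasible[k]["latency"])
    segments[best]["capacity"] -= demand      # consume the chosen segment's capacity
    print(f"{name:10s} -> {best:9s} (latency {segments[best]['latency']} ms)")
```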
Article
In this article, we propose a blockchain verification protocol as a method for enabling and securing spectrum sharing in moving cognitive radio (CR) networks. The spectrum-sharing mechanism is used as a medium-access protocol for accessing wireless bandwidth among competing CRs. We introduce a virtual currency, called Specoins, for payment to access the spectrum. An auction mechanism based on a first-come-first-served queue is used, with the price for the spectrum advertised by each primary user in a decentralized fashion. The blockchain protocol facilitates the transactions between primary and secondary users and is used to validate and save each user’s virtual wallet. Also important for mobile networks, the blockchain serves as a distributed database that is visible to all participating parties, and any node can volunteer to update the blockchain. The volunteer nodes are called miners, and they are rewarded with Specoins. We propose diverse methods to exchange Specoins to make leasing possible even for CRs that are not miners. We show the improvement of the proposed algorithm compared with the conventional Aloha medium-access protocol in terms of spectrum usage. This difference is investigated using small-scale fading variation in the wireless channel to compare the performance of our method with the conventional medium access used in vehicular communications. The blockchain verification protocol is not only secure but also outperforms the conventional system under moderate small-scale fading. Under severe small-scale fading, the blockchain protocol will outperform the conventional system if multipath diversity is not used.
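A minimal sketch of the ledger bookkeeping behind such a scheme, greatly simplified relative to the article's protocol (no mining, auctions, or wallets): spectrum-lease payments in a virtual currency are recorded as blocks chained by SHA-256 hashes, so any participant can verify that the history has not been altered; user names, coin amounts, and the "ch.36" lease label are invented for the example.

```python
# Toy hash-chained ledger of spectrum-lease payments, plus a verification pass.
import hashlib, json, time

def make_block(prev_hash, transactions):
    body = {"time": time.time(), "prev": prev_hash, "txs": transactions}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {"hash": digest, **body}

chain = [make_block("0" * 64, [{"from": "genesis", "to": "PU1", "coins": 100}])]
chain.append(make_block(chain[-1]["hash"],
                        [{"from": "SU7", "to": "PU1", "coins": 5, "lease": "ch.36"}]))

def verify(chain):
    # Recompute every block hash and check that each block points at its predecessor.
    for i, blk in enumerate(chain):
        body = {k: blk[k] for k in ("time", "prev", "txs")}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != blk["hash"]:
            return False
        if i > 0 and blk["prev"] != chain[i - 1]["hash"]:
            return False
    return True

print("ledger valid:", verify(chain))
```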
Article
Line-of-sight wireless communications can benefit from the simultaneous transmission of multiple independent data streams through the same medium in order to increase system capacity. A common approach is to use conventional spatial multiplexing with spatially separated transmitter/receiver antennae, for which inter-channel crosstalk is reduced by employing multiple-input-multiple-output (MIMO) signal processing at the receivers. Another fairly recent approach to transmitting multiple data streams is to use orbital-angular-momentum (OAM) multiplexing, which employs the orthogonality among OAM beams to minimize inter-channel crosstalk and enable efficient (de)multiplexing. In this paper, we explore the potential of utilizing both of these multiplexing techniques to provide system design flexibility and performance enhancement. We demonstrate a 16 Gbit/s millimeter-wave link using OAM multiplexing combined with conventional spatial multiplexing over a short link distance of 1.8 meters (shorter than the Rayleigh distance). Specifically, we implement a spatial multiplexing system with a 2 × 2 antenna aperture architecture, in which each transmitter aperture contains two multiplexed 4 Gbit/s data-carrying OAM beams. MIMO-based signal processing is used at the receiver to mitigate channel interference. Our experimental results show performance improvements for all channels after MIMO processing, with bit-error rates of each channel below the forward error correction limit of 3.8 × 10⁻³. We also simulate the capacity for both the 4 × 4 MIMO system and the 2 × 2 MIMO with OAM multiplexing. Our work indicates that OAM multiplexing and conventional spatial multiplexing can be simultaneously utilized to provide design flexibility. The combination of these two approaches can potentially enhance system capacity given a fixed aperture area of the transmitter/receiver (when the link distance is within a few Rayleigh distances).
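A small sketch of the receiver-side idea only (not the paper's experimental setup): zero-forcing equalisation of a 2 × 2 channel removes the inter-channel crosstalk between two co-propagating streams; the channel matrix and noise level are assumed values.

```python
# Zero-forcing equalisation of a 2x2 crosstalk channel carrying two BPSK streams.
import numpy as np

rng = np.random.default_rng(4)
H = np.array([[1.0, 0.35],               # assumed crosstalk channel matrix
              [0.4, 0.9]])
s = (2 * rng.integers(0, 2, (2, 10_000)) - 1).astype(float)   # two BPSK streams
noise = 0.05 * rng.normal(size=s.shape)                       # assumed noise level
r = H @ s + noise

s_hat = np.sign(np.linalg.inv(H) @ r)     # zero-forcing: apply H^{-1} at the receiver
print("BER per stream:", np.mean(s_hat != s, axis=1))
```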
Article
Solid-state lighting is revolutionizing indoor illumination. Current incandescent and fluorescent lamps are being replaced by LEDs at a rapid pace. Apart from extremely high energy efficiency, LEDs have other advantages such as longer lifespan, lower heat generation and improved color rendering without using harmful chemicals. One additional benefit of LEDs is that they are capable of switching between different light intensities at a very fast rate. This functionality has given rise to a novel communication technology (known as visible light communication, VLC) where LED luminaires can be used for high-speed data transfer. This survey provides a technology overview and a review of the existing literature on visible light communication and sensing. This paper provides a detailed survey of (1) visible light communication systems and the characteristics of their various components such as the transmitter and receiver, (2) physical layer properties of the visible light communication channel, modulation methods and MIMO techniques, (3) medium access techniques, (4) system design and programmable platforms, and (5) visible light sensing and applications such as indoor localization, gesture recognition, screen-camera communication and vehicular networking. We also outline important challenges that need to be addressed in order to design high-speed mobile networks using visible light communication.
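A toy intensity-modulation sketch illustrating the fast on/off switching mentioned above (not a scheme from the survey): on-off keying over an additive-noise optical channel with a simple threshold detector; the intensity normalisation and noise level are assumptions.

```python
# On-off keying (OOK) over an additive-noise intensity channel with a mid-level threshold.
import numpy as np

rng = np.random.default_rng(5)
n_bits = 100_000
bits = rng.integers(0, 2, n_bits)
p_on = 1.0                                         # normalised LED "on" intensity
rx = bits * p_on + rng.normal(0, 0.2, n_bits)      # assumed receiver noise std 0.2
bits_hat = (rx > p_on / 2).astype(int)             # threshold at half the "on" level
print("OOK BER:", np.mean(bits_hat != bits))
```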
Article
Molecular communication (MC) is the most promising communication paradigm for nanonetwork realization since it is a natural phenomenon observed among living entities with nanoscale components. Since MC significantly differs from classical communication systems, it mandates reinvestigation of information and communication theoretical fundamentals. The closest examples of MC architectures are present inside our own body. Therefore, in this paper, we investigate the existing literature on intrabody nanonetworks and different MC paradigms to establish and introduce the fundamentals of molecular information and communication science. We highlight future research directions and open issues that need to be addressed for revealing the fundamental limits of this science. Although the scope of this development encompasses wide range of applications, we particularly emphasize its significance for life sciences by introducing potential diagnosis and treatment techniques for diseases caused by dysfunction of intrabody nanonetworks.
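An illustrative diffusion sketch of the physical mechanism underlying many MC channels, using the standard textbook Green's function rather than any result from the paper: the expected concentration at distance d and time t after an impulsive release of N molecules into a 3-D medium is C(d, t) = N / (4πDt)^{3/2} · exp(-d² / (4Dt)); the molecule count, diffusion coefficient, and distance are assumed values.

```python
# Expected molecule concentration from an impulsive release in free 3-D diffusion.
import math

N = 1e4            # assumed number of released molecules
D = 1e-10          # assumed diffusion coefficient (m^2/s)
d = 1e-5           # assumed 10-micrometre transmitter-receiver distance

for t in (0.1, 1.0, 10.0):
    c = N / (4 * math.pi * D * t) ** 1.5 * math.exp(-d ** 2 / (4 * D * t))
    print(f"t={t:5.1f} s  concentration ~ {c:.3e} molecules/m^3")
```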
Radio access networking challenges towards 2030
  • Latva-aho
IEEE Standard for High Data Rate Wireless Multi-Media Networks–Amendment 2: 100 Gb/s Wireless Switched Point-to-Point Physical Layer