Article

Integrating Satellites and Mobile Edge Computing for 6G Wide-Area Edge Intelligence: Minimal Structures and Systematic Thinking


Abstract

The sixth-generation (6G) network will shift its focus to supporting everything, including various machine-type devices (MTDs), in an everyone-centric manner. To ubiquitously cover MTDs working in rural and disaster areas, satellite communications become indispensable, while mobile edge computing (MEC) also plays an increasingly crucial role. Their sophisticated integration enables wide-area edge intelligence, which promises to facilitate globally distributed customized services. In this article, we present typical use cases of integrated satellite-MEC networks and discuss the main challenges therein. Inspired by protein structure and the systems engineering methodology, we propose three minimal integrating structures, based on which a complex integrated satellite-MEC network can be treated as their extension and combination. We discuss the unique characteristics and key problems of each minimal structure. Accordingly, we establish an on-demand network orchestration framework to enrich the hierarchy of network management, which further leads to a process-oriented network optimization method. On that basis, a case study is utilized to showcase the benefits of on-demand network orchestration and process-oriented network optimization. Finally, we outline potential research issues to envision a more intelligent, more secure, and greener integrated network.


... When large-scale deployment of satellites is infeasible for budgetary or practical reasons, an innovative alternative is employing aerial platforms [9]–[12]. Network operators may install high-altitude aerial platforms at distances closer to satellites to receive and transmit data. These aerial platforms allow terrestrial gateways to communicate with satellites outside of their visibility scope, effectively extending the duration and range of downlink communications [12]. In other words, the use of aerial platforms can be a highly effective alternative to dense satellite deployments, offering equivalent or better coverage with fewer satellites by overcoming the geometric limitations of a sparse satellite network. ...
... In [23]–[26], the network performance is determined solely by the satellites, and the Cox-based models do not consider any communication agents relaying messages from satellites to terrestrial gateways. As a result, the performance gains from aerial platforms anticipated in various studies, including [9]–[12], cannot be mathematically analyzed, leaving the gains from aerial platforms unexplored. ...
Article
Although a significant number of satellites are deemed essential for facilitating diverse applications of satellite networks, aerial platforms are emerging as excellent alternatives for enabling reliable communications with fewer satellites. In scenarios with sparse satellite networks, aerial platforms participate in downlink communications, serving effectively as relays and providing comparable or even superior coverage compared to a large number of satellites. This paper explores the role of aerial platforms in assisting downlink communications, emphasizing their potential as an alternative to dense satellite networks. Firstly, we account for the space-time interconnected movement of satellites in orbits by establishing a stochastic geometry framework based on an isotropic satellite Cox point process. Using this model, we evaluate space-and-time performance metrics such as the number of orbits, the number of communicable satellites, and the connectivity probability, primarily assessing the geometric impact of aerial platforms. Subsequently, we analyze signal-to-noise ratio (SNR) coverage probability, end-to-end throughput, and association delay. Through examination of these performance metrics, we explicitly demonstrate how aerial platforms enhance downlink communications by improving various key network performance metrics that would otherwise require many satellites, thereby assessing their potential as an excellent alternative to dense satellite networks.
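The orbit-then-satellite construction of the isotropic satellite Cox point process described above lends itself to a short simulation: draw a Poisson number of orbital great circles with uniformly random orientations, then place a conditionally independent Poisson number of satellites uniformly on each circle. The following is an illustrative sketch under those assumptions, not the authors' code; the function name and intensity parameters are hypothetical:

```python
import numpy as np

def sample_satellite_cox(lambda_orbits, mu_sats, radius, rng):
    """Sample one realization of an isotropic satellite Cox point process.

    Orbits: a Poisson(lambda_orbits) number of great circles whose normal
    vectors are uniform on the sphere. Satellites: given the orbits, a
    Poisson(mu_sats) number of points placed uniformly on each circle.
    """
    n_orbits = rng.poisson(lambda_orbits)
    points = []
    for _ in range(n_orbits):
        # Uniformly random orbital plane: a Gaussian vector normalized to
        # the unit sphere is uniformly distributed in direction.
        normal = rng.normal(size=3)
        normal /= np.linalg.norm(normal)
        # Build an orthonormal basis (u, v) of the orbital plane.
        u = np.cross(normal, [0.0, 0.0, 1.0])
        if np.linalg.norm(u) < 1e-9:   # plane normal parallel to z-axis
            u = np.array([1.0, 0.0, 0.0])
        u /= np.linalg.norm(u)
        v = np.cross(normal, u)
        # Poisson number of satellites at uniform angles on the circle.
        angles = rng.uniform(0.0, 2.0 * np.pi, size=rng.poisson(mu_sats))
        for a in angles:
            points.append(radius * (np.cos(a) * u + np.sin(a) * v))
    return np.array(points)
```

All sampled points lie on the sphere of the given orbital radius; metrics such as the number of communicable satellites can then be estimated by Monte Carlo over many realizations.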
... MEC-empowered NTNs have been discussed in many studies. For instance, Lin et al. [22] proposed three minimal integrating structures of MEC and NTNs, and established an on-demand network orchestration framework. In [23], Kim et al. investigated the data upload scheduling and path planning scheme for space-air-ground integrated edge computing systems, aimed at minimizing the total system energy cost. ...
... In this first situation, we prove that under assumptions (22), (23) and (25), the expressions for T u (B u , R S u , F u , η u ) and V u (B u , R S u , F u , η u ) are shown in (26) and (27), which correspond to (5a) and (6a). ...
... In the second situation, we assume that (22) and (23) still hold. This means that during the user-UAV transmission, both the to-be-computed data and the to-be-uploaded data are accumulating in the storage unit. ...
Article
Quick response to disasters is crucial for saving lives and reducing loss. This requires low-latency uploading of situation information to the remote command center. Since terrestrial infrastructures are often damaged in disaster areas, non-terrestrial networks (NTNs) are preferable to provide network coverage, and mobile edge computing (MEC) could be integrated to improve the latency performance. Nevertheless, the communications and computing in MEC-enabled NTNs are strongly coupled, which complicates the system design. In this paper, an edge information hub (EIH) that incorporates communication, computing and storage capabilities is proposed to synergize communication and computing and enable systematic design. We first address the joint data scheduling and resource orchestration problem to minimize the latency for uploading sensing data. The problem is solved using an optimal resource orchestration algorithm. On that basis, we propose the principles for resource configuration of the EIH considering payload constraints on size, weight and energy supply. Simulation results demonstrate the superiority of our proposed scheme in reducing the overall upload latency, thus enabling quick emergency rescue.
... Referring to [19], a complex integrated satellite-MEC network could be considered as an orchestration of three minimal network structures, which are regarded as the basic elements of the integrated satellite-MEC network. These minimal structures have unique network properties, and therefore differ in the enabled applications as well as the design challenges. ...
... Network adjustments on this timescale have yet to be investigated. Therefore, a novel network architecture that enables on-demand network adjustments at such a medium timescale needs to be considered [19]. ...
Article
The sixth-generation (6G) network is envisioned to shift its focus from the service requirements of human beings to those of Internet-of-Things (IoT) devices. Satellite communications are indispensable in 6G to support IoT devices operating in rural or disaster areas. However, satellite networks face the inherent challenges of low data rate and large latency, which may not support computation-intensive and delay-sensitive IoT applications. Mobile Edge Computing (MEC) is a burgeoning paradigm by extending cloud computing capabilities to the network edge. Using MEC technologies, the resource-limited IoT devices can access abundant computation resources with low latency, which enables the highly demanding applications while meeting strict delay requirements. Therefore, an integration of satellite communications and MEC technologies is necessary to better enable 6G IoT. In this survey, we provide a holistic overview of satellite-MEC integration. We first categorize the related studies based on three minimal structures and summarize current advances. For each minimal structure, we discuss the lessons learned and possible future directions. We also summarize studies considering the combination of minimal structures. Finally, we outline potential research issues to envision a more intelligent, more secure, and greener integrated satellite-MEC network.
... With the advancement of 5G and next-generation networks, the MEC (Multi-access Edge Computing) system is gaining attention [1,2]. MEC provides the advantage of significantly reducing latency by processing data at the network edge close to users rather than in a central cloud, which makes it ideal for services requiring real-time response. ...
Article
Reducing energy consumption in a MEC (Multi-Access Edge Computing) system is a critical goal, both for lowering operational expenses and promoting environmental sustainability. In this paper, we focus on the problem of managing the sleep state of MEC servers (MECSs) to decrease the overall energy consumption of a MEC system while providing users acceptable service delays. The proposed method achieves this objective through dynamic orchestration of MECS activation states based on systematic analysis of workload distribution patterns. To facilitate this optimization, we formulate the MECS sleep control mechanism as a constrained combinatorial optimization problem. To resolve the formulated problem, we take a deep-learning approach. We develop a task arrival rate predictor using a spatio-temporal graph convolution network (STGCN). We then integrate this predicted information with the queue length distribution to form the input state for our deep reinforcement learning (DRL) agent. To verify the effectiveness of our proposed framework, we conduct comprehensive simulation studies incorporating real-world operational datasets, with comparative evaluation against established metaheuristic optimization techniques. The results indicate that our method demonstrates robust performance in MECS state optimization, maintaining operational efficiency despite prediction uncertainties. Accordingly, the proposed approach yields substantial improvements in system performance metrics, including enhanced energy utilization efficiency, decreased service delay violation rate, and reduced computational latency in operational state determination.
... Satellite-terrestrial networks (STINs) have been widely acknowledged as a promising solution to achieve global seamless coverage and pervasive connectivity for future networks [1], [2]. As an indispensable component of STINs, satellite networks can not only supplement existing terrestrial networks ...
Article
Satellite edge computing (SEC) has emerged as an innovative paradigm for future satellite-terrestrial integrated networks (STINs), expanding computation services by sinking computing capabilities into Low-Earth-Orbit (LEO) satellites. However, the mobility of LEO satellites poses two key challenges to SEC: 1) constrained onboard computing and transmission capabilities caused by limited and dynamic energy supply, and 2) stochastic task arrivals within the satellites' coverage and time-varying channel conditions. To tackle these issues, it is imperative to design an optimal SEC offloading strategy that effectively exploits the available energy of LEO satellites to fulfill competing task demands for SEC. In this paper, we propose a dynamic offloading strategy (DOS) aiming to minimize the overall completion time of arriving tasks in an SEC-assisted STIN, subject to the long-term energy constraints of the LEO satellite. Leveraging Lyapunov optimization theory, we first convert the original long-term stochastic problem into multiple deterministic one-slot problems parameterized by current system states. Then we use sub-problem decomposition to jointly optimize the task offloading, computing, and communication resource allocation strategies. We theoretically prove that DOS achieves near-optimal performance. Numerical results demonstrate that DOS significantly outperforms the other four baseline approaches in terms of task completion time and dropping rate.
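The Lyapunov conversion used by DOS-style strategies — replacing a long-term energy constraint with a per-slot virtual energy queue and a drift-plus-penalty weight V — can be illustrated with a toy per-slot decision rule. This is a simplified sketch with hypothetical names, not the paper's algorithm (which also optimizes computing and communication resources):

```python
def drift_plus_penalty_step(actions, Q, V, e_budget):
    """One drift-plus-penalty decision for a long-term energy budget.

    actions  : list of (latency, energy) pairs for candidate offloading choices
    Q        : current virtual energy-queue backlog
    V        : weight trading queue stability against latency
    e_budget : per-slot average energy budget

    Picks the action minimizing V*latency + Q*energy, then updates the
    virtual queue Q(t+1) = max(Q(t) + energy - e_budget, 0).
    """
    best = min(range(len(actions)),
               key=lambda i: V * actions[i][0] + Q * actions[i][1])
    latency, energy = actions[best]
    Q_next = max(Q + energy - e_budget, 0.0)
    return best, Q_next
```

When the backlog Q is small, the rule favors low latency; as Q grows (energy overspent relative to the budget), it shifts toward energy-frugal actions, which is how the long-term constraint is enforced slot by slot.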
... Although the BSs have stable energy supply, they may consume non-renewable energy resources and give rise to more carbon emissions when a large variety of applications such as extended reality (XR) [11] generate massive computation-intensive tasks. On the other hand, the satellites depend on the limited energy harvested from solar panels [12], and recent work [13] underscores that energy management for edge intelligence at the satellites is critical. Therefore, both the latency requirement and energy constraint need to be met when tackling the task offloading issue, to make edge intelligence at the satellites sustainable. ...
Preprint
This paper exploits the potential of edge intelligence empowered satellite-terrestrial networks, where users' computation tasks are offloaded to the satellites or terrestrial base stations. The computation task offloading in such networks involves the edge cloud selection and bandwidth allocations for the access and backhaul links, which aims to minimize the energy consumption under the delay and satellites' energy constraints. To address it, an alternating direction method of multipliers (ADMM)-inspired algorithm is proposed to decompose the joint optimization problem into small-scale subproblems. Moreover, we develop a hybrid quantum double deep Q-learning (DDQN) approach to optimize the edge cloud selection. This novel deep reinforcement learning architecture enables classical and quantum neural networks to process information in parallel. Simulation results confirm the efficiency of the proposed algorithm, and indicate that the duality gap is tiny and that a larger reward can be generated from a few data points compared to the classical DDQN.
... In [27], the authors improve the network management of multi-layer satellite networks by proposing a mega-constellation routing system called MaCRo based on SDN and MEC technologies. Viewing satellite-MEC networks from a systems engineering perspective, three minimal integrating structures are proposed in [28]. ...
Article
Satellite networks can enhance global network coverage without geographical restrictions. By applying the edge computing paradigm to satellite networks, they can provide communication and computation services for Internet of Remote Things (IoRT) mobile devices (IMDs) at any time. Thus, in 6G mobile systems, satellite networks serve as a complement to terrestrial networks. This paper places its emphasis on satellite-terrestrial integrated networks (STINs). Computation offloading and resource allocation (CORA) play a crucial role in STINs given the limited communication and computation resources of satellites. In this paper, we investigate CORA to minimize system energy consumption while satisfying latency tolerances, via jointly optimizing the offloading decision and the allocation of radio and computation resources. Taking into account the dynamic characteristics of network conditions, the CORA problem is formulated as a Markov decision process (MDP). Subsequently, an algorithm based on twin delayed deep deterministic policy gradient (TD3) is designed to automatically determine the optimal decisions. Finally, the convergence and superiority of the proposed algorithm in terms of energy efficiency are evaluated through extensive simulation experiments.
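At the core of TD3-style algorithms such as the one referenced above is the clipped double-Q target with target-policy smoothing: clipped noise is added to the target policy's action, and the smaller of two target critics evaluates the result, curbing overestimation. A minimal sketch of the target computation under assumed inputs (hypothetical names, not the paper's implementation):

```python
import numpy as np

def td3_target(q1_t, q2_t, next_action, noise_std, noise_clip,
               a_low, a_high, reward, gamma, done, rng):
    """Clipped double-Q target with target-policy smoothing (TD3).

    q1_t, q2_t  : target critics, callables mapping an action to a Q-value
    next_action : target policy's action for the next state
    """
    # Target-policy smoothing: add clipped Gaussian noise, keep action valid.
    noise = np.clip(rng.normal(0.0, noise_std, size=np.shape(next_action)),
                    -noise_clip, noise_clip)
    a_smoothed = np.clip(next_action + noise, a_low, a_high)
    # Clipped double-Q: take the pessimistic (smaller) critic estimate.
    q_min = min(q1_t(a_smoothed), q2_t(a_smoothed))
    return reward + gamma * (1.0 - done) * q_min
```

In a full agent, both critics regress toward this shared target, while the actor is updated less frequently (delayed policy updates) using only the first critic.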
... Edge intelligence enhances network resilience through its decentralized decision-making process, which empowers autonomous decision-making even during network disruptions [67]. This method eliminates the vulnerability of sensitive data during transmission, which improves security measures and boosts operational efficiency [68,69]. A major step forward, the integration of edge intelligence with high-frequency network operations improves the responsiveness, security, and agility of contemporary data-intensive environments [70,71]. ...
Article
The latest cellular technology, known as 5G-NR, is intended to significantly speed up and improve the effectiveness of wireless systems. A revolution in the telecom industry has been sparked by the widespread use of and increased reliance on cellular communication technology. Moreover, 5G and B5G technologies are expected to utilize an even higher-frequency range to achieve faster data transmission and lower latency communication. Consequently, while transmitting signals across various types of equipment and infrastructure, the general public is exposed to much higher frequencies of electromagnetic radiation. The increasing need for 5G NR base stations (gNodeB) has heightened public anxiety over potential negative health impacts. This study reviews recent research on the effects of electromagnetic waves on humans, particularly focusing on how these effects influence cognitive functions. Most research to date has not found significant differences in cognitive performance due to ubiquitous mobile communications. However, current research has largely been limited to 4G technologies, and the health effects of exposure to 5G user equipment (UE) and base stations in higher-frequency bands remain unexplored. If subsequent research suggests that exposure to high-frequency wireless networks significantly impacts cognitive functions, the deployment and acceptance of these technologies may face challenges and constraints. Therefore, such investigations are crucial for determining whether next-generation technologies pose no risk to individuals.
... The authors of [23] describe the most significant factors for developing B5G/6G edge computing capabilities and applications in various industry sectors. The authors of [24] discuss the characteristics and major challenges of three proposed frameworks for integrating satellite and MEC systems, and present relevant research concerns for the development of a highly intelligent, highly secure, and sustainable unified network. ...
Article
The rapid proliferation of wireless technologies has driven the development of wireless modeling standards, protocols, and control of wireless manipulators. Several mobile communication technology applications in different fields have been dramatically revolutionized to deliver more value at less cost. Multi-access Edge Computing (MEC) offers excellent advantages for Beyond 5G (B5G) and Sixth-Generation (6G) networks, reducing latency and bandwidth usage while increasing the capability of the edge to deliver multiple services to end users in real time. We propose a Cluster-based Multi-User Multi-Server (CMUMS) caching algorithm to optimize the MEC content caching mechanism and control the distribution of highly popular tasks. As part of our work, we address the integer optimization problem of selecting the content to be cached and the list of hosting servers, thereby achieving a higher direct hit rate, a lower indirect hit rate, and a reduced overall time delay. Implementing this system model enables maximum utilization of resources and the development of a completely new level of services and innovative approaches.
Article
Computation offloading optimization for energy saving is becoming increasingly important in low-Earth-orbit (LEO) satellite-terrestrial integrated networks (STINs), since battery technology has not kept up with the demands of ground terminal devices. In this paper, we design a delay-based deep reinforcement learning (DRL) framework specifically for computation offloading decisions, which can effectively reduce energy consumption. Additionally, we develop a multi-level feedback queue for computing allocation (RAMLFQ), which can effectively enhance the CPU's efficiency in task scheduling. We initially formulate the computation offloading problems with system delay as Delay Markov Decision Processes (DMDPs), and then transform them into equivalent standard Markov Decision Processes (MDPs). To solve the optimization problem effectively, we employ a double deep Q-network (DDQN) method, enhancing it with an augmented state space to better handle the unique challenges posed by system delays. Simulation results demonstrate that the proposed learning-based computation offloading algorithm achieves high performance efficiency and attains a lower total cost than other existing offloading methods.
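The double-DQN update that DDQN-based offloading methods rely on decouples action selection (online network) from action evaluation (target network), which reduces the overestimation bias of vanilla Q-learning targets. A minimal sketch of the target computation, assuming the per-state Q-values are already computed (names hypothetical, not the paper's code):

```python
import numpy as np

def ddqn_targets(q_online_next, q_target_next, rewards, gamma, dones):
    """Double-DQN targets for a batch of transitions.

    q_online_next : (batch, n_actions) online-network Q-values at next states
    q_target_next : (batch, n_actions) target-network Q-values at next states
    """
    # Select the greedy action with the online network...
    a_star = np.argmax(q_online_next, axis=1)
    # ...but evaluate it with the target network.
    q_eval = q_target_next[np.arange(len(a_star)), a_star]
    return rewards + gamma * (1.0 - dones) * q_eval
```

For the delay-augmented MDP described above, the state fed to both networks would additionally carry the actions taken but not yet applied, so the same target formula applies unchanged.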
Article
As an indispensable architecture for future 6G communication networks, the space-air-ground integrated network (SAGIN) integrates satellite networks, air networks, and ground networks, greatly expanding the coverage of network space. Compared with traditional mobile edge computing (MEC), the edge intelligence (EI) formed by combining artificial intelligence (AI) with MEC can intelligently process edge data by embedding AI algorithms into edge devices with limited computing power. Therefore, this article applies EI to SAGIN to form an EI-driven SAGIN architecture, which can significantly enhance the communication, computing, sensing, and storage capabilities of SAGIN to solve the problem of efficient resource management for resource-constrained users. We first introduce the system network architecture and the logical functional architecture, describing their components in detail, and then discuss key technologies in the system, including efficient resource utilization for microservices based on software-defined networking (SDN) and network function virtualization (NFV), deep reinforcement learning (DRL) based on knowledge graphs for efficient storage and intelligent computing, and efficient real-time sensing of massive information. Finally, we propose a DRL-based resource allocation and computation offloading algorithm for microservices (DRCAM) and evaluate its performance. Simulation results show that, compared with existing algorithms, the proposed algorithm greatly reduces the system cost under different weights.
Article
The fifth-generation (5G) wireless communications have been deployed in many countries with the following features: a peak data rate of 20 Gbps, a latency of 1 ms, reliability of 99.999%, maximum mobility of 500 km/h, a bandwidth of 1 GHz, and a capacity of up to 10^6 Mbps/m^2. Nonetheless, the rapid growth of applications such as extended/virtual reality (XR/VR), online gaming, telemedicine, cloud computing, smart cities, the Internet of Everything (IoE), and others demands lower latency, higher data rates, ubiquitous coverage, and better reliability. These heightened requirements are the main problems that have challenged 5G while concurrently encouraging researchers and practitioners to introduce viable solutions. This review paper examines how sixth-generation (6G) technology could overcome the limitations of 5G, achieve the higher requirements, and support future applications. The integration of multiple access techniques, terahertz (THz), visible light communications (VLC), ultra-massive multiple-input multiple-output (um-MIMO), hybrid networks, cell-free massive MIMO, and artificial intelligence (AI)/machine learning (ML) has been proposed for 6G. The main contributions of this paper are a comprehensive review of the 6G vision, key performance indicators (KPIs), and advanced potential technologies, together with their operating principles. Besides, this paper reviews multiple access and modulation techniques, concentrating on Filter-Bank Multicarrier (FBMC) as a potential technology for 6G. The paper ends by discussing potential applications, along with challenges and lessons identified from prior studies, to pave the path for future research.
Article
The low Earth orbit (LEO) satellite edge computing paradigm provides remote sites with flexible, reliable, and scalable edge computing capabilities. Characterized by the orbital motion patterns and harsh space environments, LEO satellite edge computing faces unique security challenges in terms of the secure collaboration of multiple satellites and the intellectual property protection of models. Under the unique space environment and security demands, we propose a secure satellite edge computing framework in this paper. Taking a remote electricity line outage identification use case as an example, our framework first achieves the secure delegation of the line outage identification task among multiple satellites, realized through a secure query (SQuery) scheme that checks the availability of the target time slot. Meanwhile, we also design an SHE-enabled secure inner-product encryption (SSIPE) protocol to achieve secure multinomial logistic regression (MLR) based line outage identification on-orbit. To reduce the complexity brought by the computationally intensive homomorphic multiplication between two ciphertexts, we further design a "divide-and-conquer" based secure query (DSQuery) scheme, which converts this homomorphic multiplication between ciphertexts into homomorphic addition. As far as we know, this is the first scheme investigating secure task delegation among different satellites on-orbit. Besides, detailed security analyses are performed to demonstrate the security properties of confidentiality and authentication. In performance evaluations, we test and compare the computational and communication overhead of our scheme and other straightforward schemes. Simulation results show that the DSQuery scheme greatly reduces the computational cost, saving the stringent on-orbit computation resources of LEO satellites.
Article
Full-text available
The recent advances in low Earth orbit (LEO) satellites enable them to provide task processing capability for remote Internet of Things (IoT) mobile devices (IMDs) without proximal multi-access edge computing (MEC) servers. In this article, by leveraging LEO satellites, a novel MEC framework for terrestrial-satellite IoT is proposed. With the aid of a terrestrial-satellite terminal (TST), the computation offloading from IMDs to LEO satellites is divided into two stages in the ground and space segments. In order to minimize the weighted sum energy consumption of IMDs, we decompose the formulated problem into two layered subproblems: 1) the lower-layer subproblem minimizing the latency of the space segment, which is solved by sequential fractional programming attaining first-order optimality; 2) the upper-layer subproblem, which is solved by exploiting the convex structure and applying the Lagrangian dual decomposition method. Based on the solutions to the two layered subproblems, an energy-efficient computation offloading and resource allocation algorithm (E-CORA) is proposed. Simulations show that: i) there exists a specific amount of offloading bits that minimizes the energy consumption of IMDs, and the proposed E-CORA outperforms full offloading and local computing only; ii) a larger transmit power of the TST helps to save the energy of IMDs; and iii) by increasing the number of visible satellites, the ratio of offloading bits increases while the energy consumption of IMDs can be decreased.
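The upper-layer step described above relies on Lagrangian dual decomposition, which can be sketched on a simple coupled allocation problem. The weights, workloads, and capacity below are made-up illustration values, not the paper's model: for a fixed dual price the per-device subproblems separate and have closed-form minimizers, and bisection on the dual variable enforces the shared capacity constraint.

```python
import math

# Minimal sketch of Lagrangian dual decomposition for a coupled allocation,
# in the spirit of the paper's upper-layer subproblem. All numbers below
# (weights w, workloads c, capacity F) are made-up illustration values.
# Device i pays latency cost w_i * c_i / f_i for CPU share f_i, subject to
# the coupling constraint sum_i f_i <= F.
w = [1.0, 2.0, 0.5]   # per-device weights (assumed)
c = [4.0, 3.0, 6.0]   # per-device workloads (assumed)
F = 10.0              # shared CPU capacity (assumed)

def best_response(lam):
    # For a fixed dual price lam, the Lagrangian separates per device:
    #   min_f  w*c/f + lam*f   has minimizer  f* = sqrt(w*c/lam)
    return [math.sqrt(wi * ci / lam) for wi, ci in zip(w, c)]

# Bisection on the dual variable until the capacity constraint is tight
lo, hi = 1e-9, 1e9
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if sum(best_response(mid)) > F:
        lo = mid   # price too low: aggregate demand exceeds capacity
    else:
        hi = mid
f = best_response(hi)
assert abs(sum(f) - F) < 1e-6   # constraint tight at the optimal price
```

The same pattern, per-agent subproblems coordinated by a dual price, underlies most dual-decomposition approaches to coupled offloading and resource allocation.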
Article
Full-text available
Mobile edge computing (MEC) enhanced satellite-based Internet of Things (SAT-IoT) is an important complement to terrestrial-network-based IoT, especially for remote and depopulated areas. For MEC-enhanced SAT-IoT networks with multiple satellites and multiple satellite gateways, the coupled user association, offloading decisions, and computing and communication resource allocation should be jointly optimized to minimize the latency and energy cost. In this paper, the latency and energy optimization for MEC-enhanced SAT-IoT networks is formulated as a dynamic mixed-integer programming problem, for which optimal solutions are hard to obtain. To tackle this problem, we decompose the complex problem into two sub-problems. The first is computing and communication resource allocation with fixed user association and offloading decisions, and the second is joint user association and offloading with optimal resource allocation. For the resource allocation sub-problem, the optimal solution is proven to be obtainable via the Lagrange multiplier method. The second sub-problem is further formulated as a Markov decision process (MDP), and a joint user association and offloading decision with optimal resource allocation (JUAOD-ORA) scheme is proposed based on deep reinforcement learning (DRL). Simulation results show that the proposed approach can achieve a better long-term reward in terms of latency and energy cost.
Article
Full-text available
Mobile edge computing (MEC) is proposed as a new paradigm to meet the ever-increasing computation requirements caused by the rapid growth of Internet of Things (IoT) devices. As a supplement to the terrestrial network, satellites can provide communication to terrestrial devices in harsh environments and natural disasters. Satellite edge computing is becoming an emerging topic and technology. In this paper, a game-theoretic approach to the optimization of the computation offloading strategy in satellite edge computing is proposed. The system model for computation offloading in satellite edge computing is established, considering the intermittent terrestrial-satellite communication caused by satellite orbital motion. We construct a computation offloading game framework and derive the response time and energy consumption of a task based on queuing theory as the metrics of optimizing performance. The existence and uniqueness of the Nash equilibrium are theoretically proved, and an iterative algorithm is proposed to find the Nash equilibrium. Simulation results validate the proposed algorithm and show that the game-based offloading strategy can greatly reduce the average cost of a device.
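The iterative route to a Nash equilibrium can be sketched with best-response dynamics on a toy binary offloading game. The cost model below is an illustrative congestion game, not the paper's queuing-theoretic one: each device compares a fixed local cost against an offloading cost that grows with the number of offloaders, and the loop terminates once no device wants to deviate.

```python
# Toy best-response dynamics for a binary offloading game -- an illustrative
# congestion game, NOT the paper's queuing-theoretic cost model. Each device
# either computes locally at a fixed cost or offloads at a cost that grows
# with the total number of offloaders. All numbers are assumed for the demo.
local_cost = [5.0, 3.0, 6.0, 4.0, 7.0]   # per-device local costs (assumed)
BASE, CONGESTION = 1.0, 1.5              # offload cost parameters (assumed)

def offload_cost(k):
    # Cost seen by each offloader when k devices offload in total
    return BASE + CONGESTION * k

offload = [False] * len(local_cost)
changed = True
while changed:                            # iterate until a fixed point
    changed = False
    for i in range(len(local_cost)):
        others = sum(offload) - offload[i]
        wants_offload = offload_cost(others + 1) < local_cost[i]
        if wants_offload != offload[i]:
            offload[i] = wants_offload
            changed = True

# The fixed point is a Nash equilibrium: no unilateral deviation helps
for i, off in enumerate(offload):
    others = sum(offload) - off
    if off:
        assert offload_cost(others + 1) <= local_cost[i]
    else:
        assert local_cost[i] <= offload_cost(others + 1)
```

Because the offloading cost depends only on the number of offloaders, this is a congestion game with an exact potential function, which is what guarantees that best-response dynamics terminate.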
Article
Mobile communication standards were developed to enhance transmission and network performance by using more radio resources and improving spectrum and energy efficiency. How to effectively address diverse user requirements and guarantee everyone's Quality of Experience (QoE) remains an open problem. Sixth Generation (6G) mobile systems will solve this problem by utilizing heterogeneous network resources and pervasive intelligence to support everyone-centric customized services anywhere and anytime. In this article, we first coin the concept of the Service Requirement Zone (SRZ) on the user side to characterize and visualize the integrated service requirements and preferences of specific tasks of individual users. On the system side, we further introduce the concept of the User Satisfaction Ratio (USR) to evaluate the system's overall ability to satisfy a variety of tasks with different SRZs. Then, we propose a network Artificial Intelligence (AI) architecture with integrated network resources and pervasive AI capabilities for supporting customized services with guaranteed QoEs. Finally, extensive simulations show that the proposed network AI architecture consistently offers higher USR performance than the cloud AI and edge AI architectures across different task scheduling algorithms, random service requirements, and dynamic network conditions.
Article
The wide use of unmanned aerial vehicles provides a promising paradigm for improving air-ground services and applications (e.g., urban sensing, disaster relief) in air-ground integrated networks (AGINs). Digital twin (DT), an emerging technology that utilizes data, models, and intelligent algorithms to integrate cyber-physical networks and digital virtual models, provides a real-time and dynamic simulation platform for strategy optimization and decision making in AGINs. Due to the openness and massive connectivity of AGINs, security and reliability become important issues in this system. In this article, we investigate DT-envisioned secure federated aerial learning for AGINs via an aerial blockchain approach. Specifically, we propose a layered framework of DT-envisioned AGINs, which comprises the construction, communication, aggregation, analysis, and operation segments. Based on this framework, we present applications of the proposed DT-envisioned AGINs. To guarantee the security of data transmission in AGINs, we investigate an aerial blockchain-based approach for ensuring data security. Furthermore, we provide a case study of DT-envisioned secure federated aerial computing in AGINs to validate the effectiveness of the proposed approach through designing the aerial blockchain and training model.
Article
In the upcoming sixth-generation (6G) era, the demand for constructing a wide-area time-sensitive Internet of Things (IoT) continues to increase. As conventional cellular technologies are difficult to use directly for wide-area time-sensitive IoT, it is beneficial to use non-terrestrial infrastructures, including satellites and unmanned aerial vehicles (UAVs). Thus, we can build a non-terrestrial network (NTN) using a cell-free architecture. Driven by time-sensitive requirements and the uneven distribution of IoT devices, the NTN must be empowered by mobile edge computing (MEC) while providing oasis-oriented on-demand coverage for devices. Nevertheless, the communication and MEC systems are coupled with each other under the influence of a complex propagation environment in the MEC-empowered NTN, which makes it difficult to coordinate the resources. In this study, we propose a process-oriented framework to design the communication and MEC systems in a time-division manner. In this framework, large-scale channel state information (CSI) is used to characterize the complex propagation environment at an affordable cost, and a nonconvex latency minimization problem is formulated. Subsequently, an approximated problem is provided, which can be decomposed into sub-problems that are then solved iteratively. The simulation results demonstrate the superiority of the proposed process-oriented scheme over other algorithms, imply that the payload deployments of UAVs should be appropriately predesigned to improve resource-use efficiency, and confirm that it is advantageous to integrate NTN with MEC for wide-area time-sensitive IoT.
Article
In this paper, we investigate a satellite-aerial integrated edge computing network (SAIECN), which combines a low-Earth-orbit (LEO) satellite and aerial high-altitude platforms (HAPs) to provide edge computing services for ground user equipment (GUE). In the SAIECN, a GUE's computing tasks can be offloaded to HAP(s) or the LEO satellite. We minimize the weighted sum energy consumption of the SAIECN via joint GUE association, multi-user multiple-input multiple-output (MU-MIMO) transmit precoding, computation task assignment, and resource allocation. To solve the nonconvex problem, we decompose it into four subproblems and solve each one iteratively. The GUE association subproblem is handled using quadratic-transform-based fractional programming (QTFP) and the difference-of-convex-functions method. The MU-MIMO transmit precoding subproblem is solved via QTFP and the weighted minimum mean-squared error method. The computation task assignment is addressed using the classic interior point method, while the computation resource allocation is derived in closed form. The numerical results show that the proposed SAIECN and the corresponding algorithm solve satellite-based edge computing quite well, maintaining the energy cost at a relatively low level.
Article
Benefiting from enhanced onboard processing capacities and high-speed satellite-terrestrial links, satellite edge computing has been regarded as a promising technique to facilitate the execution of computation-intensive applications in satellite communication networks (SCNs). By deploying edge computing servers on satellites and in gateway stations, SCNs can achieve significant gains in computing capacity at the expense of extending the dimensions and complexity of resource management. Therefore, in this paper, we investigate the joint computing and communication resource management problem for SCNs to minimize the execution latency of computation-intensive applications, considering two different satellite edge computing scenarios as well as local execution. The joint computing and communication resource allocation problem for computation-intensive services is formulated as a mixed-integer programming problem, and a game-theoretic and many-to-one matching-theory-based scheme (JCCRA-GM) is proposed to achieve an approximately optimal solution. Numerical results show that the proposed method, with low complexity, can achieve almost the same weighted-sum latency as the brute-force method.
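The many-to-one matching ingredient of a scheme like JCCRA-GM can be sketched with deferred acceptance between tasks and capacity-limited edge servers. The preference lists, rankings, and capacities below are hypothetical demo values, not the paper's construction:

```python
# Toy many-to-one matching via deferred acceptance, sketching the matching-
# theoretic ingredient of a scheme like JCCRA-GM. Tasks propose to edge
# servers in preference order; a full server keeps its most-preferred tasks.
# All preferences, rankings, and capacities are hypothetical demo values.
task_pref = {                 # each task's ranked list of servers (assumed)
    't1': ['s1', 's2'], 't2': ['s1', 's2'],
    't3': ['s2', 's1'], 't4': ['s1', 's2'],
}
server_rank = {               # lower number = more preferred task (assumed)
    's1': {'t1': 0, 't2': 1, 't3': 2, 't4': 3},
    's2': {'t2': 0, 't1': 1, 't3': 2, 't4': 3},
}
capacity = {'s1': 1, 's2': 2}

free = list(task_pref)        # unmatched tasks still willing to propose
match = {s: [] for s in capacity}
nxt = {t: 0 for t in task_pref}
while free:
    t = free.pop(0)
    if nxt[t] >= len(task_pref[t]):
        continue              # task exhausted its list: stays unmatched
    s = task_pref[t][nxt[t]]
    nxt[t] += 1
    match[s].append(t)
    if len(match[s]) > capacity[s]:
        worst = max(match[s], key=lambda x: server_rank[s][x])
        match[s].remove(worst)
        free.append(worst)    # rejected task will propose to its next choice
```

The resulting assignment is stable: no task and server both prefer each other over their current match, which is the property matching-based resource allocation schemes typically rely on.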
Article
Terrestrial communication networks mainly focus on users in urban areas but have poor coverage performance in harsh environments, such as mountains, deserts, and oceans. Satellites can be exploited to extend the coverage of terrestrial fifth-generation (5G) networks. However, satellites are restricted by their high latency and relatively low data rate. Consequently, the integration of terrestrial and satellite components has been widely studied, to take advantage of both sides and enable the seamless broadband coverage. Due to the significant differences between satellite communications (SatComs) and terrestrial communications (TerComs) in terms of channel fading, transmission delay, mobility, and coverage performance, the establishment of an efficient hybrid satellite-terrestrial network (HSTN) still faces many challenges. In general, it is difficult to decompose a HSTN into a sum of separate satellite and terrestrial links due to the complicated coupling relationships therein. To uncover the complete picture of HSTNs, we regard the HSTN as a combination of basic cooperative models that contain the main traits of satellite-terrestrial integration but are much simpler and thus more tractable than the large-scale heterogeneous HSTNs. In particular, we present three basic cooperative models, i.e., model X, model L, and model V, and provide a survey of the state-of-the-art technologies for each of them. We discuss future research directions towards establishing a cell-free, hierarchical, decoupled HSTN. We also outline open issues to envision an agile, smart, and secure HSTN for the sixth-generation (6G) ubiquitous Internet of Things (IoT).
Article
Low Earth orbit (LEO) satellite networks can break through geographical restrictions and achieve global wireless coverage, which makes them an indispensable choice for future mobile communication systems. In this paper, we present a hybrid cloud and edge computing LEO satellite (CECLS) network with a three-tier computation architecture, which can provide ground users with heterogeneous computation resources and enable them to obtain computation services around the world. With the CECLS architecture, we investigate the computation offloading decisions to minimize the sum energy consumption of ground users, while satisfying the constraints on the coverage time and the computation capability of each LEO satellite. The considered problem is discrete and non-convex since the objective function and constraints contain binary variables, which makes it difficult to solve. To address this challenging problem, we convert the original non-convex problem into a linear programming problem by relaxing the binary variables. Then, we propose a distributed algorithm that leverages the alternating direction method of multipliers (ADMM) to approximate the optimal solution with low computational complexity. Simulation results show that the proposed algorithm can effectively reduce the total energy consumption of ground users.
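The binary-relaxation idea can be sketched on a toy offloading instance. The numbers below are illustrative, and the continuous relaxation here reduces to a fractional knapsack solved greedily rather than the LP-plus-ADMM machinery of the paper: relax the 0/1 offloading variables to [0, 1], solve the continuous problem, then round back to a feasible binary decision.

```python
# Toy illustration of the binary-relaxation idea for offloading decisions.
# The relaxed problem here is a fractional knapsack solved greedily (the
# paper instead solves the relaxed LP with ADMM); all numbers are assumed.
e_loc = [8.0, 5.0, 9.0, 4.0]   # local-execution energy per device (assumed)
e_off = [2.0, 3.0, 1.0, 3.5]   # offloading energy per device (assumed)
load  = [3.0, 2.0, 4.0, 1.0]   # satellite load if offloaded (assumed)
C = 6.0                        # satellite computation capacity (assumed)

# Order devices by energy saved per unit of satellite load consumed
order = sorted(range(len(load)),
               key=lambda i: (e_loc[i] - e_off[i]) / load[i], reverse=True)

x = [0.0] * len(load)          # relaxed offloading variables in [0, 1]
cap = C
for i in order:
    if e_loc[i] > e_off[i]:    # offloading must actually save energy
        x[i] = min(1.0, cap / load[i])
        cap -= x[i] * load[i]

# Round back to binary: keep fully-offloaded devices, then greedily repair
x_bin = [1 if xi >= 1.0 else 0 for xi in x]
cap_left = C - sum(load[i] for i in range(len(load)) if x_bin[i])
for i in order:
    if not x_bin[i] and e_loc[i] > e_off[i] and load[i] <= cap_left:
        x_bin[i] = 1
        cap_left -= load[i]

energy = sum(e_off[i] if x_bin[i] else e_loc[i] for i in range(len(load)))
assert sum(load[i] for i in range(len(load)) if x_bin[i]) <= C
```

Rounding a relaxed solution is generally suboptimal, which is why papers in this line pair the relaxation with careful recovery schemes or distributed solvers such as ADMM.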
Article
Current fifth generation (5G) cellular networks mainly focus on the terrestrial scenario. Due to the difficulty of deploying communications infrastructure on the ocean, the performance of existing maritime communication networks (MCNs) is far behind 5G. This problem can be solved by using unmanned aerial vehicles (UAVs) as agile aerial platforms to enable on-demand maritime coverage, as a supplement to marine satellites and shore-based terrestrial base stations (TBSs). In this article, we study the integration of UAVs with existing MCNs and investigate the potential gains of hybrid satellite-UAV-terrestrial networks for maritime coverage. Unlike the terrestrial scenario, vessels on the ocean keep to sea lanes and are sparsely distributed. This provides new opportunities to ease the scheduling of UAVs. Also, new challenges arise due to the more complicated maritime propagation environment, as well as the mutual interference between UAVs and existing satellites/TBSs. We discuss these issues and show possible solutions considering practical constraints.
Article
Internet of Things (IoT) computation offloading is a challenging issue, especially in remote areas where common edge/cloud infrastructure is unavailable. In this paper, we present a space-air-ground integrated network (SAGIN) edge/cloud computing architecture for offloading computation-intensive applications under the energy and computation constraints of remote areas, where flying unmanned aerial vehicles (UAVs) provide near-user edge computing and satellites provide access to cloud computing. First, for UAV edge servers, we propose a joint resource allocation and task scheduling approach to efficiently allocate computing resources to virtual machines and schedule the offloaded tasks. Second, we investigate the computation offloading problem in SAGIN and propose a learning-based approach to learn the optimal offloading policy from dynamic SAGIN environments. Specifically, we formulate offloading decision making as a Markov decision process whose system state captures the network dynamics. To cope with the system dynamics and complexity, we propose a deep reinforcement learning-based computation offloading approach to learn the optimal offloading policy on the fly, where we adopt the policy gradient method to handle the large action space and the actor-critic method to accelerate the learning process. Simulation results show that the proposed edge virtual machine allocation and task scheduling approach achieves near-optimal performance with very low complexity, and that the proposed learning-based computation offloading algorithm not only converges fast but also achieves a lower total cost than other offloading approaches.
Article
Different Internet of Things (IoT) applications demand different levels of intelligence and efficiency in processing data. Multi-tier computing, which integrates cloud, fog and edge computing technologies, will be required in order to deliver future IoT services.
Article
The high-speed satellite-terrestrial network (STN) is an indispensable alternative in future mobile communication systems. In this article, we first introduce the architecture and application scenarios of STNs, and then investigate possible ways to implement the mobile edge computing (MEC) technique for QoS improvement in STNs. We propose satellite MEC (SMEC), in which a user equipment without a proximal MEC server can also enjoy MEC services via satellite links. We propose a dynamic network virtualization technique to integrate the network resources, and further design a cooperative computation offloading (CCO) model to achieve parallel computation in STNs. Task scheduling models in SMEC are discussed in detail, and a basic simulation is conducted to evaluate the performance of the proposed CCO model in SMEC.
Article
Due to the increasing demands of onboard sensor and autonomous processing, one of the principal needs and challenges for future spacecraft is onboard computing. Space computers must provide high performance and reliability (which are often at odds), using limited resources (power, size, weight, and cost), in an extremely harsh environment (due to radiation, temperature, vacuum, and vibration). As spacecraft shrink in size, while assuming a growing role for science and defense missions, the challenges for space computing become particularly acute. For example, processing capabilities on CubeSats (smaller class of SmallSats) have been extremely limited to date, often featuring microcontrollers with performance and reliability barely sufficient to operate the vehicle let alone support various sensor and autonomous applications. This article surveys the challenges and opportunities of onboard computers for small satellites (SmallSats) and focuses upon new concepts, methods, and technologies that are revolutionizing their capabilities, in terms of two guiding themes: hybrid computing and reconfigurable computing. These innovations are of particular need and value to CubeSats and other Smallsats. With new technologies, such as CHREC Space Processor (CSP), we demonstrate how system designers can exploit hybrid and reconfigurable computing on SmallSats to harness these advantages for a variety of purposes, and we highlight several recent missions by NASA and industry that feature these principles and technologies.
  • Yanmin Wang
Yanmin Wang (yanmin-226@163.com) received the B.S. degree from Shandong University, China, in 2008, and the Ph.D. degree from the Department of Electronic Engineering, Tsinghua University, Beijing, China, in 2013. She is currently an Associate Professor at the School of Information Engineering, Minzu University of China. Her research interests include distributed antenna systems and satellite networks.
YunFei Chen received his B.E. and M.E. degrees in electronics engineering from Shanghai Jiaotong University, Shanghai, P.R.China, in 1998 and 2001, respectively. He received his Ph.D. degree from the University of Alberta in 2006. He is currently working as a Professor in the Department of Engineering at the University of Durham, U.K. His research interests include wireless communications, performance analysis, joint radar communications designs.
  • Cheng-Xiang Wang
Cheng-Xiang Wang is a Professor at Southeast University, and a part-time Professor with Purple Mountain Laboratories, China. He has authored 4 books and more than 490 journal/conference papers, including 26 highly cited papers. His research interests include wireless channel measurements and modeling, 6G wireless communication networks, and electromagnetic information theory.