Task offloading in the Internet of Vehicles (IoV) involves numerous steps and optimization variables, such as where to offload tasks, how to allocate computation resources, and how to adjust the offloading ratio and transmit power, and these variables are highly coupled with one another. Optimizing them so as to sustainably reduce energy consumption with load balancing, while ensuring that each task is completed before its deadline, is therefore highly challenging. In this paper, we first formulate such task offloading under energy and deadline constraints in IoV as a Mixed Integer Nonlinear Programming (MINLP) problem. To solve the formulated MINLP efficiently, we decompose it into two subproblems and design a low-complexity Joint Optimization for Energy Consumption and Task Processing Delay (JOET) algorithm to optimize selection decisions, resource allocation, offloading ratio, and transmit power adjustment. We carry out extensive simulation experiments to validate JOET. Simulation results demonstrate that JOET outperforms many representative existing approaches, converging quickly while effectively reducing energy consumption and delay: average energy consumption and task processing delay are reduced by 15.93% and 15.78%, respectively, and load balancing efficiency is increased by 10.20%.
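To make the coupling between offloading ratio, transmit power, and the energy/deadline trade-off concrete, the following Python sketch evaluates a standard partial-offloading cost model and brute-force searches the two continuous variables. All parameter values and the specific delay/energy expressions are illustrative assumptions, not the paper's exact MINLP formulation or the JOET decomposition.

```python
import numpy as np

# Minimal sketch of a partial-offloading cost model (assumed parameters,
# not the paper's exact MINLP formulation or the JOET algorithm).
B = 10e6        # channel bandwidth (Hz), assumed
g = 1e-6        # channel gain, assumed
N0 = 1e-13      # noise power (W), assumed
kappa = 1e-27   # effective switched capacitance, assumed
f_loc = 1e9     # local CPU frequency (cycles/s), assumed
f_edge = 5e9    # edge CPU frequency allocated to the task, assumed
D = 2e6         # task input size (bits), assumed
C = 1e9         # task workload (CPU cycles), assumed
T_max = 0.5     # task deadline (s), assumed

def delay_energy(rho, p):
    """Delay and vehicle-side energy when a fraction rho is offloaded at power p."""
    rate = B * np.log2(1 + p * g / N0)           # Shannon uplink rate
    t_loc = (1 - rho) * C / f_loc                # local computing delay
    t_tx = rho * D / rate                        # uplink transmission delay
    t_edge = rho * C / f_edge                    # edge computing delay
    delay = max(t_loc, t_tx + t_edge)            # local and remote parts run in parallel
    energy = kappa * (1 - rho) * C * f_loc**2 + p * t_tx
    return delay, energy

# Brute-force grid search over offloading ratio and transmit power,
# keeping only (rho, p) pairs that meet the deadline.
best = min(
    ((rho, p) for rho in np.linspace(0, 1, 101)
     for p in np.linspace(0.01, 1.0, 100)
     if delay_energy(rho, p)[0] <= T_max),
    key=lambda x: delay_energy(*x)[1],
)
print("best offloading ratio %.2f, transmit power %.2f W" % best)
```

Even this toy search illustrates why a low-complexity decomposition matters: the grid grows multiplicatively with each coupled variable, and the paper's MINLP adds discrete selection decisions on top.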
Edge computing-enhanced Internet of Vehicles (EC-IoV) enables ubiquitous data processing and content sharing among vehicles and terrestrial edge computing (TEC) infrastructures (e.g., 5G base stations and roadside units) with little or no human intervention, and plays a key role in intelligent transportation systems. However, EC-IoV depends heavily on the connections and interactions between vehicles and TEC infrastructures, and thus breaks down in remote areas where TEC infrastructures are unavailable (e.g., deserts, isolated islands, and disaster-stricken areas). Offering ubiquitous connections and global coverage, space-air-ground integrated networks (SAGINs) support seamless coverage and efficient resource management, and represent the next frontier for edge computing. In light of this, we first review the state-of-the-art edge computing research for SAGINs in this article. After discussing several existing orbital and aerial edge computing architectures, we propose a framework of edge computing-enabled space-air-ground integrated networks (EC-SAGINs) to support various IoV services for vehicles in remote areas. The main objective of the framework is to minimize task completion time and satellite resource usage. To this end, a pre-classification scheme is presented to reduce the size of the action space, and a deep imitation learning (DIL) driven offloading and caching algorithm is proposed to achieve real-time decision making. Simulation results show the effectiveness of our proposed scheme. Finally, we discuss several technical challenges and future directions.
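The core idea of imitation-learning-based offloading is that an expensive offline "expert" (e.g., exhaustive search) labels states with good decisions, and a lightweight model then imitates those labels at inference time. The Python sketch below illustrates this pattern with synthetic data; the state features, the toy expert rule, and the classifier are all assumptions for illustration, not the paper's DIL architecture or pre-classification scheme.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Minimal sketch of imitation-learning-based offloading (assumed setup).
rng = np.random.default_rng(0)

# State: [task size, deadline, satellite rate, ground rate]; all synthetic.
X = rng.uniform(size=(5000, 4))

def expert_action(s):
    """Toy expert: offload to the faster link unless the task is tiny."""
    size, deadline, sat_rate, gnd_rate = s
    if size < 0.2:
        return 0                              # 0 = process locally
    return 1 if sat_rate > gnd_rate else 2    # 1 = satellite, 2 = ground/aerial

y = np.array([expert_action(s) for s in X])

# Rough stand-in for pre-classification: drop states whose action is
# trivially determined (here, near-zero deadlines), shrinking the problem.
mask = X[:, 1] > 0.1
policy = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
policy.fit(X[mask], y[mask])

# Real-time decision making is a single forward pass.
state = np.array([[0.6, 0.5, 0.8, 0.3]])
print("offloading decision:", policy.predict(state)[0])  # likely 1 (satellite)
```

The appeal for SAGINs is the inference cost: once trained offline, the decision reduces to one forward pass, which is what makes real-time operation feasible on resource-limited nodes.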
The Internet of Things (IoT) is an innovative paradigm envisioned to provide massive applications that are now part of our daily lives. Millions of smart devices are deployed within complex networks to provide vibrant functionalities, including communication, monitoring, and control of critical infrastructures. However, this massive growth of IoT devices and the corresponding huge data traffic generated at the edge of the network place additional burdens on the state-of-the-art centralized cloud computing paradigm due to bandwidth and resource scarcity. Hence, edge computing (EC) is emerging as an innovative strategy that brings data processing and storage close to end users, leading to what is called EC-assisted IoT. Although this paradigm provides unique features and enhanced quality of service (QoS), it also introduces significant risks to data security and privacy. This paper conducts a comprehensive survey on security and privacy issues in the context of EC-assisted IoT. In particular, we first present an overview of EC-assisted IoT, including definitions, applications, architecture, advantages, and challenges. Second, we define security and privacy in the context of EC-assisted IoT. Then, we extensively discuss the major classes of attacks in EC-assisted IoT and provide possible solutions and countermeasures along with the related research efforts. After that, we further classify some security and privacy issues discussed in the literature based on security services and on security objectives and functions. Finally, several open challenges and future research directions for the secure EC-assisted IoT paradigm are extensively discussed.
Pervasive applications are revolutionizing the perception that users have of the environment. Indeed, pervasive applications perform resource-intensive computations over large amounts of streaming sensor data collected from multiple sources. This allows applications to provide richer and deeper insights into the natural characteristics that govern everything that surrounds us. A key limitation of these applications is their high energy footprint, which in turn hampers the quality of experience of users. While cloud and edge computing solutions can be applied to alleviate the problem, these solutions are hard to adopt in existing architectures and far from becoming ubiquitous. Fortunately, cloudlets are becoming portable enough that they can be transported and integrated into any environment easily and dynamically. In this article, we investigate how cloudlets can be transported by unmanned autonomous vehicles (UAVs) to provide computation support at the edge. Based on our study, we develop GEESE, a novel UAV-based system that enables the dynamic deployment of an edge computing infrastructure through the cooperation of multiple UAVs carrying cloudlets. Using GEESE, we conduct rigorous experiments to analyze the effort to deliver cloudlets using aerial, ground, and underwater UAVs. Our results indicate that UAVs can work in a cooperative manner to enable edge computing in the wild.
As one of the most promising applications of the future Internet of Things, the Internet of Vehicles (IoV) has been acknowledged as a fundamental technology for developing Intelligent Transportation Systems in smart cities. With the emergence of sixth generation (6G) communications technologies, massive network infrastructures will be densely deployed and the number of network nodes will increase exponentially, leading to extremely high energy consumption. There has been an upsurge of interest in developing green IoV towards sustainable vehicular communication and networking in the 6G era. In this paper, we present the main considerations for green IoV across five scenarios: communication, computation, traffic, Electric Vehicles (EVs), and energy harvesting management. The literature relevant to each scenario is compared from the perspective of energy optimization (e.g., with respect to resource allocation, workload scheduling, routing design, traffic control, charging management, and energy harvesting and sharing) and the related factors affecting energy efficiency (e.g., resource limitations, channel state, network topology, and traffic conditions). In addition, we introduce the potential challenges and the emerging 6G technologies for developing green IoV systems. Finally, we discuss research trends in designing energy-efficient IoV systems.
The heterogeneity of the Internet of Things (IoT) network can be exploited as a dynamic computational resource environment for the many devices lacking computational capabilities. A smart mechanism is needed for allocating edge and mobile computers to match the needs of devices requesting external computational resources. In this paper, we employ the concept of the Social IoT (SIoT) and machine learning to reduce the complexity of allocating appropriate edge computers. We propose a framework that detects communities of devices in the SIoT comprising trustworthy peers with strong social relations. Afterwards, we train a machine learning algorithm, considering multiple computational and non-computational features of both the requester and the edge computers, to predict the total time needed to process the required task by each potential candidate belonging to the requester's community. Applying the framework to a real-world data set, we observe that it provides encouraging results for mobile computer allocation.
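The allocation step described above reduces to a supervised regression problem: predict the total processing time for each community candidate, then pick the minimum. The Python sketch below illustrates this pipeline on synthetic data; the feature set, the ground-truth function, and the random-forest model are illustrative assumptions, not the paper's exact features or learner.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Minimal sketch of the community-based allocation idea (assumed features).
rng = np.random.default_rng(1)

# Features per (requester, candidate) pair: [task cycles, candidate CPU speed,
# candidate load, link quality, social trust score]; all synthetic.
X = rng.uniform(size=(2000, 5))

# Toy ground truth: processing time grows with cycles/speed and load,
# shrinks with link quality; noise stands in for unmodeled effects.
y = X[:, 0] / (X[:, 1] + 0.1) + X[:, 2] - 0.5 * X[:, 3] + rng.normal(0, 0.05, 2000)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Candidates drawn from the requester's SIoT community (synthetic rows);
# the requester offloads to the candidate with the lowest predicted time.
candidates = rng.uniform(size=(5, 5))
pred_times = model.predict(candidates)
print("allocate to candidate", int(np.argmin(pred_times)),
      "with predicted time %.3f" % pred_times.min())
```

Restricting prediction to candidates inside the requester's community is what keeps the search tractable: the model only scores a small, pre-trusted set instead of the whole network.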