Undoubtedly, Mobile Augmented Reality (MAR) applications for 5G and beyond wireless networks have attracted notable attention recently. However, they require significant computational and storage resources at the end device and/or in the network via Edge Cloud (EC) support. In this work, a MAR service is considered through the lens of microservices, where MAR service components can be decomposed and anchored at different locations, ranging from the end device to different ECs, in order to optimize overall service and network efficiency. To this end, we propose a mobility-aware MAR service decomposition scheme that uses a Long Short-Term Memory (LSTM) deep neural network to provide efficient proactive decision making in real time. More specifically, the LSTM deep neural network is trained offline with optimal solutions derived from a mathematical programming formulation; decision making at the inference stage is then used to optimize the service decomposition of MAR services. A wide set of numerical investigations reveals that the mobility-aware LSTM deep neural network outperforms recently proposed schemes in terms of both decision-making quality and computational time.
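To make the pipeline concrete, the sketch below illustrates the inference side of such a scheme: an LSTM-based policy that maps a sequence of user-mobility features to a service-decomposition decision, i.e. which anchor (the end device or one of K candidate ECs) should host the offloadable MAR components. This is a hypothetical minimal sketch, not the paper's implementation: the feature dimension, hidden size, number of anchors, and all weights are placeholder assumptions — in the proposed approach the weights would be fitted offline by imitating optimal solutions of the mathematical program.

```python
# Hypothetical sketch of an LSTM decomposition policy (random placeholder
# weights; in the actual scheme these come from offline supervised training
# on optimizer-derived labels).
import math
import random

random.seed(0)

D, H, K = 4, 8, 3   # mobility-feature dim, hidden size, candidate anchors

def _mat(rows, cols):
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

# One weight matrix per LSTM gate (input, forget, cell, output), acting on [x; h].
W = {g: _mat(H, D + H) for g in ("i", "f", "g", "o")}
b = {g: [0.0] * H for g in ("i", "f", "g", "o")}
W_out = _mat(K, H)                      # readout: final hidden state -> anchor scores

def _sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def _affine(Wg, bg, xh):
    return [sum(w * v for w, v in zip(row, xh)) + bi for row, bi in zip(Wg, bg)]

def lstm_policy(trace):
    """trace: list of D-dim mobility feature vectors (e.g. position, speed)."""
    h, c = [0.0] * H, [0.0] * H
    for x in trace:                     # standard LSTM cell recurrence
        xh = x + h
        i = [_sigmoid(v) for v in _affine(W["i"], b["i"], xh)]
        f = [_sigmoid(v) for v in _affine(W["f"], b["f"], xh)]
        g = [math.tanh(v) for v in _affine(W["g"], b["g"], xh)]
        o = [_sigmoid(v) for v in _affine(W["o"], b["o"], xh)]
        c = [fj * cj + ij * gj for fj, cj, ij, gj in zip(f, c, i, g)]
        h = [oj * math.tanh(cj) for oj, cj in zip(o, c)]
    scores = [sum(w * hj for w, hj in zip(row, h)) for row in W_out]
    m = max(scores)                     # softmax over candidate anchors
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

trace = [[random.uniform(0, 1) for _ in range(D)] for _ in range(5)]
probs = lstm_policy(trace)
anchor = max(range(K), key=lambda k: probs[k])  # proactive placement decision
```

Because inference is a single forward pass over the recent mobility trace, the decision can be recomputed in real time as the user moves, which is the source of the computational-time advantage over re-solving the mathematical program online.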