
5G MEC Computation Handoff for Mobile Augmented Reality

Published by: Pengyuan Zhou
Publication date: 2021
Research field: Informatics engineering
Paper language: English





The combination of 5G and Multi-access Edge Computing (MEC) can significantly reduce application delay by lowering transmission delay and bringing computational capabilities closer to the end user. Therefore, 5G MEC could enable excellent user experience in applications like Mobile Augmented Reality (MAR), which are computation-intensive, and delay and jitter-sensitive. However, existing 5G handoff algorithms often do not consider the computational load of MEC servers, are too complex for real-time execution, or do not integrate easily with the standard protocol stack. Thus they can impair the performance of 5G MEC. To address this gap, we propose Comp-HO, a handoff algorithm that finds a local solution to the joint problem of optimizing signal strength and computational load. Additionally, Comp-HO can easily be integrated into current LTE and 5G base stations thanks to its simplicity and standard-friendly deployability. Specifically, we evaluate Comp-HO through a custom NS-3 simulator which we calibrate via MAR prototype measurements from a real-world 5G testbed. We simulate both Comp-HO and several classic handoff algorithms. The results show that, even without a global optimum, the proposed algorithm still significantly reduces the number of large delays, caused by congestion at MECs, at the expense of a small increase in transmission delay.
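The abstract describes Comp-HO as jointly weighing signal strength and MEC computational load when choosing a handoff target. The following stdlib-only sketch illustrates that idea under stated assumptions; the scoring form, weights, and hysteresis margin are illustrative and are not taken from the paper.

```python
# Hypothetical sketch of a Comp-HO-style decision (not the paper's exact
# algorithm): each candidate base station is scored by combining measured
# signal strength with the load of its co-located MEC server, and a handoff
# fires only when a candidate beats the serving cell by a hysteresis margin.

from dataclasses import dataclass

@dataclass
class Cell:
    name: str
    rsrp_dbm: float   # measured signal strength (higher is better)
    mec_load: float   # MEC server utilization in [0, 1]

def score(cell, w_signal=1.0, w_load=50.0):
    """Higher is better: strong signal, lightly loaded MEC server."""
    return w_signal * cell.rsrp_dbm - w_load * cell.mec_load

def choose_target(serving, candidates, hysteresis_db=3.0):
    """Stay on `serving` unless a candidate's joint score exceeds it
    by the hysteresis margin (suppresses ping-pong handoffs)."""
    best = max(candidates, key=score, default=None)
    if best is not None and score(best) > score(serving) + hysteresis_db:
        return best
    return serving

serving = Cell("gNB-A", rsrp_dbm=-85.0, mec_load=0.9)   # strong but congested
neighbor = Cell("gNB-B", rsrp_dbm=-95.0, mec_load=0.1)  # weaker but idle
print(choose_target(serving, [neighbor]).name)  # → gNB-B
```

Note how the load term lets a lightly loaded but weaker cell win, which mirrors the abstract's trade-off: a small increase in transmission delay in exchange for avoiding congestion-induced large delays at the MEC.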




Read also

Mobile Augmented Reality (MAR) mixes physical environments with user-interactive virtual annotations. Immersive MAR experiences are supported by computation-intensive tasks which rely on offloading mechanisms to ease device workloads. However, this introduces additional network traffic which in turn influences the motion-to-photon latency (a determinant of user-perceived quality of experience). Therefore, a proper transport protocol is crucial to minimise transmission latency and ensure sufficient throughput to support MAR performance. Relatedly, 5G, a potential MAR-supporting technology, is widely believed to be smarter, faster, and more efficient than its predecessors. However, the suitability and performance of existing transport protocols for MAR in the 5G context have not been explored. Therefore, we present an evaluation of popular transport protocols, including UDP, TCP, MPEG-TS, RTP, and QUIC, with a MAR system on a real-world 5G testbed. We also compare their 5G performance with that of LTE and WiFi. Our evaluation results indicate that TCP has the lowest round-trip time on 5G, with a median of $15.09\pm0.26$ ms, while QUIC appears to perform better on LTE. Through an additional test with varying signal quality (specifically, degrading secondary synchronisation signal reference signal received quality), we discover that protocol performance appears to be significantly impacted by signal quality.
Lifan Mei, Jinrui Gou, Yujin Cai (2021)
Mobile apps are increasingly relying on high-throughput and low-latency content delivery, while the available bandwidth on wireless access links is inherently time-varying. The handoffs between base stations and access modes due to user mobility present additional challenges to deliver a high level of user Quality-of-Experience (QoE). The ability to predict the available bandwidth and the upcoming handoffs will give applications valuable leeway to make proactive adjustments to avoid significant QoE degradation. In this paper, we explore the possibility and accuracy of real-time mobile bandwidth and handoff predictions in 4G/LTE and 5G networks. Towards this goal, we collect long consecutive traces with rich bandwidth, channel, and context information from public transportation systems. We develop Recurrent Neural Network models to mine the temporal patterns of bandwidth evolution in fixed-route mobility scenarios. Our models consistently outperform the conventional univariate and multivariate bandwidth prediction models. For coexisting 4G and 5G networks, we propose a new problem of handoff prediction between 4G and 5G, which is important for low-latency applications like self-driving strategy in realistic 5G scenarios. We develop classification- and regression-based prediction models, which achieve more than 80% accuracy in predicting 4G and 5G handoffs in a recent 5G dataset.
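The abstract above frames 4G/5G handoff prediction as a classification task over recent channel observations. The paper itself uses recurrent neural networks; the stdlib-only sketch below substitutes a deliberately simple rule to illustrate the task's shape (window of signal-quality samples in, binary handoff prediction out). The threshold and feature choice are assumptions, not the paper's model.

```python
# Toy stand-in for the handoff-prediction task: flag a likely 5G->4G
# handoff when the recent signal-quality window (e.g. RSRQ in dB) has
# degraded past a floor. A real predictor would feed such windows to an
# RNN; this rule only illustrates the input/output contract.

from statistics import mean

def predict_handoff(rsrq_window, floor=-14.0):
    """Hypothetical rule: predict a handoff when both the window mean
    and the most recent sample are below `floor` (dB)."""
    return mean(rsrq_window) < floor and rsrq_window[-1] < floor

stable = [-10.2, -10.5, -10.1, -10.4]   # healthy 5G link
fading = [-13.0, -14.5, -15.2, -16.1]   # degrading toward handoff
print(predict_handoff(stable), predict_handoff(fading))  # → False True
```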
Undoubtedly, Mobile Augmented Reality (MAR) applications for 5G and Beyond wireless networks have attracted notable attention recently. However, they require significant computational and storage resources at the end device and/or the network via Edge Cloud (EC) support. In this work, a MAR service is considered under the lenses of microservices where MAR service components can be decomposed and anchored at different locations ranging from the end device to different ECs in order to optimize the overall service and network efficiency. To this end, we propose a mobility-aware MAR service decomposition using a Long Short Term Memory (LSTM) deep neural network to provide efficient proactive decision making in real-time. More specifically, the LSTM deep neural network is trained with optimal solutions derived from a mathematical programming formulation in an offline manner. Then, decision making at the inference stage is used to optimize service decomposition of MAR services. A wide set of numerical investigations reveals that the mobility-aware LSTM deep neural network manages to outperform recently proposed schemes in terms of both decision-making quality and computational time.
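The key pattern in the abstract above is "solve expensively offline, decide cheaply online": an optimizer produces labeled optimal placements, and a learned model imitates them at inference time. This stdlib-only sketch illustrates that pattern with a nearest-neighbour lookup standing in for the LSTM; the features, anchor labels, and sample points are all illustrative assumptions.

```python
# Offline/online split in the spirit of the decomposition scheme (not the
# paper's model): a table of precomputed optimal placements, indexed by
# mobility features, is queried at runtime by nearest neighbour.

# Offline stage: (user speed m/s, distance to edge cloud m) -> optimal anchor,
# as a mathematical program might have produced them.
OFFLINE_SOLUTIONS = {
    (0.0, 50.0): "edge-cloud",   # stationary, near the EC: offload everything
    (15.0, 400.0): "device",     # fast-moving, far away: keep work on device
    (5.0, 150.0): "split",       # moderate case: decompose across both
}

def decide(speed, distance):
    """Online inference: return the stored decision whose feature point
    is closest (squared Euclidean) to the observed (speed, distance)."""
    key = min(OFFLINE_SOLUTIONS,
              key=lambda k: (k[0] - speed) ** 2 + (k[1] - distance) ** 2)
    return OFFLINE_SOLUTIONS[key]

print(decide(14.0, 380.0))  # → device (closest to the fast-moving sample)
```

An LSTM replaces the lookup with a model that generalizes between the sampled points and exploits the temporal order of mobility observations, which is what makes the decision proactive rather than reactive.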
This paper investigates an unmanned aerial vehicle (UAV)-assisted wireless powered mobile-edge computing (MEC) system, where the UAV powers the mobile terminals by wireless power transfer (WPT) and provides computation service for them. We aim to maximize the computation rate of terminals while ensuring fairness among them. Considering the random trajectories of mobile terminals, we propose a soft actor-critic (SAC)-based UAV trajectory planning and resource allocation (SAC-TR) algorithm, which combines off-policy and maximum entropy reinforcement learning to promote the convergence of the algorithm. We design the reward as a heterogeneous function of computation rate, fairness, and reaching of destination. Simulation results show that SAC-TR can quickly adapt to varying network environments and outperform representative benchmarks in a variety of situations.
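The abstract above describes a reward built from three heterogeneous terms: computation rate, fairness, and destination arrival. A standard way to quantify the fairness term is Jain's index; the weights and reward shape below are assumptions for illustration, not the paper's actual design.

```python
# Sketch of a heterogeneous reward in the spirit of SAC-TR: total
# computation rate + Jain's fairness index across terminals + a bonus
# when the UAV reaches its destination. Weights are illustrative.

def jain_fairness(rates):
    """Jain's index: 1.0 when all rates are equal, approaching 1/n
    when a single terminal dominates."""
    if not any(rates):
        return 0.0
    return sum(rates) ** 2 / (len(rates) * sum(r * r for r in rates))

def reward(rates, reached, w_rate=1.0, w_fair=10.0, bonus=100.0):
    """Hypothetical per-step reward for the RL agent."""
    return (w_rate * sum(rates)
            + w_fair * jain_fairness(rates)
            + (bonus if reached else 0.0))

equal = [5.0, 5.0, 5.0]    # perfectly fair allocation
skewed = [14.0, 0.5, 0.5]  # same total rate, one terminal dominates
print(round(jain_fairness(equal), 3), round(jain_fairness(skewed), 3))
```

Because both allocations have the same total rate, only the fairness term separates them, which is exactly why a rate-only reward would not ensure fairness among terminals.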
Edge computing that leverages cloud resources to the proximity of user devices is seen as the future infrastructure for distributed applications. However, developing and deploying edge applications that rely on cellular networks is burdensome. Such network infrastructures are often based on proprietary components, each with unique programming abstractions and interfaces. To facilitate straightforward deployment of edge applications, we introduce an OSS-based RAN on over-the-air (OTA) commercial spectrum with DevOps capabilities. OSS allows software modifications and integrations of the system components, e.g., EPC and edge hosts running applications, required for new data pipelines and optimizations not addressed in standardization. Such an OSS infrastructure enables further research and prototyping of novel end-user applications in an environment familiar to software engineers without a telecommunications background. We evaluated the presented infrastructure with E2E OTA testing, resulting in 7.5 MB/s throughput and a latency of 21 ms, which shows that the presented infrastructure provides low latency for edge applications.