
Crowd-MECS: A Novel Crowdsourcing Framework for Mobile Edge Caching and Sharing

Posted by: Changkun Jiang
Publication date: 2020
Research field: Informatics Engineering
Paper language: English





Crowdsourced mobile edge caching and sharing (Crowd-MECS) is emerging as a promising content delivery paradigm that employs a large crowd of existing edge devices (EDs) to cache and share popular contents. The successful technology adoption of Crowd-MECS relies on a comprehensive understanding of the complicated economic interactions and strategic decision-making of different stakeholders. In this paper, we focus on studying the economic and strategic interactions between one content provider (CP) and a large crowd of EDs, where the EDs can decide whether to cache and share contents for the CP, and the CP can decide to share a certain revenue with the EDs as an incentive for caching and sharing contents. We formulate such an interaction as a two-stage Stackelberg game. In Stage I, the CP aims to maximize its own profit by deciding the ratio of revenue shared with the EDs. In Stage II, the EDs aim to maximize their own payoffs by choosing to be agents, who cache and share contents and meanwhile gain a certain revenue from the CP, or requesters, who do not cache but request contents in an on-demand fashion. We first analyze the EDs' best responses and prove the existence and uniqueness of the equilibrium in Stage II using non-atomic game theory. Then, we identify the piecewise structure and the unimodal feature of the CP's profit function, based on which we design a tailored low-complexity one-dimensional search algorithm to achieve the optimal revenue-sharing ratio for the CP in Stage I. Simulation results show that both the CP's profit and the EDs' total welfare can be improved significantly (e.g., by 120% and 50%, respectively) by using the proposed Crowd-MECS, compared with the Non-MEC system where the CP serves all EDs directly.
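The abstract describes the Stage-I optimization as a low-complexity one-dimensional search over a profit function that is unimodal in the revenue-sharing ratio. The sketch below shows one generic search of that kind (golden-section), not the authors' tailored algorithm; `cp_profit` and its toy Stage-II response are hypothetical placeholders.

```python
# A minimal sketch, assuming only unimodality: golden-section search over the
# revenue-sharing ratio in [0, 1]. `cp_profit` is a hypothetical stand-in for
# the CP's Stage-I profit evaluated at the Stage-II equilibrium.
import math

def golden_section_max(f, lo=0.0, hi=1.0, tol=1e-4):
    """Maximize a unimodal function f on [lo, hi]."""
    phi = (math.sqrt(5) - 1) / 2          # golden-ratio conjugate, ~0.618
    a, b = lo, hi
    x1 = b - phi * (b - a)
    x2 = a + phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                       # maximum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + phi * (b - a)
            f2 = f(x2)
        else:                             # maximum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - phi * (b - a)
            f1 = f(x1)
    return (a + b) / 2

def cp_profit(rho):
    # Hypothetical placeholder: revenue retained by the CP minus a delivery cost,
    # where the fraction of EDs acting as agents grows with the shared ratio rho.
    agent_fraction = min(1.0, 2.0 * rho)  # toy Stage-II response, not from the paper
    return (1 - rho) * agent_fraction - 0.1 * (1 - agent_fraction)

best_rho = golden_section_max(cp_profit)
print(f"optimal revenue-sharing ratio ~ {best_rho:.3f}")
```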



Read also

Mobile edge caching is a promising technique to enhance content delivery quality and reduce backhaul link congestion by storing popular content at the network edge or on mobile devices (e.g., base stations and smartphones) that are proximate to content requesters. In this work, we study a novel mobile edge caching framework that enables mobile devices to cache and share popular contents with each other via device-to-device (D2D) links. We are interested in the following incentive problem of mobile device users: whether and which users are willing to cache and share what contents, taking user mobility and cost/reward into consideration. The problem is challenging in a large-scale network with a large number of users. We introduce evolutionary game theory, an effective tool for analyzing large-scale dynamic systems, to analyze the mobile users' content caching and sharing strategies. Specifically, we first derive the users' best caching and sharing strategies, then analyze how these best strategies change dynamically over time, and based on this we further characterize the system equilibrium systematically. Simulation results show that the proposed caching scheme outperforms existing schemes in terms of the total transmission cost and the cellular load. In particular, in our simulation, the total transmission cost can be reduced by 42.5%-55.2% and the cellular load can be reduced by 21.5%-56.4%.
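For readers unfamiliar with the evolutionary-game machinery this abstract refers to, the sketch below uses replicator dynamics, a standard way to model how the population share of each strategy drifts toward equilibrium. The two-strategy payoff function is a hypothetical toy, not the paper's model.

```python
# A hedged sketch of replicator dynamics for two caching strategies:
# "cache-and-share" vs. "request-only". Payoff numbers are assumptions.
import numpy as np

def payoffs(x):
    """Toy payoffs: sharing earns a reward that is diluted as more users cache,
    while requesters benefit from having more nearby caches."""
    share_payoff = 1.0 - 0.8 * x[0]
    request_payoff = 0.4 * x[0]
    return np.array([share_payoff, request_payoff])

def replicator_step(x, dt=0.05):
    u = payoffs(x)
    avg = float(x @ u)
    return x + dt * x * (u - avg)        # strategies above average payoff grow

x = np.array([0.5, 0.5])                 # initial population shares
for _ in range(2000):
    x = replicator_step(x)
print("equilibrium shares (cache-and-share, request-only):", np.round(x, 3))
```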
This letter proposes two novel proactive cooperative caching approaches using deep learning (DL) to predict users' content demand in a mobile edge caching network. In the first approach, a (central) content server takes responsibility for collecting information from all mobile edge nodes (MENs) in the network and then runs our proposed DL algorithm to predict the content demand for the whole network. However, such a centralized approach may disclose private information because the MENs have to share their local users' data with the content server. Thus, in the second approach, we propose a novel distributed deep learning (DDL) based framework. The DDL allows MENs in the network to collaborate and exchange information to reduce the error of content demand prediction without revealing the private information of mobile users. Through simulation results, we show that our proposed approaches can enhance the accuracy by reducing the root mean squared error (RMSE) by up to 33.7% and reduce the service delay by 36.1% compared with other machine learning algorithms.
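As a rough illustration of the distributed idea (edge nodes exchange model parameters rather than raw user data), the sketch below averages locally fitted least-squares predictors. The abstract does not specify the actual DDL architecture, so the model, data, and names here are assumptions.

```python
# A minimal sketch, assuming simple parameter averaging: each MEN fits a toy
# demand predictor on its private trace and only the weights are shared.
import numpy as np

rng = np.random.default_rng(0)

def local_fit(history):
    """Toy predictor: least-squares weights mapping demand at slot t to slot t+1."""
    X, y = history[:-1], history[1:]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# three MENs, each with a private demand trace for 4 contents (rows = time slots)
histories = [rng.random((21, 4)) for _ in range(3)]
train = [h[:-2] for h in histories]
test_in = [h[-2] for h in histories]      # last observed slot
test_out = [h[-1] for h in histories]     # slot to be predicted

local_models = [local_fit(h) for h in train]
# collaboration step: exchange and average parameters, never the raw histories
global_model = np.mean(local_models, axis=0)

errors = [x @ global_model - y for x, y in zip(test_in, test_out)]
rmse = float(np.sqrt(np.mean(np.square(errors))))
print("held-out RMSE with the shared model:", round(rmse, 3))
```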
This paper comprehensively studies a content-centric mobile network based on a preference learning framework, where each mobile user is equipped with a finite-size cache. We consider a practical scenario where each user requests a content file according to its own preferences, motivated by the heterogeneity in file preferences among different users. Under our model, we consider a single-hop device-to-device (D2D) content delivery protocol and characterize the average hit ratio for two file preference cases: personalized file preferences and common file preferences. Assuming that model parameters such as user activity levels, user file preferences, and file popularity are unknown and thus need to be inferred, we present a collaborative filtering (CF)-based approach to learn these parameters. Then, we reformulate the hit ratio maximization problems as submodular function maximization and propose two computationally efficient algorithms, including a greedy approach, to solve the cache allocation problems. We analyze the computational complexity of each algorithm. Moreover, we analyze the approximation guarantee that our greedy algorithm achieves relative to the optimal solution. Using a real-world dataset, we demonstrate that the proposed framework employing personalized file preferences brings substantial gains over its counterpart for various system parameters.
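The greedy submodular-maximization step mentioned above can be illustrated as follows: fill each user's cache slots with the file giving the largest marginal gain in the objective. The hit-ratio objective and preference numbers below are hypothetical placeholders, not the paper's formulation.

```python
# A minimal greedy sketch for cache allocation under per-user cache budgets,
# assuming a toy hit-ratio objective over a shared D2D neighborhood.
def hit_ratio(allocation, prefs):
    """Toy objective: a request hits if any user's cache holds the requested file,
    weighted by the requesting user's preference for that file."""
    cached = {f for files in allocation.values() for f in files}
    return sum(p for (u, f), p in prefs.items() if f in cached)

def greedy_cache(users, files, cache_size, prefs):
    allocation = {u: set() for u in users}
    for _ in range(cache_size):            # fill one cache slot per user per round
        for u in users:
            base = hit_ratio(allocation, prefs)
            best_f, best_gain = None, 0.0
            for f in files:
                if f in allocation[u]:
                    continue
                allocation[u].add(f)
                gain = hit_ratio(allocation, prefs) - base
                allocation[u].remove(f)
                if gain > best_gain:
                    best_f, best_gain = f, gain
            if best_f is not None:          # keep the file with the largest marginal gain
                allocation[u].add(best_f)
    return allocation

# toy per-(user, file) preference weights -- purely illustrative numbers
prefs = {("u1", "f1"): 0.5, ("u1", "f2"): 0.3, ("u1", "f3"): 0.2,
         ("u2", "f1"): 0.1, ("u2", "f2"): 0.6, ("u2", "f3"): 0.3}
print(greedy_cache(["u1", "u2"], ["f1", "f2", "f3"], cache_size=1, prefs=prefs))
```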
Due to the explosive growth of online video content in mobile wireless networks, in-network caching is becoming increasingly important to improve the end-user experience and reduce the Internet access cost for mobile network operators. However, caching is a difficult problem due to the very large number of online videos and video requests, the limited capacity of caching nodes, and the limited bandwidth of in-network links. Existing solutions that rely on static configurations and average request arrival rates are insufficient to handle dynamic request patterns effectively. In this paper, we propose a dynamic collaborative video caching framework to be deployed in mobile networks. We decompose the caching problem into a content placement subproblem and a source-selection subproblem. We then develop SRS (System capacity Reservation Strategy) to solve the content placement subproblem, and LinkShare, an adaptive traffic-aware algorithm, to solve the source selection subproblem. Our framework supports congestion avoidance and allows merging multiple requests for the same video into one request. We carry out extensive simulations to validate the proposed schemes. Simulation results show that our SRS algorithm achieves performance within 1-3% of the optimal values and LinkShare significantly outperforms existing solutions.
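Two mechanisms named in the abstract, request merging and traffic-aware source selection, admit a compact illustration. The sketch below is a generic interpretation with hypothetical names and load model, not the SRS or LinkShare algorithms themselves.

```python
# A hedged sketch: merge concurrent requests for the same video into one fetch,
# and otherwise pick the caching node whose link is least loaded (an assumption
# standing in for "traffic-aware" source selection).
from collections import defaultdict

pending = defaultdict(list)           # video_id -> requesters waiting on one fetch

def request(video_id, requester, cache_map, link_load):
    """Return the chosen source, merging duplicate in-flight requests."""
    if pending[video_id]:             # a fetch for this video is already in flight
        pending[video_id].append(requester)
        return "merged"
    pending[video_id].append(requester)
    sources = cache_map.get(video_id, [])
    if not sources:
        return "origin-server"        # no edge cache holds it; fall back to origin
    return min(sources, key=lambda node: link_load[node])   # least-loaded source

cache_map = {"v1": ["edgeA", "edgeB"]}
link_load = {"edgeA": 0.7, "edgeB": 0.3}
print(request("v1", "user1", cache_map, link_load))  # -> edgeB
print(request("v1", "user2", cache_map, link_load))  # -> merged
```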
Lixing Chen, Jie Xu, 2017
Mobile Edge Computing (MEC) pushes computing functionalities away from the centralized cloud to the proximity of data sources, thereby reducing service provision latency and saving backhaul network bandwidth. Although computation offloading has been extensively studied in the literature, service caching is an equally, if not more, important design topic of MEC, yet it receives much less attention. Service caching refers to caching application services and their related data (libraries/databases) in the edge server, e.g., an MEC-enabled base station (BS), enabling the corresponding computation tasks to be executed. Since only a small number of services can be cached in the resource-limited edge server at the same time, which services to cache has to be judiciously decided to maximize the system performance. In this paper, we investigate collaborative service caching in MEC-enabled dense small cell (SC) networks. We propose an efficient decentralized algorithm, called CSC (Collaborative Service Caching), where a network of small cell BSs optimize service caching collaboratively to address a number of key challenges in MEC systems, including service heterogeneity, spatial demand coupling, and decentralized coordination. Our algorithm is developed based on parallel Gibbs sampling, exploiting the special structure of the considered problem via graph coloring. The algorithm significantly improves time efficiency compared with conventional Gibbs sampling, yet guarantees provable convergence and optimality. CSC is further extended to SC networks with selfish BSs, where a coalitional game is formulated to incentivize collaboration. A coalition formation algorithm is developed by employing merge-and-split rules and ensures the stability of the SC coalitions.
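The graph-coloring trick behind parallel Gibbs sampling can be sketched briefly: color the BS interference graph so that non-adjacent BSs (same color) can resample their cached service at the same time without conflicting. The topology and utility function below are hypothetical, not the paper's CSC objective.

```python
# A hedged sketch of parallel Gibbs sampling via greedy graph coloring.
# The "utility" is a toy placeholder rewarding services not cached by neighbors.
import random
import math

def greedy_coloring(adj):
    """Assign each BS the smallest color not used by its neighbors."""
    colors = {}
    for bs in adj:
        used = {colors[n] for n in adj[bs] if n in colors}
        colors[bs] = next(c for c in range(len(adj)) if c not in used)
    return colors

def local_utility(bs, service, caches, adj):
    duplicated = any(caches[n] == service for n in adj[bs])
    return 0.2 if duplicated else 1.0

def parallel_gibbs(adj, services, iters=200, beta=5.0):
    caches = {bs: random.choice(services) for bs in adj}
    colors = greedy_coloring(adj)
    for _ in range(iters):
        for c in set(colors.values()):
            # BSs sharing a color have no common edges, so their Gibbs updates
            # depend only on other colors and can safely run in parallel
            for bs in [b for b in adj if colors[b] == c]:
                weights = [math.exp(beta * local_utility(bs, s, caches, adj))
                           for s in services]
                caches[bs] = random.choices(services, weights=weights)[0]
    return caches

adj = {"BS1": ["BS2"], "BS2": ["BS1", "BS3"], "BS3": ["BS2"]}
print(parallel_gibbs(adj, services=["video", "AR", "maps"]))
```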