We consider a set-valued online prediction problem in the context of network caching. Assume that multiple users are connected to several caches via a bipartite network. At any time slot, each user requests an arbitrary file chosen from a large catalog. A user's request at a slot is met if the requested file is cached in at least one of the caches connected to the user. Our objective is to predict, prefetch, and optimally distribute the files on the caches to maximize the total number of cache hits in an online setting. The problem is non-trivial due to the non-convex and non-smooth nature of the objective function. In this paper, we propose $\texttt{LeadCache}$, an online caching policy based on the Follow-the-Perturbed-Leader paradigm. We show that the policy is regret-optimal up to a factor of $\tilde{O}(n^{3/8})$, where $n$ is the number of users. We design two efficient implementations of the $\texttt{LeadCache}$ policy, one based on Pipage rounding and the other based on Madow's sampling, each of which makes precisely one call to an LP solver per iteration. Under a Strong-Law-type assumption, we show that the total number of file fetches under $\texttt{LeadCache}$ remains almost surely finite over an infinite horizon. Finally, we derive a tight regret lower bound using results from graph coloring. We conclude that the learning-based $\texttt{LeadCache}$ policy decisively outperforms known caching policies both theoretically and empirically.
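To illustrate the Follow-the-Perturbed-Leader idea underlying such a policy, the following is a minimal Python sketch for the simplified case of a single cache of capacity $C$ and a flat catalog (the bipartite multi-cache structure and the Pipage/Madow rounding steps of the paper are omitted). The perturbation scale `eta`, the function name `ftpl_cache_step`, and the top-$C$ rounding are illustrative assumptions, not the paper's exact construction; the sketch only shows the "perturb cumulative counts, then make one LP call per iteration" pattern.

```python
import numpy as np
from scipy.optimize import linprog

def ftpl_cache_step(counts, capacity, eta, rng):
    """Return a fractional cache allocation for the next slot.

    counts   -- cumulative per-file request counts observed so far
    capacity -- number of files the cache can hold
    eta      -- scale of the Gaussian perturbation (hypothetical choice)
    """
    n_files = len(counts)
    # Perturb the cumulative counts, then choose the allocation that would
    # have maximized hits on the perturbed counts (the "perturbed leader").
    perturbed = counts + eta * rng.standard_normal(n_files)
    # One LP call per iteration: maximize perturbed @ y subject to
    # 0 <= y <= 1 and sum(y) <= capacity (linprog minimizes, hence the sign flip).
    res = linprog(
        c=-perturbed,
        A_ub=np.ones((1, n_files)),
        b_ub=[capacity],
        bounds=[(0, 1)] * n_files,
        method="highs",
    )
    return res.x  # fractional allocation; still needs rounding to a cache state

# Toy usage: 10 files, cache of size 3, synthetic skewed requests.
rng = np.random.default_rng(0)
counts = np.zeros(10)
for t in range(100):
    y = ftpl_cache_step(counts, capacity=3, eta=np.sqrt(t + 1), rng=rng)
    cached = np.argsort(-y)[:3]     # naive rounding: keep the top-3 files
    req = rng.zipf(2.0) % 10        # observe this slot's request
    counts[req] += 1                # update counts after the slot
```

In this single-cache toy the LP is trivially solved by taking the top-$C$ perturbed files; the LP formulation matters in the bipartite multi-cache setting, where the feasible region couples the caches and the fractional solution must then be rounded (e.g., via Pipage rounding or Madow's sampling, as in the paper).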
For ultra-dense networks with wireless backhaul, caching strategy at small base stations (SBSs), usually with limited storage, is critical to meet massive high data rate requests. Since the content popularity profile varies with time in an unknown wa
Federated edge learning (FEEL) is a widely adopted framework for training an artificial intelligence (AI) model distributively at edge devices to leverage their data while preserving their data privacy. The execution of a power-hungry learning task a
Inter-operator spectrum sharing in millimeter-wave bands has the potential of substantially increasing the spectrum utilization and providing a larger bandwidth to individual user equipment at the expense of increasing inter-operator interference. Un
In this paper we investigate the performance of caching schemes based on fountain codes in a heterogeneous satellite network. We consider multiple cache-aided hubs which are connected to a geostationary satellite through backhaul links. With the aimo
We study noisy broadcast networks with local cache memories at the receivers, where the transmitter can pre-store information even before learning the receivers' requests. We mostly focus on packet-erasure broadcast networks with two disjoint sets of