
Online Edge Caching and Wireless Delivery in Fog-Aided Networks with Dynamic Content Popularity

Publication date: 2017
Language: English





Fog Radio Access Network (F-RAN) architectures can leverage both cloud processing and edge caching for content delivery to the users. To this end, an F-RAN utilizes caches at the edge nodes (ENs) and fronthaul links connecting a cloud processor to the ENs. Assuming time-invariant content popularity, existing information-theoretic analyses of content delivery in F-RANs rely on offline caching with separate content placement and delivery phases. In contrast, this work focuses on the scenario in which the set of popular content is time-varying, hence necessitating the online replenishment of the ENs' caches along with the delivery of the requested files. The analysis is centered on the characterization of the long-term Normalized Delivery Time (NDT), which captures the temporal dependence of the coding latencies accrued across multiple time slots in the high signal-to-noise ratio regime. Online edge caching and delivery schemes are investigated for both serial and pipelined transmission modes across the fronthaul and edge segments. Analytical results demonstrate that, in the presence of time-varying content popularity, the rate of the fronthaul links sets a fundamental limit on the long-term NDT of the F-RAN system. The analytical results are further verified by numerical simulations, yielding important design insights.
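For context, the following is a minimal sketch of how the per-slot and long-term NDT are conventionally defined in the F-RAN literature; the notation (per-slot latency T_t, file size L, signal-to-noise ratio P) is an assumption for illustration and may differ in detail from the paper's own symbols:

    \delta_t = \lim_{P \to \infty} \lim_{L \to \infty} \frac{\mathbb{E}[T_t]}{L / \log P},
    \qquad
    \bar{\delta} = \limsup_{T \to \infty} \frac{1}{T} \sum_{t=1}^{T} \delta_t

Here \delta_t normalizes the expected delivery latency of slot t to the time needed to deliver one file over an ideal interference-free link at high SNR, and \bar{\delta} averages these per-slot NDTs over time. The main result can then be read as a fundamental lower bound on \bar{\delta} that depends on the fronthaul rate.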



Related research


In a Fog Radio Access Network (F-RAN) architecture, edge nodes (ENs), such as base stations, are equipped with limited-capacity caches, as well as with fronthaul links that can support given transmission rates from a cloud processor. Existing information-theoretic analyses of content delivery in F-RANs have focused on offline caching with separate content placement and delivery phases. In contrast, this work considers an online caching set-up, in which the set of popular files is time-varying and both cache replenishment and content delivery can take place in each time slot. The analysis is centered on the characterization of the long-term Normalized Delivery Time (NDT), which captures the temporal dependence of the coding latencies accrued across multiple time slots in the high signal-to-noise ratio regime. Online caching and delivery schemes based on reactive and proactive caching are investigated, and their performance is compared to optimal offline caching schemes both analytically and via numerical results.
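To make the reactive/proactive distinction concrete, the following toy Python sketch runs one time slot of online cache replenishment at a single EN. The function, its parameters (random eviction, prefetching probability alpha), and the uncoded file-level caching model are illustrative assumptions, not the coded schemes analyzed in these papers.

    import random

    def online_caching_slot(cache, request, popular_set, capacity,
                            mode="reactive", alpha=0.2):
        """One time slot of online cache replenishment (illustrative only)."""
        fronthaul_used = False

        if request not in cache:
            # Cache miss: the requested file must come over the fronthaul,
            # and is (reactively) stored for future slots.
            fronthaul_used = True
            if len(cache) >= capacity:
                cache.remove(random.choice(tuple(cache)))  # toy eviction rule
            cache.add(request)

        if mode == "proactive" and random.random() < alpha:
            # Speculatively replenish the cache with a currently popular file
            # before it is requested, at the cost of extra fronthaul load now.
            candidates = [f for f in popular_set if f not in cache]
            if candidates:
                if len(cache) >= capacity:
                    cache.remove(random.choice(tuple(cache)))
                cache.add(random.choice(candidates))
                fronthaul_used = True

        return fronthaul_used

A reactive policy uses the fronthaul only on cache misses, while a proactive policy accepts extra fronthaul load in the current slot to raise the hit probability once the popular set changes.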
Fog Radio Access Network (F-RAN) exploits cached contents at edge nodes (ENs) and a fronthaul connection to the cloud for content delivery. Assuming dedicated fronthaul links between the cloud and each EN, previous works focused on analyses of F-RANs using offline or online caching, depending on whether the content popularity is time-invariant or time-variant. An extension has been made for a multicast fronthaul link connecting the cloud to only two ENs under time-invariant popularity. In contrast, the scope of this work is the case in which a multicast fronthaul link connects an arbitrary number of ENs to the cloud and the content popularity is time-variant. The Normalized Delivery Time (NDT) is used as the performance measure, and by investigating proactive online caching, analytical results reveal that the power scaling of the fronthaul transmission sets a limit on the performance of the F-RAN.
In this work, we propose a content caching and delivery strategy to maximize throughput capacity in cache-enabled wireless networks. To this end, efficient betweenness (EB), which indicates the ratio of content delivery paths passing through a node, is first defined to capture the impact of content caching and delivery on the network traffic load distribution. Aided by EB, throughput capacity is shown to be upper bounded by the minimal ratio of successful delivery probability (SDP) to EB among all nodes. By effectively matching nodes' EB with their SDP, the proposed strategy improves throughput capacity with low computational complexity. Simulation results show that the gap between the proposed strategy and the optimal one (obtained through exhaustive search) remains smaller than 6%.
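The capacity bound stated above can be summarized by a single inequality; the symbols (throughput capacity \lambda, per-node SDP p_v, EB b_v, and node set \mathcal{V}) are assumed notation for illustration:

    \lambda \le \min_{v \in \mathcal{V}} \frac{p_v}{b_v} \quad (\text{up to normalization constants not shown here})

In words, the node whose share of delivery paths (EB) is largest relative to its successful delivery probability is the bottleneck; the proposed strategy improves capacity by choosing cache placements and delivery paths so that no node's b_v grows out of proportion to its p_v.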
Device-to-Device (D2D) communication can support the operation of cellular systems by reducing the traffic in the network infrastructure. In this paper, the benefits of D2D communication are investigated in the context of a Fog-Radio Access Network (F-RAN) that leverages edge caching and fronthaul connectivity for the purpose of content delivery. Assuming offline caching, out-of-band D2D communication, and an F-RAN with two edge nodes and two user equipments, an information-theoretically optimal caching and delivery strategy is presented that minimizes the delivery time in the high signal-to-noise ratio regime. The delivery time accounts for the latency caused by fronthaul, downlink, and D2D transmissions. The proposed optimal strategy is based on a novel scheme for an X-channel with receiver cooperation that leverages tools from real interference alignment. Insights are provided on the regimes in which D2D communication is beneficial.
A Fog-Radio Access Network (F-RAN) is studied in which cache-enabled Edge Nodes (ENs) with dedicated fronthaul connections to the cloud aim at delivering contents to mobile users. Using an information-theoretic approach, this work tackles the problem of quantifying the potential latency reduction that can be obtained by enabling Device-to-Device (D2D) communication over out-of-band broadcast links. Following prior work, the Normalized Delivery Time (NDT) --- a metric that captures the high signal-to-noise ratio worst-case latency --- is adopted as the performance criterion of interest. Joint edge caching, downlink transmission, and D2D communication policies based on compress-and-forward are proposed that are shown to be information-theoretically optimal to within a constant multiplicative factor of two for all values of the problem parameters, and to achieve the minimum NDT for a number of special cases. The analysis provides insights on the role of D2D cooperation in improving the delivery latency.
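The constant-factor optimality claim admits a compact statement; \delta_{\mathrm{ach}} and \delta^{*} below are assumed labels for the achievable and minimum NDT, with the arguments standing for the problem parameters (e.g., fractional cache size and fronthaul rate):

    \delta_{\mathrm{ach}}(\mu, r) \le 2\,\delta^{*}(\mu, r) \quad \text{for all } \mu, r

That is, the proposed compress-and-forward policies never incur more than twice the information-theoretically minimum delivery latency, and they achieve it exactly in the special cases noted above.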
