
Hierarchical Capacity Provisioning for Fog Computing

 Added by Abbas Kiani
 Publication date: 2018
 Language: English





The concept of fog computing is centered around providing computation resources at the edge of the network, thereby reducing latency and improving quality of service. However, it is still desirable to investigate how and where at the edge of the network the computation capacity should be provisioned. To this end, we propose a hierarchical capacity provisioning scheme. In particular, we consider a two-tier network architecture consisting of shallow and deep cloudlets and explore the benefits of hierarchical capacity provisioning based on queueing analysis. Moreover, we explore two network scenarios: one in which the network delay between the two tiers is negligible, and one in which the deep cloudlet is located deeper in the network so that the delay is significant. More importantly, we model the first scenario with bufferless shallow cloudlets and the second scenario with finite-size-buffer shallow cloudlets, and formulate an optimization problem for each model. We use stochastic ordering to solve the optimization problem formulated for the first model, and propose an upper-bound-based technique for the second model. The performance of the proposed scheme is evaluated via simulations, which demonstrate the accuracy of the proposed upper-bound technique as well as the queue-length estimation approach for both randomly generated input and real trace data.
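The abstract does not give concrete arrival rates, service rates, or buffer sizes, so the sketch below is only a minimal Monte Carlo illustration of the two-tier idea: a bufferless shallow tier serves a task immediately when one of its servers is idle and otherwise overflows the task to a deep cloudlet reached over a network delay. All parameter names and values (lam, mu_shallow, mu_deep, n_shallow, net_delay) are illustrative assumptions, not the paper's queueing formulation.

    import heapq
    import random

    def simulate(lam=3.0, mu_shallow=1.0, mu_deep=2.0, n_shallow=4,
                 net_delay=0.05, horizon=10_000.0, seed=0):
        """Monte Carlo sketch of a two-tier (shallow/deep) cloudlet system."""
        random.seed(seed)
        t = 0.0
        shallow_busy = []     # heap of finish times of busy shallow servers
        deep_free_at = 0.0    # time at which the deep cloudlet becomes idle
        delays = []
        while t < horizon:
            t += random.expovariate(lam)            # next Poisson arrival
            while shallow_busy and shallow_busy[0] <= t:
                heapq.heappop(shallow_busy)         # release finished servers
            if len(shallow_busy) < n_shallow:       # served at the shallow tier
                svc = random.expovariate(mu_shallow)
                heapq.heappush(shallow_busy, t + svc)
                delays.append(svc)                  # bufferless: no queueing delay
            else:                                   # overflow to the deep tier
                start = max(t + net_delay, deep_free_at)
                svc = random.expovariate(mu_deep)
                deep_free_at = start + svc
                delays.append(deep_free_at + net_delay - t)
        return sum(delays) / len(delays)

    print("mean task delay:", simulate())

Setting net_delay to zero approximates the first scenario (negligible delay between the tiers), while a positive value mimics the second scenario with a deep cloudlet located farther in the network.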



Related research


This paper studies edge caching in fog computing networks, where a capacity-aware edge caching framework is proposed that considers both the limited fog cache capacity and the connectivity capacity of base stations (BSs). By allowing cooperation between fog nodes and the cloud data center, the average-download-time (ADT) minimization problem is formulated as a multi-class processor queueing process. We prove the convexity of the formulated problem and propose an Alternating Direction Method of Multipliers (ADMM)-based algorithm that achieves the minimum ADT and converges much faster than existing algorithms. Simulation results demonstrate that the allocation of fog cache capacity and BS connectivity capacity needs to be balanced according to the network status. While maximizing the edge-cache-hit-ratio (ECHR) by utilizing all available fog cache capacity is helpful when the BS connectivity capacity is sufficient, it is preferable to keep a lower ECHR and allocate more traffic to the cloud when the BS connectivity capacity is deficient.
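The paper's actual ADT objective and constraints are not reproduced in the abstract, so the toy below only illustrates the update structure of a scaled-form ADMM on a simple consensus problem: a convex objective split across two variables that must agree, loosely standing in for two coupled resources that have to be balanced. The quadratic terms, the penalty rho, and the closed-form optimum x = z = (a + b)/2 are assumptions made purely for illustration.

    def admm_consensus(a=4.0, b=1.0, rho=1.0, iters=50):
        """Scaled-form ADMM for: minimize (x - a)^2 + (z - b)^2  s.t.  x = z."""
        x = z = u = 0.0
        for _ in range(iters):
            x = (2 * a + rho * (z - u)) / (2 + rho)   # x-update (closed form)
            z = (2 * b + rho * (x + u)) / (2 + rho)   # z-update (closed form)
            u = u + x - z                             # scaled dual update
        return x, z

    print(admm_consensus())  # both values converge to (a + b) / 2 = 2.5

In the paper's setting the subproblems would involve the ADT expression and the cache/connectivity constraints rather than these toy quadratics, but the alternating minimization plus dual-update pattern is the same.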
Recently, fog computing has been introduced as a modern distributed paradigm and a complement to cloud computing for service provisioning. A fog system extends storage and computing to the edge of the network, which notably alleviates the service-computation problem of delay-sensitive applications while also enabling location awareness and mobility support. Load balancing is an important aspect of fog networks, since it avoids situations in which some fog nodes are under-loaded while others are overloaded. Quality of Service (QoS) parameters such as resource utilization, throughput, cost, response time, performance, and energy consumption can be improved with load balancing. In recent years, research on load-balancing techniques in fog networks has been carried out, but there is no systematic review consolidating these studies. This article systematically reviews load-balancing mechanisms in fog computing under four classifications: approximate, exact, fundamental, and hybrid methods (covering studies published between 2013 and August 2020). It also examines load-balancing metrics, together with the advantages and disadvantages of the reviewed load-balancing mechanisms in fog networks. The evaluation techniques and tools applied in each reviewed study are explored as well. Additionally, the essential open challenges and future trends of these mechanisms are discussed.
The fast growth of Internet-connected embedded devices demands new capabilities at the network edge: local processing, fast communications, and resource virtualization. The current work addresses these capabilities by designing and deploying a new proposal that offers on-demand activation of offline IoT fog computing assets via a Software-Defined Networking (SDN) based solution combined with containerization and sensor virtualization. We present and discuss performance and functional outcomes from emulated tests of our proposal. The performance results show that the system latency has two parts: the first is the delay induced by limitations of the networking resources; the second is due to the on-demand activation of the required processing resources, which are initially powered off in favor of more sustainable system operation. The functional results, obtained with a real IoT protocol, evidence the viability of deploying our proposal, with the necessary orchestration, in distributed scenarios involving embedded devices, actuators, controllers, and brokers at the network edge.
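As a rough numerical illustration of the two-part latency model described above (and nothing more), the snippet below splits a request's latency into a fixed networking delay plus an activation penalty that is paid only when the fog asset is still powered off and must be started on demand. The function name and the numeric values are invented for illustration.

    def request_latency(net_delay_s, asset_online, activation_s):
        """Networking delay plus an on-demand activation penalty (if cold)."""
        return net_delay_s + (0.0 if asset_online else activation_s)

    # First request hits a powered-off asset; later requests find it warm.
    print(request_latency(0.020, asset_online=False, activation_s=1.5))  # 1.52 s
    print(request_latency(0.020, asset_online=True,  activation_s=1.5))  # 0.02 s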
Blockchain has revolutionized how transactions are conducted by ensuring secure and auditable peer-to-peer coordination, owing both to decentralization and to the trust it promotes among peers. Blockchain and fog computing are currently being evaluated as potential support for software and a wide spectrum of applications, ranging from banking practices and digital transactions to cyber-physical systems. These systems are designed to work in highly complex, sometimes even adversarial, environments, to synchronize heterogeneous machines and manufacturing facilities in cyber computational space, and to address critical challenges such as computational complexity, security, trust, and data management. Coupling blockchain with fog computing technologies has the potential to identify and overcome these issues. Thus, this paper presents the knowledge of blockchain and fog computing required to improve cyber-physical systems in terms of quality of service, data storage, computing, and security.
As vehicles play an increasingly important role in people's daily lives, requirements for a safer and more comfortable driving experience have arisen. Connected vehicles (CVs) provide enabling technologies to realize these requirements and have attracted widespread attention from both academia and industry. These requirements call for a well-designed computing architecture to support the Quality-of-Service (QoS) of CV applications. Computation offloading techniques, such as cloud, edge, and fog computing, can help CVs process computation-intensive and large-scale computing tasks. Additionally, different cloud/edge/fog computing architectures suit different types of CV applications with highly different QoS requirements, which demonstrates the importance of computing architecture design. However, most existing surveys on cloud/edge/fog computing for CVs overlook the computing architecture design: they (i) focus on only one specific computing architecture and (ii) lack discussion of the benefits, research challenges, and system requirements of different architectural alternatives. In this paper, we provide a comprehensive survey of different architectural design alternatives based on cloud/edge/fog computing for CVs. The contributions of this paper are: (i) a comprehensive literature survey of existing architectural design alternatives based on cloud/edge/fog computing for CVs; (ii) a new classification of such computing architectures into computation-aided and computation-enabled architectures; and (iii) a holistic comparison among different cloud/edge/fog computing architectures for CVs based on the functional requirements of CV systems, including advantages, disadvantages, and research challenges.
