
A Study of Directed Epidemic Algorithms Used to Ensure the Reliability of Publish/Subscribe Systems

دراسة خوارزميات نشر العدوى الموجّهة المُستَخدَمة لتأمين الوثوقية في أنظمة النشر/الاشتراك

Publication date: 2015
Research language: Arabic





Publish/subscribe systems have become increasingly important, mainly because they provide full decoupling between publishers and subscribers with respect to space, time, and synchronization. This research presents an overview of publish/subscribe systems and epidemic algorithms (especially directed ones), and studies the performance of directed epidemic algorithms used to ensure the reliability of publish/subscribe systems. Adopting the Scribe system and simulating with OMNeT++, the paper evaluates different directed epidemic algorithms by varying several factors: targeted range width, number of rounds, number of nodes, and packet loss rate, and measuring their effect on reliability, overhead, and mean latency.
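The round-based, directed epidemic dissemination described above can be sketched in a small simulation. This is a minimal illustration, not the paper's exact algorithm: the "directed" aspect is modeled here as each node forwarding only to peers whose IDs fall within a targeted range around its own, and all parameter values (node count, fanout, loss rate) are illustrative assumptions.

```python
import random

def directed_gossip(n_nodes=200, fanout=3, rounds=8, loss_rate=0.1,
                    range_width=20, seed=1):
    """Round-based epidemic dissemination sketch.

    Each informed node forwards the event to `fanout` peers chosen from a
    directed 'targeted range' of nearby node IDs rather than from the whole
    network; each send is independently dropped with probability `loss_rate`.
    Returns the fraction of nodes reached (reliability) and the total number
    of messages sent (overhead).
    """
    rng = random.Random(seed)
    informed = {0}                     # the publisher starts the epidemic
    messages = 0
    for _ in range(rounds):
        newly = set()
        for node in informed:
            # directed selection: only peers within range_width of this node
            lo, hi = node - range_width, node + range_width
            candidates = [p for p in range(max(0, lo), min(n_nodes, hi + 1))
                          if p != node]
            for peer in rng.sample(candidates, min(fanout, len(candidates))):
                messages += 1
                if rng.random() >= loss_rate:   # message survived the loss
                    newly.add(peer)
        informed |= newly
    return len(informed) / n_nodes, messages

reliability, overhead = directed_gossip()
print(f"reliability={reliability:.2f}, messages={overhead}")
```

Widening `range_width` or adding rounds raises reliability at the cost of more messages, which is exactly the trade-off the study measures.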

References used
Costa, P.; Migliavacca, M.; Picco, G. P.; Cugola, G. "Introducing reliability in content-based publish-subscribe through epidemic algorithms." In DEBS '03: Proceedings of the 2nd International Workshop on Distributed Event-Based Systems, pages 1–8, New York, NY, USA, 2003. ACM.
Coulouris, G.; Dollimore, J.; Kindberg, T.; Blair, G. Distributed Systems: Concepts and Design. Fifth Edition, Addison-Wesley, 2011. ISBN 0-13-214301-1.
Esposito, C. "A tutorial on reliability in publish/subscribe services." In Proceedings of the 6th ACM International Conference on Distributed Event-Based Systems (DEBS '12), pages 399–406, 2012.

Related research

IoT sensors use the publish/subscribe model for communication to benefit from its decoupled nature with respect to space, time, and synchronization. Because of the heterogeneity of the communicating parties, semantic decoupling is added as a fourth dimension. This added semantic decoupling complicates the matching process and reduces its efficiency. The proposed algorithm clusters subscriptions and events according to topic and performs the matching process within these clusters, which increases throughput by reducing matching time. Moreover, matching accuracy is improved when subscriptions must be fully approximated. This work shows the benefit of clustering, as well as the improvement in matching accuracy and efficiency achieved using this approach.
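The clustering idea above can be illustrated with a minimal sketch: subscriptions are grouped by topic so an incoming event is matched only against its topic's cluster rather than the whole subscription set. The class name, predicate representation, and example topics are hypothetical, not the paper's actual data structures.

```python
from collections import defaultdict

class ClusteredMatcher:
    """Toy topic-clustered matcher: one cluster of subscriptions per topic."""

    def __init__(self):
        self.clusters = defaultdict(list)   # topic -> [(sub_id, predicate)]

    def subscribe(self, sub_id, topic, predicate):
        self.clusters[topic].append((sub_id, predicate))

    def match(self, topic, event):
        # Only the relevant cluster is scanned, reducing matching time
        # compared with testing every subscription in the system.
        return [sub_id for sub_id, pred in self.clusters[topic] if pred(event)]

m = ClusteredMatcher()
m.subscribe("s1", "temperature", lambda e: e["value"] > 30)
m.subscribe("s2", "temperature", lambda e: e["value"] < 10)
m.subscribe("s3", "humidity", lambda e: e["value"] > 80)
print(m.match("temperature", {"value": 35}))   # ['s1']
```

With many topics, each event touches only one small cluster, so matching cost grows with cluster size rather than with the total number of subscriptions.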
Publish/subscribe (pub/sub) is a popular communication paradigm in the design of large-scale distributed systems. Pub/sub networks are increasingly used for a wide array of applications in industry, academia, financial data dissemination, business process management, and social networking sites, which account for a large share of user interest and network bandwidth. Social network interactions have grown exponentially in recent years, to the order of billions of notifications generated by millions of users every day. Advances in publish/subscribe networks have therefore become very important, especially in peer-to-peer (P2P) networks, which initially used the Scribe data routing protocol. Since then, many developments and improvements have been made to create new designs; one of these is the PolderCast protocol. The research studies the performance of the most widely used protocols to assess the effectiveness of each in terms of the speed of topic publication and subscription, the required resources, and the distribution of load.
The research presents a modeling and analytical study of several types of scheduling algorithms in real-time multiprocessor systems. The performance of three scheduling algorithms has been analyzed: Earliest Deadline First (EDF), Least Laxity First (LLF), and Earliest Deadline First until Zero Laxity (EDZL). The paper considers the scheduling of n periodic, independent, preemptable tasks with implicit deadlines on a platform of m homogeneous processors. The algorithms are compared in terms of processor load (processor busyness), the number of migrations, the number of preemptions, and the number of times each algorithm failed to meet task deadlines, the last being the most important criterion in real-time scheduling. It also considers scheduling growing sets of periodic tasks, from 4 tasks up to 64 tasks, in order to study the effect of increasing the number of tasks and processors on the performance of the scheduling algorithms. As a result, the strengths and weaknesses of these three algorithms are presented, and the type of real-time system best suited to each algorithm is proposed according to its performance strengths.
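The two core selection rules compared in that study can be shown side by side. This minimal sketch (job tuples and values are illustrative, not from the paper) demonstrates how EDF and LLF can pick different jobs from the same ready queue: EDF looks only at absolute deadlines, while LLF looks at laxity (deadline minus current time minus remaining execution).

```python
def edf_pick(ready_jobs):
    """EDF rule: run the ready job with the earliest absolute deadline.
    ready_jobs: list of (name, remaining_exec, deadline)."""
    return min(ready_jobs, key=lambda job: job[2])

def llf_pick(ready_jobs, now):
    """LLF rule: run the ready job with the least laxity,
    where laxity = deadline - now - remaining_exec."""
    return min(ready_jobs, key=lambda job: job[2] - now - job[1])

# Illustrative ready queue at time 0: (name, remaining, deadline)
jobs = [("A", 6, 10), ("B", 1, 7), ("C", 4, 12)]

print(edf_pick(jobs))      # ('B', 1, 7)  -- earliest deadline
print(llf_pick(jobs, 0))   # ('A', 6, 10) -- least laxity: 10-0-6 = 4
```

Here the two policies disagree: B's deadline is closer, but A has less slack, which is why LLF (and EDZL, which switches to laxity only at zero laxity) can avoid misses that plain EDF incurs on multiprocessors.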
With the increasing use of technology and automation in many sides of modern life, electricity outages have become a big issue that widely affects the daily life of most sectors, whether industrial, economic, or entertainment. It has therefore become necessary to achieve a highly reliable electrical system to ensure the continuity of electricity supply to the end consumer. Consequently, this research studies a new method of service restoration using genetic algorithms to increase the reliability of distribution systems and improve their performance. The research includes a brief overview of electrical system reliability, the basics of genetic algorithms, and the use of these techniques in dispatching centers. In addition, a program was designed in the MATLAB environment to apply the service restoration technique using genetic algorithms; the program was tested on a case study, and the relevant results are shown.
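The genetic-algorithm machinery mentioned above (selection, crossover, mutation) can be sketched on a toy restoration problem. This is a hypothetical model in Python (the paper's program is in MATLAB): each gene is a tie-switch state, closing a switch restores one de-energized section's load, and the total restored load must not exceed the backup feeder's spare capacity. All load and capacity values are invented for illustration.

```python
import random

LOADS = [30, 20, 50, 10, 40, 25]   # kW per section (illustrative values)
CAPACITY = 120                      # spare capacity of the backup feeder, kW

def fitness(genome):
    """Restored load, or 0 if the capacity constraint is violated."""
    restored = sum(load for gene, load in zip(genome, LOADS) if gene)
    return restored if restored <= CAPACITY else 0

def evolve(pop_size=30, generations=60, mut_rate=0.05, seed=3):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in LOADS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(LOADS))       # one-point crossover
            child = a[:cut] + b[cut:]
            # bit-flip mutation with probability mut_rate per gene
            child = [g ^ (rng.random() < mut_rate) for g in child]
            children.append(child)
        pop = survivors + children
    best = max(pop, key=fitness)
    return best, fitness(best)

best, restored = evolve()
print(best, restored)
```

Real service restoration adds radiality and voltage constraints, but the evolutionary loop has this same shape.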
In this paper, we compare the performance of sporadic-task scheduling algorithms on a multi-core platform in order to determine the best algorithm in terms of a set of parameters adopted by researchers in this field, which in turn gives accurate details about the quality of such algorithms when applied to a set of sporadic tasks generated according to a log-uniform probability distribution. The simulation is done using the SimSo simulator, whose reliability and high performance have been attested by many researchers in this field, as it provides the possibility of generating tasks according to specific probability distributions and simulates accurate details related to the characteristics of random tasks.
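The log-uniform task generation mentioned above can be sketched as follows. This is an illustrative assumption of how such a task set might be built before being fed to a simulator; the parameter ranges and per-task utilization are invented, and `make_task_set` is a hypothetical helper, not part of SimSo's API.

```python
import math
import random

def log_uniform(rng, lo, hi):
    """Sample a value whose logarithm is uniform on [log(lo), log(hi)]."""
    return math.exp(rng.uniform(math.log(lo), math.log(hi)))

def make_task_set(n_tasks, util_per_task=0.1, seed=7):
    """Generate sporadic tasks with log-uniform minimum inter-arrival times
    (10..1000 ms, illustrative range) and implicit deadlines (D = T)."""
    rng = random.Random(seed)
    tasks = []
    for i in range(n_tasks):
        min_interarrival = round(log_uniform(rng, 10, 1000))      # T, in ms
        wcet = max(1, round(min_interarrival * util_per_task))    # C, in ms
        tasks.append({"id": i, "T": min_interarrival, "C": wcet,
                      "D": min_interarrival})
    return tasks

for task in make_task_set(4):
    print(task)
```

Sampling periods log-uniformly spreads tasks evenly across orders of magnitude (tens vs. hundreds of milliseconds), which a plain uniform distribution would bias toward large periods.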
