
MCPLOTS: a particle physics resource based on volunteer computing

Published by Peter Zeiler Skands
Publication date: 2013
Language: English





The mcplots.cern.ch web site (MCPLOTS) provides a simple online repository of plots made with high-energy-physics event generators, comparing them to a wide variety of experimental data. The repository is based on the HEPDATA online database of experimental results and on the RIVET Monte Carlo analysis tool. The repository is continually updated and relies on computing power donated by volunteers, via the LHC@HOME platform.
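For illustration, the comparison behind each mcplots plot can be reduced to a per-bin chi-squared between a generator histogram and the corresponding measured data points. The sketch below is a toy Python version of that idea, with invented bin contents and uncertainties; it is not the mcplots or RIVET code.

# Conceptual sketch (not the mcplots code): comparing a generator-level
# histogram to reference data points, as RIVET-based validation does.
# Bin contents, data values, and uncertainties are invented for illustration.
import numpy as np

def chi2_per_bin(mc_values, data_values, data_errors):
    """Per-bin chi^2 between an MC prediction and measured data."""
    mc = np.asarray(mc_values, dtype=float)
    data = np.asarray(data_values, dtype=float)
    err = np.asarray(data_errors, dtype=float)
    return (mc - data) ** 2 / err ** 2

# Toy example: a 5-bin observable.
mc = [0.12, 0.30, 0.28, 0.18, 0.12]      # generator prediction (normalized)
data = [0.10, 0.33, 0.27, 0.19, 0.11]    # measured values
errors = [0.02, 0.03, 0.02, 0.02, 0.01]  # experimental uncertainties

chi2 = chi2_per_bin(mc, data, errors)
print(f"chi^2/ndf = {chi2.sum() / len(chi2):.2f}")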




Read also

David P. Anderson, 2019
Volunteer computing is the use of consumer digital devices for high-throughput scientific computing. It can provide large computing capacity at low cost, but presents challenges due to device heterogeneity, unreliability, and churn. BOINC, a widely-used open-source middleware system for volunteer computing, addresses these challenges. We describe its features, architecture, and implementation.
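As a concrete illustration of how a middleware layer can tolerate unreliable volunteer hosts, the sketch below implements replication with quorum-based validation, the general strategy BOINC is known for. All names and numbers here are hypothetical; this is not the BOINC API.

# Minimal sketch of replication-and-validation: each work unit is sent to
# several hosts, and a result is accepted only when a quorum of replicas
# agree, so a single faulty or malicious host cannot corrupt the output.
from collections import Counter

QUORUM = 2        # minimum number of matching results to accept
REPLICATION = 3   # copies of each work unit dispatched to volunteers

def validate(results):
    """Return the canonical result if a quorum agrees, else None."""
    if not results:
        return None
    value, count = Counter(results).most_common(1)[0]
    return value if count >= QUORUM else None

# Three volunteer hosts return results; one host is faulty.
replica_results = ["a1b2c3", "a1b2c3", "deadbeef"]
print(validate(replica_results))  # -> "a1b2c3": quorum of 2 reached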
The rise of fast communication media both at the core and at the edge has resulted in unprecedented numbers of sophisticated and intelligent wireless IoT devices. Tactile Internet has enabled the interaction between humans and machines within their environment to achieve revolutionized solutions both on the move and in real-time. Many applications such as intelligent autonomous self-driving, smart agriculture and industrial solutions, and self-learning multimedia content filtering and sharing have become attainable through cooperative, distributed and decentralized systems, namely, volunteer computing. This article introduces a blockchain-enabled resource sharing and service composition solution through volunteer computing. Device resource, computing, and intelligence capabilities are advertised in the environment to be made discoverable and available for sharing with the aid of blockchain technology. Incentives in the form of on-demand service availability are given to resource and service providers to ensure fair and balanced cooperative resource usage. Blockchains are formed whenever a service request is initiated with the aid of fog and mobile edge computing (MEC) devices to ensure secure communication and service delivery for the participants. Using both volunteer computing techniques and tactile internet architectures, we devise a fast and reliable service provisioning framework that relies on a reinforcement learning technique. Simulation results show that the proposed solution can achieve high reward distribution, increased number of blockchain formations, reduced delays, and balanced resource usage among participants, under the premise of high IoT device availability.
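The service-provisioning step above is said to rely on reinforcement learning. One minimal way to realize that, sketched below under the assumption of a bandit-style formulation, is an epsilon-greedy selector that learns which provider yields the best reward; the provider names and reward model are invented, and the paper's actual algorithm may differ.

# Hypothetical epsilon-greedy provider selection: mostly pick the provider
# with the best observed average reward, occasionally explore at random.
import random

class ProviderSelector:
    def __init__(self, providers, epsilon=0.1):
        self.epsilon = epsilon
        self.totals = {p: 0.0 for p in providers}  # cumulative reward
        self.counts = {p: 0 for p in providers}    # times selected

    def select(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.totals))  # explore
        # Exploit: highest average reward so far (unseen providers first).
        return max(self.totals,
                   key=lambda p: (self.totals[p] / self.counts[p]
                                  if self.counts[p] else float("inf")))

    def update(self, provider, reward):
        self.totals[provider] += reward
        self.counts[provider] += 1

selector = ProviderSelector(["edge-node-1", "fog-node-2", "mec-node-3"])
for _ in range(100):
    p = selector.select()
    selector.update(p, random.uniform(0.0, 1.0))  # stand-in for service quality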
The NEMO High Performance Computing Cluster at the University of Freiburg has been made available to researchers of the ATLAS and CMS experiments. Users access the cluster from external machines connected to the World-wide LHC Computing Grid (WLCG). This paper describes how the full software environment of the WLCG is provided in a virtual machine image. The interplay between the schedulers for NEMO and for the external clusters is coordinated through the ROCED service. A cloud computing infrastructure is deployed at NEMO to orchestrate the simultaneous usage by bare metal and virtualized jobs. Through the setup, resources are provided to users in a transparent, automatized, and on-demand way. The performance of the virtualized environment has been evaluated for particle physics applications.
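A toy version of the scheduling decision such a setup must make, booting virtual machines on demand when queued grid jobs exceed the available virtualized slots, might look like the following; the function and parameters are illustrative assumptions, not the actual ROCED interface.

# Hypothetical sketch of a scale-up decision in a ROCED-like coordination
# service: compare queued jobs against running virtualized slots and request
# additional VMs from the cluster when demand exceeds supply.
def vms_to_request(queued_jobs, running_slots, slots_per_vm=4, max_vms=50):
    """Number of additional VMs needed to absorb the current job queue."""
    deficit = queued_jobs - running_slots
    if deficit <= 0:
        return 0
    needed = -(-deficit // slots_per_vm)  # ceiling division
    return min(needed, max_vms)

print(vms_to_request(queued_jobs=37, running_slots=12))  # -> 7 VMs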
Our predictions for particle physics processes are realized in a chain of complex simulators. They allow us to generate high-fidelity simulated data, but they are not well-suited for inference on the theory parameters with observed data. We explain why the likelihood function of high-dimensional LHC data cannot be explicitly evaluated, why this matters for data analysis, and reframe what the field has traditionally done to circumvent this problem. We then review new simulation-based inference methods that let us directly analyze high-dimensional data by combining machine learning techniques and information from the simulator. Initial studies indicate that these techniques have the potential to substantially improve the precision of LHC measurements. Finally, we discuss probabilistic programming, an emerging paradigm that lets us extend inference to the latent process of the simulator.
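One of the ideas such reviews cover is the likelihood-ratio trick: a classifier trained to separate samples generated under two parameter values implicitly estimates their likelihood ratio, even though the likelihood itself is intractable. Below is a minimal sketch using a Gaussian toy simulator in place of the real simulation chain; it assumes scikit-learn and is not code from the paper.

# Likelihood-ratio trick on a toy simulator: train a classifier to separate
# data drawn under theta_0 from data drawn under theta_1, then convert its
# output probability into an estimate of the likelihood ratio.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate(theta, n):
    """Toy 'simulator': draws from N(theta, 1); the real chain is a black box."""
    return rng.normal(theta, 1.0, size=(n, 1))

x0 = simulate(0.0, 5000)  # samples under theta_0
x1 = simulate(1.0, 5000)  # samples under theta_1

X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(len(x0)), np.ones(len(x1))])

clf = LogisticRegression().fit(X, y)

# With equal class priors, s(x) = p(theta_1 | x) implies the ratio
# r(x) = p(x | theta_1) / p(x | theta_0) is s / (1 - s).
s = clf.predict_proba([[0.5]])[0, 1]
print(f"estimated likelihood ratio at x=0.5: {s / (1 - s):.2f}")
# Analytic check for these Gaussians: r(x) = exp(x - 0.5) = 1.0 at x = 0.5.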
G. Altarelli, 2011
This is a Concluding Talk, not a Summary of the FPCP 2011 Conference. I will first make some comments on the status and the prospects of particle physics and then review some of the highlights that particularly impressed me at this Conference (a subjective choice).