
A Genetic Algorithm for Astroparticle Physics Studies

Posted by Hong-Hao Zhang
Publication date: 2019
Research field: Physics
Paper language: English





Precision measurements of charged cosmic rays have recently been carried out by space-borne (e.g. AMS-02) and ground-based (e.g. HESS) experiments. These data are important for studies of astrophysical phenomena, including supernova remnants, cosmic-ray propagation, solar physics and dark matter. Such scenarios usually contain a number of free parameters that need to be tuned to the observed data. Techniques such as Markov Chain Monte Carlo and MultiNest have been developed to solve this problem, but applying them usually requires a computing farm. In this paper, a genetic algorithm for finding the optimum parameters of cosmic-ray injection and propagation is presented. We find that this algorithm gives the same best-fit results as Markov Chain Monte Carlo while consuming nearly two orders of magnitude less computing power.
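The abstract does not spell out the authors' implementation, so the following is only a minimal, self-contained sketch of a genetic algorithm of this kind: the propagation code and the measured spectra are replaced by toy placeholders (the model_flux function and a synthetic power-law spectrum), and all parameter names and settings are illustrative assumptions rather than the paper's actual setup.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for a cosmic-ray propagation code (e.g. a GALPROP run);
# a toy power law keeps the sketch self-contained and runnable.
def model_flux(params, energy):
    norm, index = params
    return norm * energy ** (-index)

# Toy "observed" spectrum with 5% Gaussian scatter (placeholder for AMS-02 data).
energy = np.logspace(0, 3, 30)
data = model_flux([1.0, 2.7], energy) * rng.normal(1.0, 0.05, energy.size)
sigma = 0.05 * data

def chi2(params):
    return np.sum(((model_flux(params, energy) - data) / sigma) ** 2)

# Genetic-algorithm ingredients: fitness, tournament selection, crossover, mutation.
bounds = np.array([[0.1, 10.0], [2.0, 3.5]])   # (normalisation, spectral index)
pop_size, n_gen, mut_sigma = 50, 100, 0.05

pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, 2))
for _ in range(n_gen):
    fitness = np.array([chi2(ind) for ind in pop])
    # Tournament selection: keep the better of two randomly drawn individuals.
    i, j = rng.integers(pop_size, size=(2, pop_size))
    parents = np.where((fitness[i] < fitness[j])[:, None], pop[i], pop[j])
    # Crossover: swap the spectral-index gene between consecutive parents.
    children = parents.copy()
    children[::2, 1], children[1::2, 1] = parents[1::2, 1], parents[::2, 1]
    # Gaussian mutation, clipped back into the allowed parameter ranges.
    children += rng.normal(0.0, mut_sigma, children.shape) * (bounds[:, 1] - bounds[:, 0])
    pop = np.clip(children, bounds[:, 0], bounds[:, 1])

best = pop[np.argmin([chi2(ind) for ind in pop])]
print("best-fit (norm, index):", best)

In a real application the chi-square would compare the output of a propagation code against measured AMS-02 or HESS spectra, and the population size, mutation width and number of generations would need tuning to the problem at hand.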




Read also

The Sun is an excellent laboratory for astroparticle physics but remains poorly understood at GeV--TeV energies. Despite the immense relevance for both cosmic-ray propagation and dark matter searches, only in recent years has the Sun become a target for precision gamma-ray astronomy with the Fermi-LAT instrument. Among the most surprising results from the observations is a hard excess of GeV gamma-ray flux that strongly anti-correlates with solar activity, especially at the highest energies accessible to Fermi-LAT. Most of the observed properties of the gamma-ray emission cannot be explained by existing models of cosmic-ray interactions with the solar atmosphere. GeV--TeV gamma-ray observations of the Sun spanning an entire solar cycle would provide key insights into the origin of these gamma rays, and consequently improve our understanding of the Sun's environment as well as the foregrounds for new physics searches, such as dark matter. These can be complemented by new observations of neutrinos and cosmic rays. Together these observations make the Sun a new testing ground for particle physics in dynamic environments.
Large amounts of deep optical images will be available in the near future, allowing statistically significant studies of low surface brightness structures such as intracluster light (ICL) in galaxy clusters. The detection of these structures requires efficient algorithms dedicated to this task, where traditional methods suffer difficulties. We present our new Detection Algorithm with Wavelets for Intracluster light Studies (DAWIS), developed and optimised for the detection of low surface brightness sources in images, in particular (but not limited to) ICL. DAWIS follows a multiresolution vision based on wavelet representation to detect sources, embedded in an iterative procedure called the synthesis-by-analysis approach, to restore the complete unmasked light distribution of these sources with very good quality. The algorithm is built so sources can be classified based on criteria depending on the analysis goal; in this work we display the case of ICL detection and the measurement of ICL fractions. We test the efficiency of DAWIS on 270 mock images of galaxy clusters with various ICL profiles and compare its efficiency to more traditional ICL detection methods such as the surface brightness threshold method. We also run DAWIS on a real galaxy cluster image, and compare the output to results obtained with previous multiscale analysis algorithms. We find in simulations that, on average, DAWIS is able to disentangle galaxy light from ICL more efficiently, and to detect a greater quantity of ICL flux due to the way it handles sky background noise. We also show that the ICL fraction, a metric used on a regular basis to characterise ICL, is subject to several measurement biases both on galaxy and ICL fluxes. In the real galaxy cluster image, DAWIS detects a faint and extended source with an absolute magnitude two orders brighter than previous multiscale methods.
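DAWIS itself is only described at a high level here, so the block below is a generic illustration of the multiresolution idea rather than the DAWIS pipeline: a toy cluster image is wavelet-decomposed with PyWavelets and rebuilt from the coarse scales only, separating a diffuse ICL-like halo from compact galaxy-like sources. The image, scales and choice of wavelet are assumptions made for the example.

import numpy as np
import pywt

rng = np.random.default_rng(0)

# Toy cluster image: compact "galaxies" (small Gaussians) superimposed on an
# extended, low-surface-brightness halo standing in for intracluster light.
y, x = np.mgrid[0:256, 0:256]
icl = 5.0 * np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / (2 * 60.0 ** 2))
galaxies = sum(
    50.0 * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * 3.0 ** 2))
    for cx, cy in rng.integers(32, 224, size=(20, 2))
)
image = icl + galaxies + rng.normal(0.0, 0.5, icl.shape)

# Multiresolution split: wavelet-decompose the image, then rebuild it from the
# coarse scales only; this suppresses compact sources and keeps the diffuse
# component (a crude analogue of what a wavelet-based ICL detector does).
coeffs = pywt.wavedec2(image, wavelet="sym4", level=5)
coarse_only = [coeffs[0]] + [
    tuple(np.zeros_like(c) for c in detail) if lvl >= 2 else detail
    for lvl, detail in enumerate(coeffs[1:])
]
diffuse = pywt.waverec2(coarse_only, wavelet="sym4")
compact = image - diffuse

print("flux in diffuse map:", diffuse.sum(), "flux in compact map:", compact.sum())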
U.F. Katz (2019)
Cherenkov light induced by fast charged particles in transparent dielectric media such as air or water is exploited by a variety of experimental techniques to detect and measure extraterrestrial particles impinging on Earth. A selection of detection principles is discussed and the corresponding experiments are presented together with the breakthrough results they achieved. Some future developments are highlighted.
We present a toolbox of new techniques and concepts for the efficient forecasting of experimental sensitivities. These are applicable to a large range of scenarios in (astro-)particle physics, and based on the Fisher information formalism. Fisher information provides an answer to the question: what is the maximum extractable information from a given observation? It is a common tool for the forecasting of experimental sensitivities in many branches of science, but rarely used in astroparticle physics or searches for particle dark matter. After briefly reviewing the Fisher information matrix of general Poisson likelihoods, we propose very compact expressions for estimating expected exclusion and discovery limits (equivalent counts method). We demonstrate by comparison with Monte Carlo results that they remain surprisingly accurate even deep in the Poisson regime. We show how correlated background systematics can be efficiently accounted for by a treatment based on Gaussian random fields. Finally, we introduce the novel concept of Fisher information flux. It can be thought of as a generalization of the commonly used signal-to-noise ratio, while accounting for the non-local properties and saturation effects of background and instrumental uncertainties. It is a powerful and flexible tool ready to be used as a core concept for informed strategy development in astroparticle physics and searches for particle dark matter.
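The compact equivalent-counts expressions of that paper are not reproduced here; as an illustration of the starting point the abstract describes, the sketch below numerically builds the Fisher information matrix of a binned Poisson likelihood, I_ij = sum_k (1/mu_k) dmu_k/dtheta_i dmu_k/dtheta_j, for a toy signal-plus-background model. The model, binning and parameter values are invented for the example and do not come from the paper.

import numpy as np

# Hypothetical binned model: a power-law background plus a Gaussian signal line,
# standing in for whatever expected-counts model an experiment would use.
energy = np.linspace(1.0, 10.0, 40)                  # bin centres (arbitrary units)

def expected_counts(theta):
    signal_norm, bkg_norm = theta
    background = bkg_norm * energy ** -2.0
    signal = signal_norm * np.exp(-0.5 * ((energy - 5.0) / 0.3) ** 2)
    return background + signal

def fisher_matrix(theta, eps=1e-4):
    # Fisher information of a Poisson likelihood:
    # I_ij = sum_k (1/mu_k) dmu_k/dtheta_i dmu_k/dtheta_j,
    # with the derivatives taken by central finite differences.
    theta = np.asarray(theta, dtype=float)
    mu = expected_counts(theta)
    grads = []
    for i in range(theta.size):
        dt = np.zeros_like(theta)
        dt[i] = eps * max(abs(theta[i]), 1.0)
        grads.append((expected_counts(theta + dt) - expected_counts(theta - dt)) / (2 * dt[i]))
    grads = np.array(grads)                           # shape (n_params, n_bins)
    return grads @ (grads / mu).T

theta0 = np.array([0.0, 100.0])                       # null signal, fixed background level
I = fisher_matrix(theta0)
cov = np.linalg.inv(I)
print("projected 1-sigma sensitivity to the signal norm:", np.sqrt(cov[0, 0]))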
Current and future astroparticle physics experiments are operated or are being built to observe highly energetic particles, high-energy electromagnetic radiation and gravitational waves originating from all kinds of cosmic sources. The data volumes taken by the experiments are large and expected to grow significantly during the coming years. This is a result of advanced research possibilities and improved detector technology. To cope with the substantially increasing data volumes of astroparticle physics projects it is important to understand the future needs for computing resources in this field. Providing these resources constitutes a larger fraction of the overall running costs of future infrastructures. This document presents the results of a survey made by APPEC with the help of computing experts of major projects and future initiatives in astroparticle physics, representatives of current Tier-1 and Tier-2 LHC computing centers, as well as dedicated astroparticle physics computing centers, e.g. the Albert Einstein Institute for gravitational-wave analysis in Hanover. In summary, the overall CPU usage and short-term disk and long-term (tape) storage space currently available for astroparticle physics computing services is of the order of one third of the central computing available for LHC data at the Tier-0 center at CERN. By the end of the decade the requirements for computing resources are estimated to increase by a factor of 10. Furthermore, this document describes the diversity of astroparticle physics data handling and serves as a basis to estimate a distribution of computing and storage tasks among the major computing centers. (Abridged)