
Self-Calibrating the Look-Elsewhere Effect: Fast Evaluation of the Statistical Significance Using Peak Heights

Posted by: Adrian Bayer
Publication date: 2021
Research field: Physics
Paper language: English





In experiments where one searches a large parameter space for an anomaly, one often finds many spurious noise-induced peaks in the likelihood. This is known as the look-elsewhere effect, and must be corrected for when performing statistical analysis. This paper introduces a method to calibrate the false alarm probability (FAP), or $p$-value, for a given dataset by considering the heights of the highest peaks in the likelihood. In the simplest form of self-calibration, the look-elsewhere-corrected $\chi^2$ of a physical peak is approximated by the $\chi^2$ of the peak minus the $\chi^2$ of the highest noise-induced peak. Generalizing this concept to consider lower peaks provides a fast method to quantify the statistical significance with improved accuracy. In contrast to alternative methods, this approach has negligible computational cost, as peaks in the likelihood are a byproduct of every peak-search analysis. We apply the method to examples from astronomy, including planet detection, periodograms, and cosmology.
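Below is a minimal sketch of the simplest form of self-calibration described in the abstract. It assumes the peak search has already returned the $\Delta\chi^2$ of the candidate peak and of the noise-induced peaks; the function name, the number of degrees of freedom, and the conversion of the corrected $\chi^2$ to a FAP via a $\chi^2$ tail probability are illustrative assumptions, not the paper's exact prescription.

```python
import numpy as np
from scipy.stats import chi2

def self_calibrated_fap(chi2_candidate, chi2_noise_peaks, ndof=1):
    """Approximate look-elsewhere-corrected significance of a candidate peak.

    In the simplest form of self-calibration, the corrected chi^2 is the
    candidate peak's chi^2 minus the chi^2 of the highest noise-induced peak
    found in the same likelihood scan.
    """
    chi2_corrected = chi2_candidate - np.max(chi2_noise_peaks)
    # Convert the corrected chi^2 into a false alarm probability (p-value),
    # here (as an assumption) via a chi^2 tail probability with `ndof` dof.
    fap = chi2.sf(chi2_corrected, df=ndof)
    return chi2_corrected, fap

# Example with made-up numbers: a candidate peak with delta-chi^2 = 40 and the
# three highest noise peaks recorded as a byproduct of the peak search.
corrected, fap = self_calibrated_fap(40.0, [18.0, 16.5, 15.2])
print(f"corrected chi^2 = {corrected:.1f}, FAP ~ {fap:.2e}")
```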


Read also

The cosmic optical background is an important observable that constrains energy production in stars and more exotic physical processes in the universe, and provides a crucial cosmological benchmark against which to judge theories of structure formation. Measurement of the absolute brightness of this background is complicated by local foregrounds like the Earth's atmosphere and sunlight reflected from local interplanetary dust, and large discrepancies in the inferred brightness of the optical background have resulted. Observations from probes far from the Earth are not affected by these bright foregrounds. Here we analyze data from the Long Range Reconnaissance Imager (LORRI) instrument on NASA's New Horizons mission acquired during cruise phase outside the orbit of Jupiter, and find a statistical upper limit on the optical background's brightness similar to the integrated light from galaxies. We conclude that a carefully performed survey with LORRI could yield uncertainties comparable to those from galaxy counting measurements.
Voronoi grids have been successfully used to represent density structures of gas in astronomical hydrodynamics simulations. While some codes are explicitly built around using a Voronoi grid, others, such as Smoothed Particle Hydrodynamics (SPH), use particle-based representations and can benefit from constructing a Voronoi grid for post-processing their output. So far, calculating the density of each Voronoi cell from SPH data has been done numerically, which is both slow and potentially inaccurate. This paper proposes an alternative analytic method, which is fast and accurate. We derive an expression for the integral of a cubic spline kernel over the volume of a Voronoi cell and link it to the density of the cell. Mass conservation is ensured rigorously by the procedure. The method can be applied more broadly to integrate a spherically symmetric polynomial function over the volume of a random polyhedron.
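As a rough illustration of the quantity involved (not the analytic expression derived in that paper), the sketch below estimates each Voronoi cell's density as the volume average of the SPH-interpolated density over the cell, using brute-force Monte Carlo sampling with nearest-generator assignment; the fixed smoothing length, kernel normalization, and all names are simplifying assumptions, and this is exactly the slow numerical route that an analytic kernel-volume integral would replace.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """Standard 3D cubic spline (M4) kernel with compact support radius h."""
    q = np.asarray(r) / h
    w = np.zeros_like(q)
    inner = q <= 0.5
    outer = (q > 0.5) & (q <= 1.0)
    w[inner] = 1.0 - 6.0 * q[inner] ** 2 + 6.0 * q[inner] ** 3
    w[outer] = 2.0 * (1.0 - q[outer]) ** 3
    return (8.0 / (np.pi * h ** 3)) * w

def voronoi_cell_densities(generators, particles, masses, h,
                           box=1.0, n_samples=20000):
    """Numerically estimate the mean SPH density in each Voronoi cell.

    Each random sample point is assigned to the cell of its nearest generator;
    the cell density is the average of rho(x) = sum_j m_j W(|x - x_j|, h) over
    the samples falling in that cell, which conserves mass on average.
    """
    rng = np.random.default_rng(42)
    samples = rng.uniform(0.0, box, size=(n_samples, 3))
    # SPH-interpolated density at every sample point.
    dist = np.linalg.norm(samples[:, None, :] - particles[None, :, :], axis=-1)
    rho = (masses[None, :] * cubic_spline_kernel(dist, h)).sum(axis=1)
    # Nearest-generator assignment defines the Voronoi cell of each sample.
    owner = np.argmin(
        np.linalg.norm(samples[:, None, :] - generators[None, :, :], axis=-1),
        axis=1,
    )
    return np.array([rho[owner == i].mean() for i in range(len(generators))])
```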
This white paper describes the science case for Very Long Baseline Interferometry (VLBI) and provides suggestions towards upgrade paths for the European VLBI Network (EVN). The EVN is a distributed long-baseline radio interferometric array that operates at the very forefront of astronomical research. Recent results, together with the new science possibilities outlined in this vision document, demonstrate the EVN's potential to generate new and exciting results that will transform our view of the cosmos. Together with e-MERLIN, the EVN provides a range of baseline lengths that permit unique studies of faint radio sources to be made over a wide range of spatial scales. The science cases are reviewed in six chapters that cover the following broad areas: cosmology, galaxy formation and evolution, innermost regions of active galactic nuclei, explosive phenomena and transients, stars and stellar masers in the Milky Way, celestial reference frames and space applications. The document concludes by identifying the synergies with other radio, as well as multi-band/multi-messenger instruments, and provides recommendations for future improvements. The appendices briefly describe other radio VLBI arrays, the technological framework for EVN developments, and a selection of spectral lines of astrophysical interest below 100 GHz. The document includes a glossary for non-specialists, and a list of acronyms at the end.
The Large Synoptic Survey Telescope is designed to provide an unprecedented optical imaging dataset that will support investigations of our Solar System, Galaxy and Universe, across half the sky and over ten years of repeated observation. However, exactly how the LSST observations will be taken (the observing strategy or cadence) is not yet finalized. In this dynamically-evolving community white paper, we explore how the detailed performance of the anticipated science investigations is expected to depend on small changes to the LSST observing strategy. Using realistic simulations of the LSST schedule and observation properties, we design and compute diagnostic metrics and Figures of Merit that provide quantitative evaluations of different observing strategies, analyzing their impact on a wide range of proposed science projects. This is work in progress: we are using this white paper to communicate to each other the relative merits of the observing strategy choices that could be made, in an effort to maximize the scientific value of the survey. The investigation of some science cases leads to suggestions for new strategies that could be simulated and potentially adopted. Notably, we find motivation for exploring departures from a spatially uniform annual tiling of the sky: focusing instead on different parts of the survey area in different years in a rolling cadence is likely to have significant benefits for a number of time domain and moving object astronomy projects. The communal assembly of a suite of quantified and homogeneously coded metrics is the vital first step towards an automated, systematic, science-based assessment of any given cadence simulation, that will enable the scheduling of the LSST to be as well-informed as possible.
Laboratory astrophysics and complementary theoretical calculations are the foundations of astronomy and astrophysics and will remain so into the foreseeable future. The impact of laboratory astrophysics ranges from the scientific conception stage for ground-based, airborne, and space-based observatories, all the way through to the scientific return of these projects and missions. It is our understanding of the underlying physical processes and the measurements of critical physical parameters that allow us to address fundamental questions in astronomy and astrophysics. In this regard, laboratory astrophysics is much like detector and instrument development at NASA, NSF, and DOE. These efforts are necessary for the success of astronomical research being funded by the agencies. Without concomitant efforts in all three directions (observational facilities, detector/instrument development, and laboratory astrophysics) the future progress of astronomy and astrophysics is imperiled. In addition, new developments in experimental technologies have allowed laboratory studies to take on a new role as some questions which previously could only be studied theoretically can now be addressed directly in the lab. With this in mind we, the members of the AAS Working Group on Laboratory Astrophysics, have prepared this State of the Profession Position Paper on the laboratory astrophysics infrastructure needed to ensure the advancement of astronomy and astrophysics in the next decade.