77 - Iftach Sadeh 2020
The primary challenge in the study of explosive astrophysical transients is their detection and characterisation using multiple messengers. For this purpose, we have developed a new data-driven discovery framework, based on deep learning. We demonstrate its use for searches involving neutrinos, optical supernovae, and gamma rays. We show that we can match or substantially improve upon the performance of state-of-the-art techniques, while significantly minimising the dependence on modelling and on instrument characterisation. In particular, our approach is intended for near- and real-time analyses, which are essential for effective follow-up of detections. Our algorithm is designed to combine a range of instruments and types of input data, representing different messengers, physical regimes, and temporal scales. The methodology is optimised for agnostic searches of unexpected phenomena, and has the potential to substantially enhance their discovery prospects.
117 - Iftach Sadeh 2019
The next generation of observatories will facilitate the discovery of new types of astrophysical transients. The detection of such phenomena, whose characteristics are presently poorly constrained, will hinge on the ability to perform blind searches. We present a new algorithm for this purpose, based on deep learning. We incorporate two approaches, utilising anomaly detection and classification techniques. The first is model-independent, avoiding the use of background modelling and instrument simulations. The second method enables targeted searches, relying on generic spectral and temporal patterns as input. We compare our methodology with the existing approach to serendipitous detection of gamma-ray transients. We use our framework to derive the detection prospects of low-luminosity gamma-ray bursts with the upcoming Cherenkov Telescope Array. Our method is an unbiased, data-driven approach for multiwavelength and multi-messenger transient detection.
One of the central scientific goals of the next-generation Cherenkov Telescope Array (CTA) is the detection and characterization of gamma-ray bursts (GRBs). CTA will be sensitive to gamma rays with energies from about 20 GeV, up to a few hundred TeV. The energy range below 1 TeV is particularly important for GRBs. CTA will allow exploration of this regime with a ground-based gamma-ray facility with unprecedented sensitivity. As such, it will be able to probe radiation and particle acceleration mechanisms at work in GRBs. In this contribution, we describe POSyTIVE, the POpulation Synthesis Theory Integrated project for very high-energy emission. The purpose of the project is to make realistic predictions for the detection rates of GRBs with CTA, to enable studies of individual simulated GRBs, and to perform preparatory studies for time-resolved spectral analyses. The mock GRB population used by POSyTIVE is calibrated using the entire 40-year dataset of multi-wavelength GRB observations. As part of this project we explore theoretical models for prompt and afterglow emission of long and short GRBs, and predict the expected radiative output. Subsequent analyses are performed in order to simulate the observations with CTA, using the publicly available ctools and Gammapy frameworks. We present preliminary results of the design and implementation of this project.
206 - Iftach Sadeh 2019
The next generation of observatories will facilitate the discovery of new types of astrophysical transients. The detection of such phenomena, whose characteristics are presently poorly constrained, will hinge on the ability to perform blind searches. We present a new algorithm for this purpose, based on deep learning. We incorporate two approaches, utilising anomaly detection and classification techniques. The first is model-independent, avoiding the use of background modelling and instrument simulations. The second method enables targeted searches, relying on generic spectral and temporal patterns as input. We compare our methodology with the existing approach to serendipitous detection of gamma-ray transients. The algorithm is shown to be more robust, especially for non-trivial spectral features. We use our framework to derive the detection prospects of low-luminosity gamma-ray bursts with the upcoming Cherenkov Telescope Array. Our method is an unbiased, completely data-driven approach for multiwavelength and multi-messenger transient detection.
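The model-independent branch of such a search can be illustrated with a toy example: flag time bins whose photon counts are anomalous relative to a background estimated from the data themselves, with no instrument simulation. This is only a simple z-score stand-in with invented numbers, not the deep-learning anomaly detection the abstract describes.

```python
# Toy model-independent transient search: flag time bins whose counts
# deviate strongly from a background estimated from all other bins.
# Illustrative sketch only; the paper's method uses deep learning.
from statistics import mean, stdev

def find_anomalies(counts, threshold=5.0):
    """Return indices of bins whose excess exceeds a z-score threshold."""
    flagged = []
    for i, c in enumerate(counts):
        background = [x for j, x in enumerate(counts) if j != i]
        mu, sigma = mean(background), stdev(background)
        if sigma > 0 and (c - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Quiescent background with one injected flare in bin 5 (invented data).
light_curve = [98, 102, 97, 101, 99, 240, 100, 103, 96, 100]
print(find_anomalies(light_curve))  # -> [5]
```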
The Cherenkov Telescope Array (CTA) is the next generation gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a southern site, and about 20 in the north. Previous IACT experiments have used up to five telescopes. Subsequently, the design of a graphical user interface (GUI) for the operator of CTA poses an interesting challenge. In order to create an effective interface, the CTA team is collaborating with experts from the field of Human-Computer Interaction. We present here our GUI prototype. The back-end of the prototype is a Python Web server. It is integrated with the observation execution system of CTA, which is based on the Alma Common Software (ACS). The back-end incorporates a redis database, which facilitates synchronization of GUI panels. redis is also used to buffer information collected from various software components and databases. The front-end of the prototype is based on Web technology. Communication between Web server and clients is performed using Web Sockets, where graphics are generated with the d3.js Javascript library.
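The panel-synchronisation pattern described above can be sketched as follows: the server serialises per-panel state as JSON messages, which would be published on a redis channel and forwarded to browser clients over Web Sockets. The channel and field names here are illustrative, not the actual CTA GUI schema.

```python
# Sketch of GUI panel synchronisation: serialise panel state as JSON
# messages suitable for publishing on a redis channel and pushing to
# Web Socket clients. Message schema is hypothetical.
import json

def make_panel_update(panel_id, payload):
    """Build one synchronisation message for a GUI panel."""
    return json.dumps({
        "channel": "gui.panels",  # illustrative redis channel name
        "panel": panel_id,
        "data": payload,
    })

msg = make_panel_update("telescope_status", {"online": 97, "total": 99})
decoded = json.loads(msg)
print(decoded["panel"])  # -> telescope_status
```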
The Cherenkov Telescope Array (CTA) is a planned gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a Southern site, and about 20 in the North. Previous IACT experiments have used up to five telescopes. Subsequently, the design of a graphical user interface (GUI) for the operator of CTA involves new challenges. We present a GUI prototype, the concept for which is being developed in collaboration with experts from the field of Human-Computer Interaction. The prototype is based on Web technology; it incorporates a Python web server, Web Sockets and graphics generated with the d3.js Javascript library.
We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister and Lahav (2004), which now includes generation of full probability distribution functions (PDFs). ANNz2 utilizes multiple machine learning methods, such as artificial neural networks and boosted decision/regression trees. The objective of the algorithm is to optimize the performance of the photo-z estimation, to properly derive the associated uncertainties, and to produce both single-value solutions and PDFs. In addition, estimators are made available, which mitigate possible problems of non-representative or incomplete spectroscopic training samples. ANNz2 has already been used as part of the first weak lensing analysis of the Dark Energy Survey, and is included in the experiment's first public data release. Here we illustrate the functionality of the code using data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. The code is available for download at https://github.com/IftachSadeh/ANNZ .
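The dual output described above (a single-value solution plus a PDF) can be illustrated with a toy estimator. ANNz2 itself uses neural networks and boosted trees; this nearest-neighbour stand-in, with invented numbers, only shows how a point estimate and a crude PDF are derived from a spectroscopic training sample.

```python
# Toy photo-z estimator: not ANNz2's algorithm, just an illustration of
# producing both a point estimate and a crude PDF from a training sample.
def photo_z(color, train, k=3):
    """train: list of (color, spec_z) pairs. Return (point estimate, PDF)."""
    # Select the k training galaxies nearest in colour to the query.
    neighbours = sorted(train, key=lambda t: abs(t[0] - color))[:k]
    zs = [z for _, z in neighbours]
    point = sum(zs) / len(zs)
    # Crude PDF: normalised histogram of neighbour redshifts, bins of 0.1.
    pdf = {}
    for z in zs:
        b = round(z, 1)
        pdf[b] = pdf.get(b, 0.0) + 1.0 / len(zs)
    return point, pdf

# Invented training sample and query colour:
train = [(0.5, 0.10), (0.6, 0.12), (0.7, 0.14), (2.0, 0.80)]
point, pdf = photo_z(0.55, train)
print(round(point, 2))  # -> 0.12
```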
The gravitational redshift effect allows one to directly probe the gravitational potential in clusters of galaxies. Following up on Wojtak et al. [Nature (London) 477, 567 (2011)], we present a new measurement. We take advantage of new data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. We compare the spectroscopic redshift of the brightest cluster galaxies (BCGs) with that of galaxies at the outskirts of clusters, using a sample with an average cluster mass of $10^{14}\,M_{\odot}$. We find that these galaxies have an average relative redshift of -11 km/s compared with that of BCGs, with upper and lower uncertainties of +7 and -5 km/s. Our measurement is consistent with that of Wojtak et al. However, our derived uncertainty is larger, as we take into account various systematic effects, beyond the size of the dataset. The result is in good agreement with the predictions from general relativity.
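The observable behind this measurement is the line-of-sight velocity offset of a member galaxy relative to its BCG. A minimal sketch, using the standard relation dv = c (z_gal - z_BCG) / (1 + z_BCG) with invented redshifts:

```python
# Line-of-sight velocity offset of a cluster galaxy relative to its BCG.
# The example redshifts below are invented, chosen to give an offset of
# roughly -10 km/s, similar in magnitude to the measured signal.
C_KM_S = 299792.458  # speed of light in km/s

def velocity_offset(z_gal, z_bcg):
    """Velocity offset in km/s; negative values are blueshifts w.r.t. the BCG."""
    return C_KM_S * (z_gal - z_bcg) / (1.0 + z_bcg)

dv = velocity_offset(0.19996, 0.20000)
print(round(dv))  # -> -10
```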
130 - Iftach Sadeh 2013
The dijet double-differential cross section is measured as a function of the dijet invariant mass, using data taken during 2010 and during 2011 with the ATLAS experiment at the LHC, with a center-of-mass energy of 7 TeV. The measurements are sensitive to invariant masses between 70 GeV and 4.27 TeV with center-of-mass jet rapidities up to 3.5. A novel technique to correct jets for pile-up (additional proton-proton collisions) in the 2011 data is developed and subsequently used in the measurement. The data are found to be consistent with fixed-order NLO pQCD predictions provided by NLOJET++. The results constitute a stringent test of pQCD, in an energy regime previously unexplored. The dijet analysis is a confidence building step for the extraction of the signal of hard double parton scattering (DPS) in four-jet events, and subsequent extraction of the effective overlap area between the interacting protons, expressed in terms of the variable, sigma(eff). The measurement of DPS is performed using the 2010 ATLAS data. The rate of DPS events is estimated using a neural network. A clear signal is observed, under the assumption that the DPS signal can be represented by a random combination of exclusive dijet production. The fraction of DPS candidate events is determined to be f(DPS) = 0.081 +- 0.004 (stat.) +0.025-0.014 (syst.) in the analyzed phase-space of four-jet topologies. Combined with the measurement of the dijet and four-jet cross sections in the appropriate phase-space regions, the effective cross section is found to be sigma(eff) = 16.0 +0.5-0.8 (stat.) +1.9-3.5 (syst.) mb. This result is consistent within the quoted uncertainties with previous measurements of sigma(eff) at center-of-mass energies between 63 GeV and 7 TeV, using several final states.
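The combination step can be sketched with the commonly used "pocket formula" for two identical hard scatters, sigma_eff = sigma_2j**2 / (2 * sigma_DPS), where sigma_DPS = f_DPS * sigma_4j. The input values below are round hypothetical numbers, not the measured ATLAS cross sections.

```python
# "Pocket formula" for the effective cross section of double parton
# scattering with two identical subprocesses. Inputs are hypothetical;
# the actual measurement uses the ATLAS dijet and four-jet cross sections.
def sigma_eff(sigma_2j, sigma_4j, f_dps):
    """All cross sections in the same units; returns sigma_eff in those units."""
    sigma_dps = f_dps * sigma_4j        # DPS part of the four-jet cross section
    return sigma_2j ** 2 / (2.0 * sigma_dps)

# Round illustrative numbers (arbitrary units):
print(round(sigma_eff(sigma_2j=1.0, sigma_4j=0.5, f_dps=0.08), 2))  # -> 12.5
```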