
Guiding New Physics Searches with Unsupervised Learning

Posted by Andrea De Simone
Publication date: 2018
Paper language: English





We propose a new scientific application of unsupervised learning techniques to boost our ability to search for new phenomena in data, by detecting discrepancies between two datasets. These could be, for example, a simulated Standard Model background and an observed dataset containing a potential hidden signal of New Physics. We build a statistical test upon a test statistic that measures deviations between two samples, using a Nearest Neighbors approach to estimate the local ratio of the density of points. The test is model-independent and non-parametric, requiring no knowledge of the shape of the underlying distributions, and it does not bin the data, thus retaining full information from the multidimensional feature space. As a proof of concept, we apply our method to synthetic Gaussian data and to a simulated dark matter signal at the Large Hadron Collider. Even in the case where the background cannot be simulated accurately enough to claim discovery, the technique is a powerful tool for identifying regions of interest for further study.
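To make the procedure concrete, here is a minimal, illustrative sketch (our own, not the authors' code) of a nearest-neighbor two-sample test: the statistic is the standard k-nearest-neighbor estimate of the Kullback-Leibler divergence between the trial and benchmark samples, which encodes the local density ratio, and its significance is calibrated with a permutation test. The function names and the default k=5 are our choices.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_two_sample_statistic(benchmark, trial, k=5):
    """kNN estimate of the KL divergence D(trial || benchmark), k >= 2.

    benchmark : (n, d) array, e.g. simulated SM background events
    trial     : (m, d) array, e.g. observed events with a possible signal
    """
    benchmark = np.asarray(benchmark, dtype=float)
    trial = np.asarray(trial, dtype=float)
    m, d = trial.shape
    n = benchmark.shape[0]
    # r_k: distance from each trial point to its k-th neighbour within the
    # trial sample (query k+1 because the closest hit is the point itself).
    r_k = cKDTree(trial).query(trial, k=k + 1)[0][:, -1]
    # s_k: distance from each trial point to its k-th neighbour in the benchmark.
    s_k = cKDTree(benchmark).query(trial, k=k)[0][:, -1]
    # kNN Kullback-Leibler estimator built from the local density ratio.
    return (d / m) * np.log(s_k / r_k).sum() + np.log(n / (m - 1.0))

def permutation_pvalue(benchmark, trial, k=5, n_perm=100, seed=0):
    """Calibrate the statistic by shuffling events between the two samples."""
    rng = np.random.default_rng(seed)
    observed = knn_two_sample_statistic(benchmark, trial, k)
    pooled = np.vstack([benchmark, trial])
    m = len(trial)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # permute rows in place
        hits += knn_two_sample_statistic(pooled[m:], pooled[:m], k) >= observed
    return (hits + 1) / (n_perm + 1)

# Toy check: Gaussian background vs. background plus a small shifted "signal".
rng = np.random.default_rng(42)
background = rng.normal(0.0, 1.0, size=(2000, 3))
observed = np.vstack([rng.normal(0.0, 1.0, size=(1900, 3)),
                      rng.normal(2.0, 0.5, size=(100, 3))])
print(permutation_pvalue(background, observed))  # small p-value expected
```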


Read also

We reframe common tasks in jet physics in probabilistic terms, including jet reconstruction, Monte Carlo tuning, matrix-element/parton-shower matching for large jet multiplicity, and efficient event generation of jets in complex, signal-like regions of phase space. We also introduce Ginkgo, a simplified generative model for jets that facilitates research into these tasks with techniques from statistics, machine learning, and combinatorial optimization. We review some of the recent research in this direction that has been enabled with Ginkgo. We show how probabilistic programming can be used to efficiently sample the showering process, how a novel trellis algorithm can be used to efficiently marginalize over the enormous number of clustering histories for the same observed particles, and how dynamic programming, A* search, and reinforcement learning can be used to find the maximum-likelihood clustering in this enormous search space. This work builds bridges with work in hierarchical clustering, statistics, combinatorial optimization, and reinforcement learning.
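To give a sense of the scale of the marginalization mentioned above: the number of distinct binary clustering histories over N observed particles grows as the double factorial (2N - 3)!!, which is why a trellis algorithm or heuristic search is needed rather than brute-force enumeration. A tiny illustration (ours, not part of Ginkgo):

```python
def num_clustering_histories(n):
    """Number of binary hierarchical clusterings of n leaves: (2n - 3)!!"""
    count = 1
    for factor in range(3, 2 * n - 2, 2):  # 3 * 5 * ... * (2n - 3)
        count *= factor
    return count

for n in (2, 5, 10, 20):
    print(n, num_clustering_histories(n))
# 20 particles already admit roughly 8.2e21 possible histories.
```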
SND@LHC is an approved experiment equipped to detect the scattering of neutrinos produced in the far-forward direction at the LHC and to measure their properties. In addition, the detector has the potential to search for new feebly interacting particles (FIPs) that may be produced in proton-proton collisions. In this paper, we discuss FIP signatures at SND@LHC, considering two classes of particles: stable FIPs that may be detected via their scattering, and unstable FIPs that decay inside the detector. We estimate the sensitivity of SND@LHC to probe the scattering of leptophobic dark matter and to detect decays of neutrino-, scalar-, and vector-portal particles. Finally, we compare and qualitatively analyze the potential of the SND@LHC and FASER/FASER$\nu$ experiments for these searches.
We propose a novel way to search for feebly interacting massive particles, exploiting two properties of systems involving collisions between high-energy electrons and intense laser pulses. The first property is that the electron-intense-laser collision results in a large flux of hard photons, as the laser behaves effectively as a thick medium. The second property is that the emitted photons free-stream inside the laser, so for them the laser behaves effectively as a very thin medium. Combining these two features implies that the electron-intense-laser collision is an apparatus which can efficiently convert UV electrons into a large flux of hard, collinear photons. We further propose to direct this unique large and hard flux of photons onto a physical dump, which in turn is capable of producing feebly interacting massive particles in a region of parameter space that has never been probed before. We denote this novel apparatus an ``optical dump'', or NPOD (new physics search with optical dump). The proposed LUXE experiment at the Eu.XFEL has all the required basic ingredients of the above experimental concept. We discuss how this concept can be realized in practice by adding a detector after the last physical dump of the experiment to reconstruct the two-photon decay product of a new spin-0 particle. We show that even with a relatively short dump, the search can still be background-free. Remarkably, even with a 40 TW laser, which corresponds to the initial run, and certainly with the 350 TW laser of the main run with one year of data taking, LUXE-NPOD will be able to probe uncharted territory in models of both pseudo-scalar and scalar fields, and in particular to probe natural scalar theories for masses above 100 MeV.
Photonic quantum computers provide several benefits over the discrete qubit-based paradigm of quantum computing. Using the power of continuous-variable computing, we build an anomaly detection model for use in searches for New Physics. Our model uses Gaussian Boson Sampling (GBS), a $\#$P-hard problem and thus not efficiently accessible to classical devices. This is used to create feature vectors from graph data, a natural format for representing data of high-energy collision events. A simple K-means clustering algorithm is used to provide a baseline method of classification. We then present a novel method of anomaly detection, combining the use of Gaussian Boson Sampling and a quantum extension of K-means known as Q-means. This is found to give results equivalent to the classical clustering version while also reducing the complexity, with respect to the sample's feature-vector length, from $\mathcal{O}(N)$ to $\mathcal{O}(\log N)$. Owing to the speed of the sampling algorithm and the feasibility of near-term photonic quantum devices, anomaly detection at the trigger level can become practical in future LHC runs.
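As a rough sketch of the classical baseline described above (ours, not the paper's pipeline): events are mapped to feature vectors, K-means is fit on background-like data, and the distance to the nearest centroid serves as the anomaly score. The GBS feature-extraction step is hardware-specific, so toy Gaussian features stand in for it here.

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_anomaly_scores(background_features, test_features, n_clusters=2):
    """Anomaly score = distance from each event to its nearest centroid."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    km.fit(background_features)
    # transform() returns distances to every centroid; keep the minimum.
    return km.transform(test_features).min(axis=1)

# Toy stand-in for GBS-derived feature vectors.
rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, size=(500, 8))
anomalies = rng.normal(3.0, 1.0, size=(20, 8))
scores = kmeans_anomaly_scores(background, np.vstack([background[:50], anomalies]))
print(scores[:5])   # background-like events: small scores
print(scores[-5:])  # anomalous events: large scores
```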
We present the prospects of an angular analysis of the $\Lambda_b \to \Lambda(1520)\ell^+\ell^-$ decay. Using the expected yield in the current dataset collected at the LHCb experiment, as well as the foreseen yields after the LHCb upgrades, sensitivity studies are presented to determine the experimental precision on angular observables related to the lepton distribution and their potential to identify New Physics. The forward-backward lepton asymmetry at low dilepton invariant mass is particularly promising. NP scenarios favoured by the current anomalies in $b \to s\ell^+\ell^-$ decays can be distinguished from the SM case with the data collected between Run 3 and Upgrade 2 of the LHCb experiment.
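For reference, the forward-backward lepton asymmetry highlighted above is a counting observable built from the cosine of the lepton helicity angle. A minimal sketch of its estimate and statistical uncertainty on toy data (our illustration, not LHCb code):

```python
import numpy as np

def forward_backward_asymmetry(cos_theta_l):
    """A_FB = (N_forward - N_backward) / N_total from cos(theta_l) values."""
    cos_theta_l = np.asarray(cos_theta_l)
    n_tot = len(cos_theta_l)
    afb = (np.sum(cos_theta_l > 0) - np.sum(cos_theta_l < 0)) / n_tot
    # Statistical uncertainty of an asymmetry from binomial counting.
    err = np.sqrt((1.0 - afb**2) / n_tot)
    return afb, err

# Toy angular distribution mildly skewed toward the forward hemisphere.
rng = np.random.default_rng(1)
toy = np.clip(rng.normal(0.1, 0.5, size=10_000), -1.0, 1.0)
print(forward_backward_asymmetry(toy))
```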