
A semi-supervised Machine Learning search for never-seen Gravitational-Wave sources

Posted by Tom Marianer
Publication date: 2020
Research field: Physics
Paper language: English





By now, tens of gravitational-wave (GW) events have been detected by the LIGO and Virgo detectors. These GWs have all been emitted by compact binary coalescences, for which we have excellent predictive models. However, there might be other sources for which we do not have reliable models. Some are expected to exist but to be very rare (e.g., supernovae), while others may be totally unanticipated. So far, no unmodeled sources have been discovered, but the lack of models makes the search for such sources much more difficult and less sensitive. We present here a search for unmodeled GW signals using semi-supervised machine learning. We apply deep learning and outlier detection algorithms to labeled spectrograms of GW strain data, and then search for spectrograms with anomalous patterns in public LIGO data. We searched $\sim 13\%$ of the coincident data from the first two observing runs. No GW signal candidates were detected in the data analyzed. We evaluate the sensitivity of the search using simulated signals, show that the search can detect spectrograms containing unusual or unexpected GW patterns, and report the waveforms and amplitudes for which a $50\%$ detection rate is achieved.
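For intuition, the sketch below illustrates the semi-supervised idea described above: a small convolutional network is trained on labeled spectrogram classes, and an off-the-shelf outlier detector is then run on its learned feature space to flag anomalous spectrograms. The architecture, the random stand-in data, and the choice of IsolationForest are illustrative assumptions, not the pipeline actually used in the paper.

```python
# Minimal sketch (not the paper's actual pipeline): train a small CNN on
# labeled spectrogram classes, then flag anomalous spectrograms by running
# an outlier detector (IsolationForest) on the penultimate-layer features.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import IsolationForest

class SpectrogramCNN(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x))

# Hypothetical labeled spectrograms: 1x64x64 images with 4 known classes.
x_lab = torch.randn(256, 1, 64, 64)
y_lab = torch.randint(0, 4, (256,))

model = SpectrogramCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):                       # supervised stage on labeled data
    opt.zero_grad()
    loss = loss_fn(model(x_lab), y_lab)
    loss.backward()
    opt.step()

# Unsupervised stage: score unlabeled spectrograms in the learned feature space.
model.eval()
with torch.no_grad():
    f_lab = model.features(x_lab).numpy()
    x_unl = torch.randn(64, 1, 64, 64)   # stand-in for unlabeled LIGO data
    f_unl = model.features(x_unl).numpy()

detector = IsolationForest(random_state=0).fit(f_lab)
scores = detector.score_samples(f_unl)   # lower score = more anomalous
candidates = np.argsort(scores)[:5]      # most outlier-like spectrograms
print(candidates, scores[candidates])
```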


Read also

Photometric variability detection is often considered as a hypothesis testing problem: an object is variable if the null hypothesis that its brightness is constant can be ruled out given the measurements and their uncertainties. Uncorrected systematic errors limit the practical applicability of this approach to high-amplitude variability and well-behaved data sets. Searching for a new variability detection technique that would be applicable to a wide range of variability types while being robust to outliers and underestimated measurement uncertainties, we propose to treat variability detection as a classification problem that can be approached with machine learning. We compare several classification algorithms: Logistic Regression (LR), Support Vector Machines (SVM), k-Nearest Neighbors (kNN), Neural Nets (NN), Random Forests (RF) and the Stochastic Gradient Boosting classifier (SGB), applied to 18 features (variability indices) quantifying scatter and/or correlation between points in a light curve. We use a subset of OGLE-II Large Magellanic Cloud (LMC) photometry (30265 light curves) that was searched for variability using traditional methods (168 known variable objects identified) as the training set, and then apply the NN to a new test set of 31798 OGLE-II LMC light curves. Among 205 candidates selected in the test set, 178 are real variables, and 13 low-amplitude variables are new discoveries. We find that the considered machine learning classifiers are more efficient (they find more variables and fewer false candidates) than traditional techniques that consider individual variability indices or their linear combination. The NN, SGB, SVM and RF show a higher efficiency than LR and kNN.
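As a rough illustration of the classifier comparison described in this abstract, the sketch below runs the same six scikit-learn model families on synthetic stand-ins for the 18 variability indices; the feature construction, class balance, and scoring metric are placeholders rather than the OGLE-II setup.

```python
# Illustrative comparison of the six classifier families on synthetic features;
# the 18 columns stand in for the variability indices, with a rare "variable" class.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# 18 features per light curve, heavily imbalanced classes (few true variables).
X, y = make_classification(n_samples=3000, n_features=18, n_informative=10,
                           weights=[0.99, 0.01], random_state=0)

classifiers = {
    "LR":  LogisticRegression(max_iter=1000),
    "SVM": SVC(class_weight="balanced"),
    "kNN": KNeighborsClassifier(),
    "NN":  MLPClassifier(max_iter=1000),
    "RF":  RandomForestClassifier(n_estimators=200),
    "SGB": GradientBoostingClassifier(),
}
for name, clf in classifiers.items():
    pipe = make_pipeline(StandardScaler(), clf)
    score = cross_val_score(pipe, X, y, cv=5, scoring="f1").mean()
    print(f"{name}: mean F1 = {score:.3f}")
```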
H.G. Khachatryan (2021)
We consider a machine learning algorithm to detect and identify strong gravitational lenses in sky images. First, we simulate artificial but highly realistic images of galaxies, stars and strong lenses, using six different methods, i.e., two for each class. Then we deploy a convolutional neural network architecture to classify these simulated images. We show that after training, the neural network achieves about 93 percent classification accuracy. As a simple test of the convolutional neural network's efficiency, we apply it to a real image of the Einstein Cross. The deployed network classifies it as a gravitational lens, thus opening the way for a variety of lens-search applications of the deployed machine learning scheme.
Many continuous gravitational wave searches are affected by instrumental spectral lines that could be confused with a continuous astrophysical signal. Several techniques have been developed to limit the effect of these lines by penalising signals that appear in only a single detector. We have developed a general method, using a convolutional neural network, to reduce the impact of instrumental artefacts on searches that use the SOAP algorithm. The method can identify features in corresponding frequency bands of each detector and classify these bands as containing a signal, an instrumental line, or noise. We tested the method against four different data sets: Gaussian noise with time gaps, data from the final run of Initial LIGO (S6) with signals added, the reference S6 mock data challenge data set, and signals injected into data from the second Advanced LIGO observing run (O2). Using the S6 mock data challenge data set and at a 1% false alarm probability, we showed that at 95% efficiency a fully automated SOAP search has a sensitivity corresponding to a coherent signal-to-noise ratio of 110, equivalent to a sensitivity depth of 10 Hz$^{-1/2}$, making this automated search competitive with other searches requiring significantly more computing resources and human intervention.
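The band-classification step can be pictured with the following hedged sketch: a small 1-D convolutional network takes the corresponding frequency band from two detectors as two input channels and assigns one of three labels (signal, instrumental line, noise). The architecture, data shapes and training loop are assumptions for illustration, not those of the actual SOAP follow-up network.

```python
# Toy 3-class band classifier: two detector channels in, one label per band out.
import torch
import torch.nn as nn

class BandClassifier(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, n_classes),
        )

    def forward(self, x):          # x: (batch, 2 detectors, n_time_bins)
        return self.net(x)

# Random stand-in data: 2-detector power tracks in a narrow frequency band.
x = torch.randn(128, 2, 1024)
y = torch.randint(0, 3, (128,))    # 0 = signal, 1 = instrumental line, 2 = noise
model = BandClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for _ in range(3):
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()
probs = torch.softmax(model(x[:4]), dim=1)   # per-band class probabilities
print(probs)
```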
In this paper, we report on the construction of a deep Artificial Neural Network (ANN) to localize simulated gravitational wave signals in the sky with high accuracy. We have modelled the sky as a sphere and have considered cases where the sphere is divided into 18, 50, 128, 1024, 2048 and 4096 sectors. The sky direction of the gravitational wave source is estimated by classifying the signal into one of these sectors based on its right ascension and declination values for each of these cases. In order to do this, we have injected simulated binary black hole gravitational wave signals with component masses sampled uniformly between 30 and 80 solar masses into Gaussian noise and used the whitened strain values to obtain the input features for training our ANN. We input features such as the delays in arrival times, phase differences and amplitude ratios at each of the three detectors (Hanford, Livingston and Virgo), computed from the raw time-domain strain values as well as from analytic
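A simplified sketch of the sector-classification idea follows: the sphere is divided into a grid of sectors in right ascension and declination, and a dense network maps hand-built features (arrival-time delays, phase differences and amplitude ratios at the three detectors) to a sector label. The 10x5 grid, the random placeholder features and the network size below are illustrative assumptions, not the paper's trained ANN or injection set.

```python
# Sector assignment from (ra, dec) plus a dense classifier on stand-in features.
import numpy as np
from sklearn.neural_network import MLPClassifier

N_RA, N_DEC = 10, 5                      # 50-sector example (one of the listed grids)

def sector_index(ra, dec):
    """Map ra in [0, 2pi) and dec in [-pi/2, pi/2] to a sector label."""
    i_ra = np.minimum((ra / (2 * np.pi) * N_RA).astype(int), N_RA - 1)
    i_dec = np.minimum(((dec + np.pi / 2) / np.pi * N_DEC).astype(int), N_DEC - 1)
    return i_dec * N_RA + i_ra

rng = np.random.default_rng(0)
n = 4000
ra = rng.uniform(0, 2 * np.pi, n)
dec = np.arcsin(rng.uniform(-1, 1, n))   # uniform on the sphere
labels = sector_index(ra, dec)

# 9 placeholder features: 3 time delays, 3 phase differences, 3 amplitude ratios.
features = rng.normal(size=(n, 9)) + labels[:, None] * 0.01

clf = MLPClassifier(hidden_layer_sizes=(128, 128), max_iter=200, random_state=0)
clf.fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```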
Gravitational waves from the coalescence of compact-binary sources are now routinely observed by Earth-bound detectors. The most sensitive search algorithms convolve many different pre-calculated gravitational waveforms with the detector data and look for coincident matches between different detectors. Machine learning is being explored as an alternative approach to building a search algorithm that has the prospect of reducing computational costs and targeting more complex signals. In this work we construct a two-detector search for gravitational waves from binary black hole mergers using neural networks trained on non-spinning binary black hole data from a single detector. The network is applied to the data from both observatories independently and we check for events coincident in time between the two. This enables the efficient analysis of large quantities of background data by time-shifting the independent detector data. We find that while for a single detector the network retains $91.5\%$ of the sensitivity matched filtering can achieve, this number drops to $83.9\%$ for two observatories. To enable the network to check for signal consistency across the detectors, we then construct a set of simple networks that operate directly on data from both detectors. We find that none of these simple two-detector networks is capable of improving the sensitivity over applying networks individually to the data from each detector and searching for time coincidences.
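The coincidence and time-shift logic described above can be sketched independently of the trained networks. In the toy example below, per-detector network triggers are replaced by random times, and the background is estimated by sliding one detector's triggers by offsets much larger than the inter-site light-travel time; the trigger rates, coincidence window and slide step are assumed values, not those used in the paper.

```python
# Toy two-detector coincidence search with time-slide background estimation.
import numpy as np

rng = np.random.default_rng(1)
obs_time = 1e5                                  # seconds of analysed data (toy)
t_h1 = np.sort(rng.uniform(0, obs_time, 200))   # trigger times, detector 1
t_l1 = np.sort(rng.uniform(0, obs_time, 200))   # trigger times, detector 2
window = 0.02                                   # coincidence window [s] (assumed)

def n_coincidences(t_a, t_b, window):
    """Count triggers in t_a that have a partner in sorted t_b within +/- window."""
    idx = np.searchsorted(t_b, t_a)
    left = np.abs(t_a - t_b[np.clip(idx - 1, 0, len(t_b) - 1)]) <= window
    right = np.abs(t_a - t_b[np.clip(idx, 0, len(t_b) - 1)]) <= window
    return int(np.sum(left | right))

foreground = n_coincidences(t_h1, t_l1, window)

# Background from time slides: shift detector 2 by offsets far larger than the
# light-travel time between sites, so no astrophysical coincidence survives.
shifts = np.arange(1, 101) * 10.0               # 100 slides of 10 s each
background = [n_coincidences(t_h1, np.sort((t_l1 + s) % obs_time), window)
              for s in shifts]
print("zero-lag coincidences:", foreground)
print("mean background per slide:", np.mean(background))
```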