
A Mock Data Challenge for the Einstein Gravitational-Wave Telescope

Published by: Tania Regimbau
Publication date: 2012
Research field: Physics
Paper language: English





Einstein Telescope (ET) is conceived to be a third-generation gravitational-wave observatory. Its amplitude sensitivity would be a factor of ten better than advanced LIGO and Virgo, and it could also extend the low-frequency sensitivity down to 1--3 Hz, compared to the 10--20 Hz of advanced detectors. Such an observatory will have the potential to observe a variety of different GW sources, including compact binary systems at cosmological distances. ET's expected reach for binary neutron star (BNS) coalescences is out to redshift $z \simeq 2$, and the rate of detectable BNS coalescences could be as high as one every few tens or hundreds of seconds, each lasting up to several days in the sensitive frequency band of ET. With such a signal-rich environment, a key question in data analysis is whether overlapping signals can be discriminated. In this paper we simulate the GW signals from a cosmological population of BNS and ask the following questions: Does this population create a confusion background that limits ET's ability to detect foreground sources? How efficient are current algorithms in discriminating overlapping BNS signals? Is it possible to discern the presence of a population of signals in the data by cross-correlating data from different detectors in the ET observatory? We find that algorithms currently used to analyze LIGO and Virgo data are already powerful enough to detect the sources expected in ET, but new algorithms are required to fully exploit ET data.
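The cross-correlation question raised in the abstract can be illustrated with a toy calculation. The sketch below is only an illustration of the general technique, not the paper's analysis pipeline: two simulated detector streams share a weak common component, and the zero-lag cross-correlation statistic picks it out of independent noise. The sample rate, amplitudes, and durations are invented for the example.

```python
# Toy sketch of the cross-correlation idea (not the paper's pipeline):
# two detectors see the same weak "background" on top of independent noise,
# and the zero-lag cross-correlation statistic isolates the shared power.
# Sample rate, amplitudes, and durations below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
fs = 128                      # sample rate in Hz (illustrative)
n = fs * 512                  # total number of samples

background = 0.2 * rng.standard_normal(n)        # weak common signal
det1 = background + rng.standard_normal(n)       # detector 1 stream
det2 = background + rng.standard_normal(n)       # detector 2 stream

# Mean of the sample-by-sample product: the independent noise averages
# toward zero while the common component contributes its variance (~0.04).
zero_lag = np.mean(det1 * det2)

# Time-shifting one stream destroys the correlation and gives a null estimate.
shifted = np.mean(det1 * np.roll(det2, fs * 30))

print(f"zero-lag statistic:   {zero_lag:.4f}")
print(f"time-shifted (null):  {shifted:.4f}")
```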




Read also

The observation of the binary neutron star merger GW170817, along with its optical counterpart, provided the first constraint on the Hubble constant $H_0$ using gravitational-wave standard sirens. When no counterpart is identified, a galaxy catalog can be used to provide the necessary redshift information. However, the true host might not be contained in a catalog that is not complete out to the limit of gravitational-wave detectability. These electromagnetic and gravitational-wave selection effects must be accounted for. We describe and implement a method to estimate $H_0$ using both the counterpart and the galaxy catalog standard siren methods. We perform a series of mock data analyses using binary neutron star mergers to confirm our ability to recover an unbiased estimate of $H_0$. Our simulations used a simplified universe with no redshift uncertainties or galaxy clustering, but with different magnitude-limited catalogs and assumed host galaxy properties, to test our treatment of both selection effects. We explore how the incompleteness of catalogs affects the final measurement of $H_0$, as well as the effect of weighting each galaxy's likelihood of being a host by its luminosity. In our most realistic simulation, where the simulated catalog is about three times denser than the density of galaxies in the local universe, we find that a 4.4% measurement precision can be reached using galaxy catalogs with 50% completeness and $\sim 250$ binary neutron star detections with sensitivity similar to that of Advanced LIGO's second observing run.
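As a rough illustration of the standard siren idea in the abstract above (and only an illustration, with invented numbers and none of the selection-effect or catalog machinery the paper actually models), at low redshift the Hubble constant follows from $H_0 \approx cz/d_L$, with the luminosity distance $d_L$ read off the gravitational-wave signal and the redshift $z$ supplied by a counterpart or catalog galaxy:

```python
# Toy standard siren estimate (illustration only; real analyses must model
# selection effects, catalog incompleteness, and redshift uncertainties).
# All numbers here are invented assumptions, not results from the paper.
import numpy as np

c = 299792.458                          # speed of light in km/s
rng = np.random.default_rng(7)

true_H0 = 70.0                          # km/s/Mpc assumed for the mock data
z = rng.uniform(0.01, 0.05, 250)        # redshifts of ~250 mock BNS events
d_true = c * z / true_H0                # low-redshift Hubble-law distances (Mpc)
d_gw = d_true * (1 + 0.15 * rng.standard_normal(z.size))  # 15% GW distance scatter

h0_events = c * z / d_gw                # per-event H0 estimates
h0_mean = h0_events.mean()              # naive average (ignores the small bias
h0_err = h0_events.std(ddof=1) / np.sqrt(z.size)  # from inverting noisy distances)
print(f"combined estimate: H0 = {h0_mean:.1f} +/- {h0_err:.1f} km/s/Mpc")
```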
We describe the design of a gravitational wave timing array, a novel scheme that can be used to search for low-frequency gravitational waves by monitoring continuous gravitational waves at higher frequencies. We show that observations of gravitational waves produced by Galactic binaries using a space-based detector like LISA provide sensitivity in the nanohertz to microhertz band. While the expected sensitivity of this proposal is not competitive with other methods, it fills a gap in frequency space around the microhertz regime, which is above the range probed by current pulsar timing arrays and below the expected direct frequency coverage of LISA. The low-frequency extension of sensitivity does not require any experimental design change to space-based gravitational wave detectors, and can be achieved with the data products that would already be collected by them.
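A minimal sketch of the underlying idea, assuming (as the abstract describes) that a low-frequency wave shows up as a slow modulation of the phase of a monitored high-frequency continuous signal. The frequencies and amplitudes below are arbitrary illustrations, not the proposal's actual numbers:

```python
# Toy phase-modulation sketch (illustration of the idea, not the proposal):
# a slow wave imprints a drifting phase on a high-frequency "carrier" signal;
# demodulating against the known carrier and averaging recovers the slow wave.
import numpy as np

fs = 200.0                              # sample rate in Hz (illustrative)
t = np.arange(0.0, 2000.0, 1.0 / fs)    # ~33 minutes of mock data
f_carrier = 5.0                         # monitored continuous signal (Hz)
f_slow = 0.002                          # low-frequency wave to recover (Hz)

slow_phase = 0.1 * np.sin(2 * np.pi * f_slow * t)      # slow phase modulation
data = np.cos(2 * np.pi * f_carrier * t + slow_phase)

# Demodulate: mixing with the carrier leaves ~0.5*exp(i*slow_phase) plus a
# fast term at twice the carrier frequency that averages away in each block.
mixed = data * np.exp(-2j * np.pi * f_carrier * t)
block = int(10 * fs)                    # 10-second averages as a crude low-pass
n_blocks = mixed.size // block
recovered = np.angle(mixed[: n_blocks * block].reshape(n_blocks, block).mean(axis=1))
# 'recovered' now tracks slow_phase sampled once per 10-second block.
print(recovered[:5])
```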
The Mock LISA Data Challenges are a program to demonstrate LISA data-analysis capabilities and to encourage their development. Each round of challenges consists of one or more datasets containing simulated instrument noise and gravitational waves from sources of undisclosed parameters. Participants analyze the datasets and report best-fit solutions for the source parameters. Here we present the results of the third challenge, issued in Apr 2008, which demonstrated the positive recovery of signals from chirping Galactic binaries, from spinning supermassive--black-hole binaries (with optimal SNRs between ~ 10 and 2000), from simultaneous extreme-mass-ratio inspirals (SNRs of 10-50), from cosmic-string-cusp bursts (SNRs of 10-100), and from a relatively loud isotropic background with Omega_gw(f) ~ 10^-11, slightly below the LISA instrument noise.
The Mock LISA Data Challenges are a programme to demonstrate and encourage the development of LISA data-analysis capabilities, tools and techniques. At the time of this workshop, three rounds of challenges had been completed, and the next was about to start. In this article we provide a critical analysis of entries to the latest completed round, Challenge 1B. The entries confirm the consolidation of a range of data-analysis techniques for Galactic and massive--black-hole binaries, and they include the first convincing examples of detection and parameter estimation of extreme--mass-ratio inspiral sources. In this article we also introduce the next round, Challenge 3. Its data sets feature more realistic waveform models (e.g., Galactic binaries may now chirp, and massive--black-hole binaries may precess due to spin interactions), as well as new source classes (bursts from cosmic strings, isotropic stochastic backgrounds) and more complicated nonsymmetric instrument noise.
The Mock LISA Data Challenges are a program to demonstrate LISA data-analysis capabilities and to encourage their development. Each round of challenges consists of several data sets containing simulated instrument noise and gravitational-wave sources of undisclosed parameters. Participants are asked to analyze the data sets and report the maximum information about source parameters. The challenges are being released in rounds of increasing complexity and realism: in this proceeding we present the results of Challenge 2, issued in January 2007, which successfully demonstrated the recovery of signals from supermassive black-hole binaries, from ~20,000 overlapping Galactic white-dwarf binaries, and from the extreme-mass-ratio inspirals of compact objects into central galactic black holes.