Recent years have seen a burgeoning interest in using pulsar timing arrays (PTAs) as gravitational-wave (GW) detectors. To date, that interest has focused mainly on three particularly promising source types: supermassive-black-hole binaries, cosmic strings, and the stochastic background from early-Universe phase transitions. In this paper, by contrast, our aim is to investigate the PTA potential for discovering unanticipated sources. We derive significant constraints on the available discovery space based solely on energetic and statistical considerations: we show that a PTA detection of GWs at frequencies above ~3×10^-5 Hz would either be an extraordinary coincidence or violate cherished beliefs; we show that for PTAs GW memory can be more detectable than direct GWs, and that, as we consider events at ever higher redshift, the memory effect increasingly dominates an event's total signal-to-noise ratio. The paper also includes a simple analysis of the effects of pulsar red noise in PTA searches, and a demonstration that the effects of periodic GWs in the 10^-8 to 10^-4.5 Hz band would not be degenerate with small errors in standard pulsar parameters (except in a few narrow bands).
Curt Cutler (2011)
Rapidly rotating, slightly non-axisymmetric neutron stars emit nearly periodic gravitational waves (GWs), quite possibly at levels detectable by ground-based GW interferometers. We refer to these sources as GW pulsars. For any given sky position and frequency evolution, the F-statistic is the optimal (frequentist) statistic for the detection of GW pulsars. However, in all-sky searches for previously unknown GW pulsars, it would be computationally intractable to calculate the (fully coherent) F-statistic at every point of a (suitably fine) grid covering the parameter space: the number of gridpoints is many orders of magnitude too large for that. Here we introduce a phase-relaxed F-statistic, which we denote F_pr, for incoherently combining the results of fully coherent searches over short time intervals. We estimate (very roughly) that for realistic searches, our F_pr is ~10-15% more sensitive than the semi-coherent F-statistic that is currently used. Moreover, as a byproduct of computing F_pr, one obtains a rough determination of the time-evolving phase offset between one's template and the true signal embedded in the detector noise. Almost all the ingredients that go into calculating F_pr are already implemented in LAL, so we expect that relatively little additional effort would be required to develop a search code that uses F_pr.
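The semi-coherent strategy described above can be illustrated with a toy sketch: compute a coherent matched-filter power in each short data segment, then sum those powers incoherently across segments (discarding the inter-segment phase, which is what the paper's F_pr relaxes in a more refined way). The function names and the white-noise, single-detector setup here are illustrative assumptions, not the paper's actual F_pr implementation.

```python
import numpy as np

def segment_power(data, template):
    # Coherent matched-filter power for one segment, assuming white unit-variance
    # noise: |<template, data>|^2 normalized by the template's power.
    inner = np.vdot(template, data)          # conjugates the template
    norm = np.vdot(template, template).real
    return (abs(inner) ** 2) / norm

def semi_coherent_stat(segments, templates):
    # Incoherent combination: sum per-segment coherent powers, so the
    # signal phase need not be tracked continuously across segment boundaries.
    return sum(segment_power(d, t) for d, t in zip(segments, templates))
```

In a real search the per-segment templates would come from a parameter-space grid, and the detection threshold would be set from the statistic's noise-only distribution.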
We show that the Big Bang Observer (BBO), a proposed space-based gravitational-wave (GW) detector, would provide ultra-precise measurements of cosmological parameters. By detecting ~300,000 compact-star binaries, and utilizing them as standard sirens, BBO would determine the Hubble constant to 0.1%, and the dark energy parameters w_0 and w_a to ~0.01 and ~0.1, respectively. BBO's dark-energy figure of merit would be approximately an order of magnitude better than all other proposed dark energy missions. To date, BBO has been designed with the primary goal of searching for gravitational waves from inflation. To observe this inflationary background, BBO would first have to detect and subtract out ~300,000 merging compact-star binaries, out to z~5. It is precisely this foreground which would enable high-precision cosmology. BBO would determine the luminosity distance to each binary to ~percent accuracy. BBO's angular resolution would be sufficient to uniquely identify the host galaxy for most binaries; a coordinated optical/infrared observing campaign could obtain the redshifts. Combining the GW-derived distances and EM-derived redshifts for such a large sample of objects leads to extraordinarily tight constraints on cosmological parameters. Such "standard siren" measurements of cosmology avoid many of the systematic errors associated with other techniques. We also show that BBO would be an exceptionally powerful gravitational lensing mission, and we briefly discuss other astronomical uses of BBO.
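The core of the standard-siren idea, stripped to its simplest case, is that a GW-derived luminosity distance paired with an EM-derived redshift constrains the distance-redshift relation. A minimal sketch, assuming only the low-redshift Hubble law d_L ≈ cz/H_0 (the paper's actual analysis fits the full dark-energy parameter set, marginalizes over lensing, etc.):

```python
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s

def fit_hubble_constant(z, d_L):
    # Least-squares fit of the low-z Hubble law d_L = (c / H_0) * z.
    # z: redshifts from EM follow-up; d_L: GW luminosity distances in Mpc.
    slope = np.sum(z * d_L) / np.sum(z * z)  # best-fit c/H_0
    return C_KM_S / slope                    # H_0 in km/s/Mpc
```

With ~300,000 sirens, even percent-level per-event distance errors average down dramatically, which is what drives the 0.1% H_0 forecast quoted above.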
Data analysis for the proposed Laser Interferometer Space Antenna (LISA) will be complicated by the huge number of sources in the LISA band. Throughout much of the band, galactic white dwarf binaries (GWDBs) are sufficiently dense in frequency space that it will be impossible to resolve most of them, and confusion noise from the unresolved Galactic binaries will dominate over instrumental noise in determining LISA's sensitivity to other sources in that band. Confusion noise from unresolved extreme-mass-ratio inspirals (EMRIs) could also contribute significantly to LISA's total noise curve. To date, estimates of the effect of LISA's confusion noise on matched-filter searches and their detection thresholds have generally approximated the noise as Gaussian, based on the Central Limit Theorem. However, in matched-filter searches, the appropriate detection threshold for a given class of signals may be located rather far out on the tail of the signal-to-noise probability distribution, where a priori it is unclear whether the Gaussian approximation is reliable. Using the Edgeworth expansion and the theory of large deviations, we investigate the probability distribution of the usual matched-filter detection statistic, far out on the tail of the distribution. We apply these tools to four somewhat idealized…
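To make the Edgeworth idea concrete: the lowest-order Edgeworth correction modifies the Gaussian survival function (tail probability) by a term proportional to the distribution's skewness, and the correction grows in relative importance far out on the tail. The sketch below shows only that first-order term for a standardized statistic; it is a minimal illustration, not the paper's full expansion or its large-deviations treatment.

```python
import math

def gaussian_tail(x):
    # Survival function Q(x) = P(X > x) for a standard normal variable.
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def edgeworth_tail(x, skew):
    # First-order Edgeworth-corrected tail probability for a standardized
    # statistic with the given skewness:
    #   P(X > x) ~ Q(x) + phi(x) * (skew / 6) * (x^2 - 1),
    # where phi is the standard normal density and (x^2 - 1) is the
    # second Hermite polynomial He_2(x).
    phi = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
    return gaussian_tail(x) + phi * (skew / 6.0) * (x * x - 1.0)
```

For zero skewness the correction vanishes and the Gaussian answer is recovered; for positive skewness the far tail is heavier than Gaussian, which is exactly the regime where a Gaussian-based detection threshold can be misleading.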
Gravitational waves from the inspiral and coalescence of supermassive black-hole (SMBH) binaries with masses ~10^6 Msun are likely to be among the strongest sources for the Laser Interferometer Space Antenna (LISA). We describe a three-stage data-analysis pipeline designed to search for and measure the parameters of SMBH binaries in LISA data. The first stage uses a time-frequency track-search method to search for inspiral signals and provide a coarse estimate of the black-hole masses m_1, m_2 and of the coalescence time of the binary t_c. The second stage uses a sequence of matched-filter template banks, seeded by the first stage, to improve the measurement accuracy of the masses and coalescence time. Finally, a Markov Chain Monte Carlo search is used to estimate all nine physical parameters of the binary. Using results from the second stage substantially shortens the Markov Chain burn-in time and allows us to determine the number of SMBH-binary signals in the data before starting parameter estimation. We demonstrate our analysis pipeline using simulated data from the first LISA Mock Data Challenge. We discuss our plan for improving this pipeline and the challenges that will be faced in real LISA data analysis.
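The third stage above relies on Markov Chain Monte Carlo sampling of the posterior over the binary's parameters. A minimal random-walk Metropolis-Hastings sketch of that idea is shown below; the function names and the isotropic Gaussian proposal are illustrative assumptions, and the pipeline's actual sampler for the nine-parameter SMBH posterior is considerably more sophisticated.

```python
import numpy as np

def metropolis(log_post, theta0, n_steps, step_size, rng):
    # Random-walk Metropolis-Hastings: propose a Gaussian step, accept with
    # probability min(1, posterior_ratio). Starting the chain near the
    # stage-2 seed is what shortens burn-in in the pipeline described above.
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_steps):
        proposal = theta + step_size * rng.standard_normal(theta.shape)
        lp_prop = log_post(proposal)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = proposal, lp_prop
        chain.append(theta.copy())
    return np.array(chain)
```

Discarding the early burn-in samples and summarizing the remainder of the chain gives the parameter estimates and credible intervals.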