
Constraining duty cycles through a Bayesian technique

Published by Patrizia Romano
Publication date: 2014
Research field: Physics
Paper language: English
Author: P. Romano





The duty cycle (DC) of astrophysical sources is generally defined as the fraction of time during which the sources are active. However, DCs are generally not provided with statistical uncertainties, since the standard approach is to perform Monte Carlo bootstrap simulations to evaluate them, which can be quite time consuming for a large sample of sources. As a considerably less time-consuming alternative, we derived the theoretical expectation value for the DC and its error for sources whose state is one of two possible, mutually exclusive states, inactive (off) or flaring (on), based on a finite set of independent observational data points. Following a Bayesian approach, we derived the analytical expression for the posterior, the conjugate distribution adopted as prior, and the expectation value and variance. We applied our method to the specific case of the inactivity duty cycle (IDC) for supergiant fast X-ray transients. We also studied the IDC as a function of the number of observations in the sample and compared the results with the theoretical expectations, finding excellent agreement with our findings based on the standard bootstrap method. Our Bayesian treatment can be applied to all sets of independent observations of two-state sources, such as active galactic nuclei, X-ray binaries, etc. In addition to being far less time consuming than bootstrap methods, the additional strength of this approach becomes obvious when considering a well-populated class of sources ($N_{\rm src} \geq 50$), for which the prior can be fully characterized by fitting the distribution of the observed DCs for all sources in the class, so that, through the prior, one can further constrain the DC of a new source by exploiting the information acquired on the DC distribution derived from the other sources. [Abridged]
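As a minimal sketch of the kind of conjugate two-state model the abstract describes: with a binomial likelihood for the number of detections in one of the two states out of $N$ independent observations, the conjugate prior is a Beta distribution, and the posterior is again a Beta whose mean and standard deviation give the DC estimate and its uncertainty. The function name, the flat default prior, and the numbers below are illustrative, not taken from the paper.

```python
from scipy import stats

def duty_cycle_posterior(n_on, n_obs, alpha0=1.0, beta0=1.0):
    """Posterior on the duty cycle of a two-state (on/off) source.

    Assumes a binomial likelihood (n_on 'on' detections out of n_obs
    independent observations) with a conjugate Beta(alpha0, beta0) prior,
    giving a Beta(alpha0 + n_on, beta0 + n_obs - n_on) posterior.
    """
    post = stats.beta(alpha0 + n_on, beta0 + n_obs - n_on)
    return post.mean(), post.std(), post

# Illustrative example: 4 flaring detections in 120 pointings, flat prior.
dc_mean, dc_std, posterior = duty_cycle_posterior(4, 120)
print(f"DC = {dc_mean:.3f} +/- {dc_std:.3f}")

# For a well-populated class of sources, the hyperparameters (alpha0, beta0)
# could instead be fit to the observed DC distribution of the class and
# reused as an informative prior when constraining a new source.
```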


Read also

This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real Integral-Field Spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of two. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
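Purely as an illustration of the kind of consistency-within-errors test the summary above alludes to (this is not BaTMAn's actual merging statistic or API), two spatial elements could be considered mergeable when their signals agree within their combined uncertainties:

```python
import numpy as np

def consistent_within_errors(signal_a, err_a, signal_b, err_b, n_sigma=1.0):
    """Toy criterion: treat two spatial elements as carrying the same
    information if their signals differ by less than n_sigma combined errors.
    Illustrative only; BaTMAn's real statistic may differ."""
    return abs(signal_a - signal_b) < n_sigma * np.hypot(err_a, err_b)

# Example: two neighbouring elements with overlapping error bars merge;
# a clearly discrepant one does not.
print(consistent_within_errors(1.05, 0.10, 0.98, 0.12))  # True
print(consistent_within_errors(1.05, 0.10, 1.60, 0.12))  # False
```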
We performed simulations of a large number of so-called very faint X-ray transient sources in surveys of two Galactic globular clusters and the Galactic Center obtained with the X-ray telescope aboard the Neil Gehrels Swift Observatory. We calculated the ratio between the duty cycle we input into our simulations and the one we measured from them. We found that fluctuations in outburst duration and recurrence times affect our estimation of the duty cycle more than undetected outbursts do. This biases our measurements towards overestimating the simulated duty cycle of the sources. Moreover, we determined that compact surveys are necessary to detect short-duration outbursts, because such outbursts can fall into gaps between observations if those gaps are longer than the outburst duration. On the other hand, long surveys are necessary to detect sources with a low duty cycle, because the smallest duty cycle a survey can measure is given by the ratio between the shortest outburst duration and the total length of the survey. If one has a limited amount of observing time, these two effects compete, and a compromise is required that is set by the goals of the proposed survey. We also performed simulations with several artificial survey strategies in order to evaluate the optimal observing campaign, aimed both at detecting transients and at obtaining the most accurate estimates of the duty cycle. As expected, the best campaign would be a regular and dense monitoring that extends over a very long period. The closest real example of such a dataset is the monitoring of the Galactic Center.
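A back-of-the-envelope illustration of the survey-design trade-off described above (all numbers are made up, not taken from the paper): the smallest duty cycle a survey can measure is roughly the shortest detectable outburst duration divided by the total survey span, while outbursts shorter than the gaps between pointings risk being missed entirely.

```python
# Illustrative numbers only.
shortest_outburst_days = 1.0    # shortest outburst the survey can detect
survey_span_days = 1000.0       # total length of the monitoring campaign
typical_gap_days = 3.0          # typical spacing between pointings

min_measurable_dc = shortest_outburst_days / survey_span_days
print(f"Smallest measurable duty cycle: {min_measurable_dc:.1e}")   # 1.0e-03
print(f"Outbursts shorter than ~{typical_gap_days} days may fall in gaps")
```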
The lunar Askaryan technique, which involves searching for Askaryan radio pulses from particle cascades in the outer layers of the Moon, is a method for using the lunar surface as an extremely large detector of ultra-high-energy particles. The high time resolution required to detect these pulses, which have a duration of around a nanosecond, puts this technique in a regime quite different from other forms of radio astronomy, with a unique set of associated technical challenges which have been addressed in a series of experiments by various groups. Implementing the methods and techniques developed by these groups for detecting lunar Askaryan pulses will be important for a future experiment with the Square Kilometre Array (SKA), which is expected to have sufficient sensitivity to allow the first positive detection using this technique. Key issues include correction for ionospheric dispersion, beamforming, efficient triggering, and the exclusion of spurious events from radio-frequency interference. We review the progress in each of these areas, and consider the further progress expected for future application with the SKA.
We study the mass of quasar-hosting dark matter halos at z $\sim$ 6 and further constrain the fraction of dark matter halos hosting an active quasar, $f_{\rm on}$, and the quasar beaming angle, $i_{\rm max}$, using observations of CII lines in the literature. We make the assumptions that (1) more massive halos host brighter quasars, (2) a fraction of the halos host active quasars with a certain beaming angle, (3) cold gas in galaxies has rotational velocity $V_{\rm circ}=\alpha V_{\rm max}$, and (4) quasars point randomly on the sky. We find that for a specific choice of $\alpha \gtrsim 1$, the most likely solution has $f_{\rm on} < 0.01$, corresponding to a small duty cycle of quasar activity. However, if we marginalize over $\alpha$, for some choices of prior a second solution with $f_{\rm on}=1$ appears. Overall, our constraints are highly sensitive to $\alpha$ and hence inconclusive. Stronger constraints on $f_{\rm on}$ can be made if we better understand the dynamics of cold gas in these galaxies.
Binary neutron star mergers are rich laboratories for physics, accessible with ground-based interferometric gravitational-wave detectors such as Advanced LIGO and Advanced Virgo. If a neutron star remnant survives the merger, it can emit gravitational waves that might be detectable with current or next-generation detectors. The physics of the long-lived post-merger phase is not well understood and makes modelling difficult; in particular, the phase of the gravitational-wave signal is not well modelled. In this paper, we explore methods for using long-duration post-merger gravitational-wave signals to constrain the parameters and properties of the remnant. We develop a phase-agnostic likelihood model that uses only the spectral content for parameter estimation and demonstrate the calculation of a Bayesian upper limit in the absence of a signal. With the millisecond magnetar model, we show that, for an event like GW170817, the ellipticity of a long-lived remnant can be constrained to less than about 0.5 in the parameter space used.
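One simple way to build a phase-agnostic comparison of the kind described above (purely illustrative; this is not the likelihood model developed in the paper) is to compare only the magnitudes of the Fourier spectra of data and template, so that the statistic ignores the poorly modelled phase:

```python
import numpy as np

def spectral_residual(data, template):
    """Toy phase-agnostic figure of merit: sum of squared differences between
    the magnitude spectra of data and template; Fourier phases are discarded.
    Illustrative only; not the likelihood used in the paper."""
    return np.sum((np.abs(np.fft.rfft(data)) - np.abs(np.fft.rfft(template))) ** 2)

# The statistic is insensitive to a phase offset but penalizes a template
# with the wrong spectral content.
fs = 4096.0                                   # assumed sample rate in Hz
t = np.arange(0, 1.0, 1.0 / fs)
data = np.sin(2 * np.pi * 500 * t)
shifted = np.sin(2 * np.pi * 500 * t + 1.3)   # same spectrum, different phase
wrong = np.sin(2 * np.pi * 650 * t)           # different spectral content
print(spectral_residual(data, shifted) < spectral_residual(data, wrong))  # True
```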