In time domain astronomy, recurrent transients present a special problem: how to infer total populations from limited observations. Monitoring observations may give a biased view of the underlying population due to limitations on observing time, visibility and instrumental sensitivity. A similar problem exists in the life sciences, where animal populations (such as migratory birds) or disease prevalence must be estimated from sparse and incomplete data. The class of methods termed Capture-Recapture is used to reconstruct population estimates from time-series records of encounters with the study population. This paper investigates the performance of Capture-Recapture methods in astronomy via a series of numerical simulations. The Blackbirds code simulates monitoring of populations of transients, in this case accreting binary stars (a neutron star or black hole accreting from a stellar companion), under a range of observing strategies. We first generate realistic light curves for populations of binaries with contrasting orbital period distributions. These models are then randomly sampled at observing cadences typical of existing and planned monitoring surveys. The classical capture-recapture methods (the Lincoln-Petersen and Schnabel estimators and related techniques) are compared with newer methods implemented in the Rcapture package. A general exponential model based on the radioactive decay law is introduced and demonstrated to recover, at 95% confidence, the underlying population abundance and duty cycle using a fraction (10-50%) of the observing visits required to discover all the sources in the simulation. Capture-Recapture is a promising addition to the toolbox of time domain astronomy, and methods implemented in R by the biostatistics community can be readily called from within Python.
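As an illustration of the simplest of these estimators, the sketch below applies the Chapman-corrected Lincoln-Petersen formula to a toy monitoring scenario in which each source is detectable during a given visit with a fixed duty cycle. The population size, duty cycle and visit() helper are illustrative assumptions for this sketch, not the Blackbirds simulation itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population: 200 transient sources, each detectable ("on")
# during any single visit with probability equal to its duty cycle.
N_TRUE, DUTY_CYCLE = 200, 0.3

def visit():
    """Return the set of source indices detected in one observing visit."""
    return set(np.flatnonzero(rng.random(N_TRUE) < DUTY_CYCLE))

first, second = visit(), visit()      # two independent "capture" epochs
n1, n2 = len(first), len(second)      # sources detected in each epoch
m2 = len(first & second)              # sources detected in both ("recaptured")

# Chapman's bias-corrected form of the Lincoln-Petersen estimator.
n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
print(f"seen {n1} and {n2}, recaptured {m2}: estimated population = {n_hat:.0f}")
```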
Celestial objects exhibit a wide range of variability in brightness at different wavebands. Surprisingly, one of the most common methods for characterizing time series in statistics -- parametric autoregressive modeling -- is rarely used to interpret astronomical light curves. We review standard ARMA, ARIMA and ARFIMA (autoregressive fractionally integrated moving average) models that treat short-memory autocorrelation, long-memory $1/f^\alpha$ `red noise', and nonstationary trends. Although these models are designed for evenly spaced time series, moderately irregular cadences can be treated as evenly spaced time series with missing data. Fitting algorithms are efficient and software implementations are widely available. We apply ARIMA models to light curves of four variable stars, discussing their effectiveness for different temporal characteristics. A variety of extensions to ARIMA are outlined, with emphasis on recently developed continuous-time models such as CARMA and CARFIMA, designed for irregularly spaced time series. Strengths and weaknesses of ARIMA-type modeling for astronomical data analysis and astrophysical insights are reviewed.
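As a minimal sketch of this missing-data treatment, the fragment below fits an ARIMA(1,0,0) model to a simulated AR(1) light curve in which roughly 20% of epochs have been masked. The use of the statsmodels package, and the toy AR(1) light curve, are assumptions for illustration and are not taken from the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Toy light curve: an AR(1) process observed at 200 evenly spaced epochs.
n = 200
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal(scale=0.5)

# Mimic a moderately irregular cadence by masking ~20% of the epochs;
# the state-space (Kalman filter) fit treats the NaNs as missing data.
y[rng.random(n) < 0.2] = np.nan

fit = ARIMA(y, order=(1, 0, 0)).fit()   # AR(1), no differencing, no MA term
print(fit.params)                        # recovered AR coefficient and noise variance
```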
Estimation of population size using incomplete lists (also called the capture-recapture problem) has a long history across many biological and social sciences. For example, human rights and other groups often construct partial and overlapping lists of victims of armed conflicts, with the hope of using this information to estimate the total number of victims. Earlier statistical methods for this setup either use potentially restrictive parametric assumptions, or else rely on typically suboptimal plug-in-type nonparametric estimators; however, both approaches can lead to substantial bias, the former via model misspecification and the latter via smoothing. Under an identifying assumption that two lists are conditionally independent given measured covariate information, we make several contributions. First, we derive the nonparametric efficiency bound for estimating the capture probability, which indicates the best possible performance of any estimator, and sheds light on the statistical limits of capture-recapture methods. Then we present a new estimator, and study its finite-sample properties, showing that it has a double robustness property new to capture-recapture, and that it is near-optimal in a non-asymptotic sense, under relatively mild nonparametric conditions. Next, we give a method for constructing confidence intervals for total population size from generic capture probability estimators, and prove non-asymptotic near-validity. Finally, we study our methods in simulations, and apply them to estimate the number of killings and disappearances attributable to different groups in Peru during its internal armed conflict between 1980 and 2000.
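For orientation, the no-covariate two-list case reduces to a simple identity; the sketch below is only an illustration of that baseline, assuming independent capture with probabilities $p_1$ and $p_2$, and is not the doubly robust estimator developed in the paper.

```latex
% Minimal two-list sketch without covariates: if the lists capture each
% individual independently with probabilities $p_1$ and $p_2$, then
\begin{align}
  \psi    &= 1 - (1 - p_1)(1 - p_2),
            && \text{capture probability (appearing on at least one list),} \\
  \hat{N} &= \frac{n_{\mathrm{obs}}}{\hat{\psi}},
            && \text{plug-in estimate of the total population size,}
\end{align}
% where $n_{\mathrm{obs}}$ is the number of distinct individuals observed.
% With covariates, $p_1$ and $p_2$ become conditional probabilities given $X$,
% averaged over the covariate distribution of the observed population.
```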
NASA's Swift satellite has completed ten years of amazing discoveries in time domain astronomy. Its primary mission is to chase gamma-ray bursts (GRBs), but due to its scheduling flexibility it has subsequently become a prime discovery machine for new types of behavior. The list of major discoveries in GRBs and other transients includes the long-lived X-ray afterglows and flares from GRBs, the first accurate localization of short GRBs, the discovery of GRBs at high redshift (z>8), supernova shock break-out from a SN Ib, a jetted tidal disruption event, an ultra-long class of GRBs, high-energy emission from flare stars, novae and supernovae with unusual characteristics, magnetars with glitches in their spin periods, and a short GRB with evidence of an accompanying kilonova. Swift has developed a dynamic synergism with ground-based observatories. In a few years gravitational wave observatories will come on-line and provide exciting new transient sources for Swift to study.
White noise methods are a powerful tool for characterizing the computation performed by neural systems. These methods allow one to identify the feature or features that a neural system extracts from a complex input, and to determine how these features are combined to drive the system's spiking response. These methods have also been applied to characterize the input/output relations of single neurons driven by synaptic inputs, as simulated by direct current injection. To interpret the results of white noise analysis of single neurons, we would like to understand how the obtained feature space of a single neuron maps onto the biophysical properties of the membrane, in particular the dynamics of ion channels. Here, through analysis of a simple dynamical model neuron, we draw explicit connections between the output of a white noise analysis and the underlying dynamical system. We find that under certain assumptions, the form of the relevant features is well defined by the parameters of the dynamical system. Further, we show that under some conditions, the feature space is spanned by the spike-triggered average and its successive time derivatives.
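A minimal sketch of the spike-triggered average computation follows; the Gaussian stimulus, the toy thresholding "neuron" and the window length are illustrative assumptions rather than the dynamical model analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical white-noise experiment: a Gaussian "injected current" stimulus
# and a toy spike train made by thresholding an exponentially filtered copy.
n_samples, window = 100_000, 50
stimulus = rng.normal(size=n_samples)
kernel = np.exp(-np.arange(30) / 10.0)
filtered = np.convolve(stimulus, kernel, mode="full")[:n_samples]  # causal filter
spike_times = np.flatnonzero(filtered > 3.0 * filtered.std())
spike_times = spike_times[spike_times >= window]

# Spike-triggered average: the mean stimulus segment preceding each spike.
sta = np.mean([stimulus[t - window:t] for t in spike_times], axis=0)
print(f"{len(spike_times)} spikes; STA over the preceding {window} samples")
```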
Large-scale surveys such as PTF, CRTS and Pan-STARRS-1, which have emerged within the past 5 years or so, employ digital databases and modern analysis tools to advance research into Time Domain Astronomy (TDA). Preparations are underway for LSST which, in another 6 years, will usher in the second decade of modern TDA. By that time the Digital Access to a Sky Century @ Harvard (DASCH) project will have made available to the community the full-sky historical TDA database and digitized images for a century (1890--1990) of coverage. We describe the current DASCH development and some initial results, and outline plans for the production scanning phase and data distribution, which is to begin in 2012. That will open a 100-year window into temporal astrophysics, revealing rare transients and (especially) astrophysical phenomena that vary on time-scales of a decade. It will also provide context and archival comparisons for the deeper modern surveys.