readPTU: a Python Library to Analyse Time Tagged Time Resolved Data

Publication date: 2019
Field: Physics
Language: English





readPTU is a Python package designed to analyze time-correlated single-photon counting data. The library promotes storage of the complete photon arrival-time information and gives full flexibility in post-processing the data for analysis. It supports the computation of time-resolved signals with external triggers, and second-order autocorrelation functions can be computed with multiple algorithms that offer different trade-offs between speed and accuracy. A thresholding algorithm for time post-selection is also available. The library has been designed with performance and extensibility in mind, so that future users can implement support for additional file extensions and algorithms without having to deal with low-level details. We demonstrate the performance of readPTU by analyzing the second-order autocorrelation function of the resonance fluorescence from a single quantum dot in a two-dimensional semiconductor.
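
To make the post-processing model concrete, the sketch below builds a second-order autocorrelation (g2) histogram directly from two channels of absolute photon arrival times. This is an illustrative NumPy implementation of the general technique, not readPTU's API; the function name, arguments, and the sliding-window search are choices made here for clarity, corresponding to the exact-but-slower end of the speed/accuracy trade-offs mentioned above.

    import numpy as np

    def g2_histogram(t1, t2, bin_width, max_tau):
        # Histogram of arrival-time differences (t2 - t1) within +/- max_tau.
        # t1, t2: sorted 1-D arrays of absolute arrival times (same units).
        # Returns bin centres and raw coincidence counts (unnormalised g2).
        edges = np.arange(-max_tau, max_tau + bin_width, bin_width)
        counts = np.zeros(len(edges) - 1, dtype=np.int64)
        lo = hi = 0
        for t in t1:
            # Slide a window over t2; both pointers only move forward, so the
            # whole loop costs O(len(t1) + len(t2) + number of coincidences).
            while lo < len(t2) and t2[lo] < t - max_tau:
                lo += 1
            hi = max(hi, lo)
            while hi < len(t2) and t2[hi] <= t + max_tau:
                hi += 1
            counts += np.histogram(t2[lo:hi] - t, bins=edges)[0]
        centres = 0.5 * (edges[:-1] + edges[1:])
        return centres, counts

Normalising the counts by the expected rate of accidental coincidences then yields g2(tau).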

Related Research

The MOST (Microvariability & Oscillations of STars) satellite obtains ultraprecise photometry from space with high sampling rates and duty cycles. Astronomical photometry or imaging missions in low Earth orbits, like MOST, are especially sensitive to scattered light from Earthshine, and all these missions share the need to extract target information from voluminous data cubes. These cubes consist of upwards of hundreds of thousands of two-dimensional CCD frames (or sub-rasters), each containing from hundreds to millions of pixels, in which the target information, superposed on background and instrumental effects, is contained in only a subset of pixels (Fabry images, defocussed images, mini-spectra). We describe a novel reduction technique for such data cubes: resolving linear correlations of target and background pixel intensities. This stepwise multiple linear regression removes only those target variations that are also detected in the background. The advantage of regression analysis over background subtraction is the appropriate scaling, which takes into account that the amount of contamination may differ from pixel to pixel. The multivariate solution for all pairs of target/background pixels is minimally invasive of the raw photometry while being very effective in reducing contamination due to, e.g., stray light. The technique is tested and demonstrated with both simulated oscillation signals and real MOST photometry.
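
A single-pass version of this regression idea is easy to write down. The sketch below (illustrative names, NumPy only, and not the authors' code) fits one target pixel's light curve against the background pixel intensities and removes the fitted component, which is exactly the per-pixel scaling argument made above; the stepwise selection of which background pixels to include is omitted for brevity.

    import numpy as np

    def decontaminate(target, background):
        # target:     (n_frames,) light curve of one target pixel
        # background: (n_frames, n_bg) light curves of background pixels
        # Fit target = c0 + background @ c by least squares and remove the
        # part of the target variation that is also seen in the background.
        X = np.column_stack([np.ones(len(target)), background])
        coef, *_ = np.linalg.lstsq(X, target, rcond=None)
        return target - X @ coef + target.mean()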
Since Bandt and Pompe's seminal work, permutation entropy has been used in several applications and is now an essential tool for time series analysis. Beyond becoming a popular and successful technique, permutation entropy inspired a framework for mapping time series into symbolic sequences that triggered the development of many other tools, including an approach for creating networks from time series known as ordinal networks. Despite this increasing popularity, the computational development of these methods is fragmented, and there have been no efforts focused on creating a unified software package. Here we present ordpy, a simple and open-source Python module that implements permutation entropy and several of the principal methods related to Bandt and Pompe's framework for analyzing time series and two-dimensional data. In particular, ordpy implements permutation entropy, Tsallis and Rényi permutation entropies, the complexity-entropy plane, complexity-entropy curves, missing ordinal patterns, ordinal networks, and missing ordinal transitions for one-dimensional (time series) and two-dimensional (images) data, as well as their multiscale generalizations. We review some theoretical aspects of these tools and illustrate the use of ordpy by replicating several literature results.
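
The core of Bandt and Pompe's method fits in a few lines. The following sketch is a from-scratch NumPy version, not ordpy's implementation: it maps a series to ordinal patterns of length d and returns the normalized Shannon entropy of their distribution.

    import numpy as np
    from math import factorial

    def permutation_entropy(series, d=3, tau=1, normalized=True):
        # Bandt-Pompe permutation entropy: d is the embedding dimension
        # (pattern length), tau the delay between the d samples of a pattern.
        x = np.asarray(series, dtype=float)
        n = len(x) - (d - 1) * tau
        # Build the n x d matrix of delay vectors and rank each row:
        # the argsort of a row is its ordinal pattern.
        idx = np.arange(n)[:, None] + tau * np.arange(d)[None, :]
        patterns = np.argsort(x[idx], axis=1)
        # Count how often each ordinal pattern occurs.
        _, counts = np.unique(patterns, axis=0, return_counts=True)
        p = counts / counts.sum()
        h = -np.sum(p * np.log2(p))
        return h / np.log2(factorial(d)) if normalized else h

ordpy wraps this computation, its Tsallis/Rényi and multiscale generalizations, and the ordinal-network constructions behind a single interface.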
LISA is the upcoming space-based gravitational-wave telescope. LISA Pathfinder, to be launched in the coming years, will prove and verify the detection principle of the fundamental Doppler link of LISA on flight hardware identical in design to that of LISA. LISA Pathfinder will collect a picture of all noise disturbances possibly affecting LISA, achieving the unprecedented purity of geodesic motion necessary for the detection of gravitational waves. The first steps of both missions will crucially depend on a very precise calibration of the key system parameters. Moreover, robust parameter estimation is of fundamental importance in the correct assessment of the residual force noise, an essential part of the data processing for LISA. In this paper we present a maximum-likelihood parameter estimation technique in the time domain devised for this calibration, and we demonstrate its performance on simulated data, validating it through Monte Carlo realizations of independent noise runs. We discuss its robustness to non-standard scenarios that may arise during the real-life mission, as well as its independence of the initial guess and its behaviour under non-Gaussianities. Furthermore, we apply the same technique to data produced in mission-like fashion during operational exercises with a realistic simulator provided by ESA.
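
As a toy illustration of time-domain maximum-likelihood calibration (not the LISA Pathfinder pipeline, which must also whiten strongly coloured instrument noise), the sketch below fits two parameters of a known response model to a simulated noisy run. For white Gaussian noise of known variance, maximising the likelihood reduces to minimising chi-squared; all names and the exponential model are assumptions made here for the example.

    import numpy as np
    from scipy.optimize import minimize

    def fit_parameters(times, data, model, theta0, sigma):
        # For white Gaussian noise of known standard deviation sigma, the
        # negative log-likelihood equals chi-squared / 2 up to a constant.
        def neg_log_like(theta):
            r = data - model(times, theta)
            return 0.5 * np.sum((r / sigma) ** 2)
        return minimize(neg_log_like, theta0, method="Nelder-Mead").x

    # Toy calibration: recover an amplitude and a decay time from noisy data.
    t = np.linspace(0.0, 10.0, 500)
    model = lambda t, th: th[0] * np.exp(-t / th[1])
    rng = np.random.default_rng(0)
    y = model(t, [2.0, 1.5]) + rng.normal(0.0, 0.1, t.size)
    print(fit_parameters(t, y, model, theta0=[1.0, 1.0], sigma=0.1))

Monte Carlo validation, as described above, would repeat this fit over many independent noise realizations and inspect the distribution of the recovered parameters.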
PyUnfold is a Python package for incorporating imperfections of the measurement process into a data analysis pipeline. In an ideal world, we would have access to the perfect detector: an apparatus that makes no error in measuring a desired quantity. In real life, however, detectors have finite resolutions, characteristic biases that cannot be eliminated, less-than-full detection efficiencies, and statistical and systematic uncertainties. By building a matrix that encodes a detector's smearing of the desired true quantity into the measured observable(s), a deconvolution can be performed that provides an estimate of the true variable. This deconvolution process is known as unfolding. The unfolding method implemented in PyUnfold accomplishes this deconvolution via an iterative procedure, providing results based on physical expectations of the desired quantity. Furthermore, it handles the tedious book-keeping for both statistical and systematic errors, producing precise final uncertainty estimates.
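
The iterative procedure at the heart of this kind of unfolding (often attributed to D'Agostini) can be written compactly. The sketch below is an illustrative NumPy version of the core update only, without the uncertainty propagation, regularization, and convergence tests that PyUnfold layers on top; the function and argument names are chosen here, not taken from the package.

    import numpy as np

    def unfold(observed, response, efficiency, n_iter=5):
        # observed:   (n_obs,) measured counts
        # response:   (n_obs, n_true) P(observed bin i | true bin j)
        # efficiency: (n_true,) detection efficiency per true bin
        n_true = response.shape[1]
        prior = np.full(n_true, observed.sum() / n_true)  # flat starting guess
        for _ in range(n_iter):
            folded = np.maximum(response @ prior, 1e-300)  # expected counts
            # Bayes' theorem: unfolding matrix M[j, i] = P(true j | obs i).
            M = (response * prior).T / folded
            prior = (M @ observed) / efficiency
        return prior

Each iteration folds the current estimate through the response matrix, compares it with the observed counts, and updates the estimate, so the result converges toward a physically motivated solution rather than an unregularized matrix inverse.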
The GERDA and Majorana experiments will search for neutrinoless double-beta decay of germanium-76 using isotopically enriched high-purity germanium detectors. Although the experiments differ in conceptual design, they share many aspects in common, and in particular will employ similar data analysis techniques. The collaborations are jointly developing a C++ software library, MGDO, which contains a set of data objects and interfaces to encapsulate, store and manage physical quantities of interest, such as waveforms and high-purity germanium detector geometries. These data objects define a common format for persistent data, whether it is generated by Monte Carlo simulations or an experimental apparatus, to reduce code duplication and to ease the exchange of information between detector systems. MGDO also includes general-purpose analysis tools that can be used for the processing of measured or simulated digital signals. The MGDO design is based on the Object-Oriented programming paradigm and is very flexible, allowing for easy extension and customization of the components. The tools provided by the MGDO libraries are used by both GERDA and Majorana.
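
MGDO itself is a C++ library, but the data-object pattern it describes is easy to picture. The Python sketch below is a loose analogue (all names invented here, not MGDO's API) of a self-describing waveform object plus a general-purpose transform that works identically on simulated and measured signals.

    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class Waveform:
        # One digitized signal plus the metadata needed to interpret it,
        # whether it came from Monte Carlo or from a real digitizer.
        samples: np.ndarray
        sampling_period_ns: float
        detector_id: str
        meta: dict = field(default_factory=dict)

    def moving_average(wf, width):
        # A general-purpose processing step: same interface for MC and data.
        kernel = np.ones(width) / width
        smoothed = np.convolve(wf.samples, kernel, mode="same")
        return Waveform(smoothed, wf.sampling_period_ns, wf.detector_id,
                        dict(wf.meta))

Because every transform consumes and returns the same data object, simulation output and detector output flow through one analysis chain, which is the code-sharing benefit the paragraph above describes.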