While the radio detection of cosmic rays has advanced to a standard method in astroparticle physics, the radio detection of neutrinos is only just coming into full bloom. The successes of pilot arrays must be accompanied by the development of modern and flexible software tools to ensure rapid progress in reconstruction algorithms and data processing. We present NuRadioReco as such a modern, Python-based data analysis tool. It includes a suitable data structure, a database implementation of a time-dependent detector description, modern browser-based data visualization tools, and fully separated analysis modules. We describe the framework and examples, as well as new reconstruction algorithms to obtain the full three-dimensional electric field from distributed antennas, which is needed for a high-precision energy reconstruction of particle showers.
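As an illustration of the fully separated analysis modules mentioned above, the following is a minimal sketch of such a modular pipeline. The class and method names are purely illustrative and are not the actual NuRadioReco API, which should be consulted directly.

```python
# A minimal sketch of a fully separated module pipeline. Names are
# illustrative only; this is NOT the NuRadioReco API.

class BandpassFilter:
    """Example analysis module: each module only sees the event it is given."""
    def __init__(self, passband=(80e6, 500e6)):  # Hz, hypothetical default
        self.passband = passband

    def run(self, event):
        for trace in event["traces"]:
            pass  # filter the trace to self.passband in place
        return event

class EnergyReconstructor:
    """Example module that writes a derived quantity back into the event."""
    def run(self, event):
        event["energy"] = 0.0  # placeholder for a real estimator
        return event

def process(events, modules):
    """Chain independent modules over a stream of events."""
    for event in events:
        for module in modules:
            event = module.run(event)
        yield event

events = [{"traces": [], "energy": None}]
pipeline = [BandpassFilter(), EnergyReconstructor()]
results = list(process(events, pipeline))
```

Because modules share nothing but the event they operate on, they can be reordered, swapped, or tested in isolation, which is the design goal of the separated-module approach.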
Ultra-high-energy neutrinos ($E_\nu > 10^{16.5}\,$eV) are efficiently measured via radio signals following a neutrino interaction in ice. An antenna placed $\mathcal{O}(15\,\mathrm{m})$ below the ice surface will measure two signals for the vast majority of events (90% at $E_\nu = 10^{18}\,$eV): a direct pulse and a second, delayed pulse from a reflection off the ice surface. This allows for a unique identification of neutrinos against backgrounds arriving from above. Furthermore, the time delay between the direct and reflected signal (DnR) correlates with the distance to the neutrino interaction vertex, a crucial quantity for determining the neutrino energy. In a simulation study, we derive the relation between time delay and distance and study the corresponding experimental uncertainties in estimating neutrino energies. We find that the resulting contribution to the energy resolution is well below the natural limit set by the unknown inelasticity of the initial neutrino interaction. We present an in-situ measurement that proves the experimental feasibility of this technique. Continuous monitoring of the local snow accumulation in the vicinity of the transmit and receive antennas using this technique provides a precision of $\mathcal{O}(1\,\mathrm{mm})$ in surface elevation, which is much better than that needed to apply the DnR technique to neutrinos.
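To make the DnR geometry concrete, the following sketch computes the delay between the direct and surface-reflected pulse under the simplifying assumptions of a uniform ice index of refraction (n = 1.78) and straight-line propagation. The study itself must account for the depth-dependent refractive-index profile of the firn, so the numbers below are illustrative only.

```python
# A minimal sketch of the DnR timing geometry, assuming uniform ice
# (n = 1.78) and straight rays; illustrative only.
import numpy as np

C = 0.299792458  # speed of light in vacuum, m/ns
N_ICE = 1.78     # assumed uniform index of refraction

def dnr_delay(antenna_depth, vertex_depth, horizontal_dist):
    """Time delay (ns) between the surface-reflected and direct pulses.

    The reflected path is obtained by mirroring the antenna above the
    ice surface (z = 0); depths are positive numbers in metres.
    """
    direct = np.hypot(horizontal_dist, vertex_depth - antenna_depth)
    reflected = np.hypot(horizontal_dist, vertex_depth + antenna_depth)
    return (reflected - direct) * N_ICE / C

# The delay shrinks with vertex distance, which is what makes it a
# distance (and hence energy) estimator:
for d in (200.0, 500.0, 1000.0):
    print(f"distance {d:6.0f} m -> delay {dnr_delay(15.0, 400.0, d):6.1f} ns")
```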
Imaging by aperture synthesis from interferometric data is a well-known, but strongly ill-posed, inverse problem. Bright and faint radio sources can be imaged unambiguously using time and frequency integration to gather more Fourier samples of the sky. However, these imagers assume a steady sky, and the complexity of the problem increases when transient radio sources are also present in the data. Fortunately, in the context of transient imaging, the spatial and temporal information are separable, which enables the extension of an imager designed for a steady sky. We introduce independent spatial and temporal wavelet dictionaries to sparsely represent the transients in both the spatial and temporal domains. These dictionaries enter a new reconstruction method developed in the Compressed Sensing (CS) framework and based on a primal-dual splitting algorithm. According to preliminary tests in different noise regimes, this new time-agile (or 2D-1D) method appears efficient at detecting transients and reconstructing their temporal dependence.
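The separable 2D-1D representation can be sketched as follows: each frame of the time cube is expanded in a 2D spatial wavelet dictionary, and the time axis is independently expanded in a 1D dictionary. The wavelet choices and the toy data below are assumptions for illustration, and only the analysis (forward) operator is shown; the primal-dual reconstruction iterates between this operator and its adjoint.

```python
# A minimal sketch of the 2D-1D sparse representation: a 2D wavelet
# transform over each image frame plus an independent 1D wavelet
# transform along the time axis. Wavelet choices are illustrative.
import numpy as np
import pywt  # PyWavelets

def analysis_2d1d(cube, spatial_wavelet="db4", temporal_wavelet="haar", level=2):
    """Return the 2D (per-frame) and 1D (along-time) wavelet coefficients
    of a time cube of shape (n_times, ny, nx)."""
    spatial = [pywt.wavedec2(frame, spatial_wavelet, level=level)
               for frame in cube]
    temporal = pywt.wavedec(cube, temporal_wavelet, level=level, axis=0)
    return spatial, temporal

# A toy cube: a steady source plus a one-frame transient.
rng = np.random.default_rng(0)
cube = 0.01 * rng.standard_normal((16, 32, 32))
cube[:, 16, 16] += 1.0   # steady source, sparse in the 2D dictionary
cube[8, 8, 8] += 1.0     # transient, sparse in the 1D dictionary
spatial_coeffs, temporal_coeffs = analysis_2d1d(cube)
```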
Radio detection of air showers produced by ultra-high-energy cosmic rays is a cost-effective technique for the next generation of sparse arrays. The performance of this technique strongly depends on the environmental background, which has different constituents, namely anthropogenic radio-frequency interference, galactic synchrotron radiation, and others. These components have recognizable features, which can be exploited for background suppression. A powerful method for this is the application of convolutional neural networks with a specific architecture called an autoencoder. By suppressing unwanted signatures, the autoencoder keeps the signal-like ones. We have successfully developed and trained an autoencoder, which is now applied to the data from Tunka-Rex. We show the procedures of training and optimizing the network, including benchmarks of different architectures. Using the autoencoder, we improved the standard analysis of Tunka-Rex in order to lower the detection threshold. This enables the reconstruction of sub-threshold events with energies below 0.1 EeV with satisfactory angular and energy resolution.
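For concreteness, the following is a minimal sketch of a 1D convolutional autoencoder of the kind described above, written in PyTorch. The layer counts, kernel sizes, and trace length (1024 samples) are illustrative assumptions, not the architectures benchmarked in the paper.

```python
# A minimal sketch of a denoising 1D convolutional autoencoder for
# radio traces; hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class TraceAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress the trace, keeping signal-like features.
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, stride=2, padding=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, stride=2, padding=4), nn.ReLU(),
        )
        # Decoder: reconstruct a cleaned trace from the compressed code.
        self.decoder = nn.Sequential(
            nn.ConvTranspose1d(32, 16, kernel_size=9, stride=2,
                               padding=4, output_padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 1, kernel_size=9, stride=2,
                               padding=4, output_padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Training pairs would be (measured noisy trace, clean simulated signal);
# here we only check that shapes round-trip with random data.
model = TraceAutoencoder()
noisy = torch.randn(8, 1, 1024)  # batch of 8 single-channel traces
denoised = model(noisy)
assert denoised.shape == noisy.shape
```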
Starting in summer 2021, the Radio Neutrino Observatory in Greenland (RNO-G) will search for astrophysical neutrinos at energies >10 PeV by detecting the radio emission from particle showers in the ice around Summit Station, Greenland. We present an extensive simulation study that shows how RNO-G will be able to measure the energy of such particle cascades, which will in turn be used to estimate the energy of the incoming neutrino that caused them. The location of the neutrino interaction is determined using the differences in arrival times between channels and the electric field of the radio signal is reconstructed using a novel approach based on Information Field Theory. Based on these properties, the shower energy can be estimated. We show that this method can achieve an uncertainty of 13% on the logarithm of the shower energy after modest quality cuts and estimate how this can constrain the energy of the neutrino. The method presented in this paper is applicable to all similar radio neutrino detectors, such as the proposed radio array of IceCube-Gen2.
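The vertex localization step can be illustrated with a simple least-squares fit to inter-channel arrival-time differences, here under the simplifying assumptions of a uniform index of refraction and straight rays; the actual reconstruction accounts for the depth-dependent ice profile and bent ray paths, and the antenna positions below are made up for the example.

```python
# A minimal sketch of vertex localization from arrival-time differences,
# assuming uniform ice (n = 1.78) and straight rays; illustrative only.
import numpy as np
from scipy.optimize import least_squares

C = 0.299792458  # m/ns
N_ICE = 1.78

# Hypothetical antenna positions (m), one string plus two outriggers.
antennas = np.array([[0.0, 0.0, -40.0],
                     [0.0, 0.0, -60.0],
                     [0.0, 0.0, -80.0],
                     [20.0, 0.0, -60.0],
                     [0.0, 20.0, -60.0]])

def arrival_times(vertex):
    """Straight-ray propagation times (ns) from vertex to each antenna."""
    return np.linalg.norm(antennas - vertex, axis=1) * N_ICE / C

true_vertex = np.array([300.0, 100.0, -500.0])
# Only time *differences* relative to channel 0 are observable.
dt_meas = arrival_times(true_vertex)[1:] - arrival_times(true_vertex)[0]

def residuals(vertex):
    t = arrival_times(vertex)
    return (t[1:] - t[0]) - dt_meas

fit = least_squares(residuals, x0=np.array([100.0, 0.0, -200.0]))
print("reconstructed vertex:", fit.x)
```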
We have developed a flexible radio-frequency readout system suitable for a variety of superconducting detectors commonly used in millimeter and submillimeter astrophysics, including Kinetic Inductance Detectors (KIDs), Thermal KID bolometers (TKIDs), and Quantum Capacitance Detectors (QCDs). Our system avoids custom FPGA-based readouts and instead uses commercially available software-defined radio hardware for the ADC/DAC and a GPU to handle real-time signal processing. Because the system is written in common C++/CUDA, the range of different algorithms that can be quickly implemented makes it suitable for the readout of many other cryogenic detectors and for the testing of different, and possibly more effective, data acquisition schemes.
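As a flavor of the real-time processing involved, the following numpy sketch extracts the amplitude and phase of a single resonator probe tone by digital down-conversion and averaging; the production system performs this kind of operation in C++/CUDA for many tones in parallel, and the sample rate and tone frequency here are assumed values.

```python
# A minimal sketch of single-tone readout by digital down-conversion;
# the sample rate, tone frequency, and signal model are assumptions.
import numpy as np

FS = 512e6        # sample rate, Hz (assumed)
F_TONE = 101e6    # probe-tone frequency, Hz (assumed)
N = 1 << 16       # samples per processing block

t = np.arange(N) / FS
# Simulated ADC stream: the probe tone with a phase shift plus noise,
# standing in for the signal returned through a KID resonator.
adc = 0.8 * np.cos(2 * np.pi * F_TONE * t + 0.3) + 0.01 * np.random.randn(N)

# Digital down-conversion: mix with a complex local oscillator ...
lo = np.exp(-2j * np.pi * F_TONE * t)
baseband = adc * lo
# ... then low-pass by block averaging to one complex sample, whose
# magnitude and angle track the resonator's response.
iq = 2 * baseband.mean()
print(f"amplitude {abs(iq):.3f}, phase {np.angle(iq):.3f} rad")
```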