A physical field has an infinite number of degrees of freedom, since it has a field value at each location of a continuous space. It is therefore impossible to know a field from finite measurements alone, and prior information on the field is essential for field inference. An information theory for fields is needed to join the measurements and the prior information into probabilistic statements on field configurations. Such an information field theory (IFT) is built upon the language of mathematical physics, in particular on field theory and statistical mechanics. IFT permits the mathematical derivation of optimal imaging algorithms, data analysis methods, and even computer simulation schemes. The application of IFT algorithms to astronomical datasets provides high-fidelity images of the Universe and facilitates the search for subtle statistical signals from the Big Bang. The concepts of IFT might even pave the way to novel computer simulations that are aware of their own uncertainties.
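As a concrete illustration of how measurement and prior combine, the sketch below applies the classic Wiener filter, the simplest field reconstruction of this kind, to a 1D periodic field. The grid size, prior power spectrum, and noise level are illustrative choices, not taken from any specific IFT application.

```python
import numpy as np

# Minimal 1D Wiener-filter sketch: join noisy data with a Gaussian prior
# power spectrum to obtain the posterior mean field.
rng = np.random.default_rng(42)
n = 1024
k = np.fft.fftfreq(n)
Pk = 0.01 / (1e-4 + k**2)              # assumed (illustrative) prior power spectrum

# Draw a signal realization from the prior: color white noise with sqrt(P).
signal = np.fft.ifft(np.fft.fft(rng.normal(size=n)) * np.sqrt(Pk)).real

sigma_n = 1.0                          # noise standard deviation per pixel
data = signal + rng.normal(scale=sigma_n, size=n)

# Wiener filter (posterior mean for Gaussian signal and noise, unit response):
#   m_k = P(k) / (P(k) + sigma_n^2) * d_k
# With the construction above, E|s_k|^2 = n P(k) and E|n_k|^2 = n sigma_n^2,
# so the factors of n cancel in the filter.
m = np.fft.ifft(Pk / (Pk + sigma_n**2) * np.fft.fft(data)).real

print(f"rms error of raw data:       {np.std(data - signal):.3f}")
print(f"rms error of reconstruction: {np.std(m - signal):.3f}")
```

The filter passes modes where the prior power exceeds the noise power and suppresses the rest, which is exactly the "join measurement and prior" operation described above, in its simplest Gaussian form.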
In this paper we present the coordinates of 67 patches of sky, each 55' x 55', which have the rare combination of both high stellar surface density (>0.5 arcmin^{-2} with 13<R<16.5 mag) and low extinction (E(B-V)<0.1). These fields are ideal for adaptive-optics-based follow-up of extragalactic targets. One region of sky, situated near Baade's Window, contains most of the patches we have identified. Our optimal field, centered at RA: 7h24m3s, Dec: -1deg27'15'', has the additional advantage of being accessible from both hemispheres. We propose a figure of merit for quantifying real-world adaptive optics performance and use it to analyze the performance of multi-conjugate adaptive optics in these fields. We also compare our results to those that would be obtained in existing deep fields. In some cases, adaptive optics observations undertaken in the fields given in this paper would be orders of magnitude more efficient than equivalent observations undertaken in existing deep fields.
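The selection described above reduces to a simple cut on two catalog quantities. The sketch below applies that cut to a handful of hypothetical candidate fields; the coordinates, densities, and extinction values are illustrative stand-ins, not entries from the paper's actual catalog.

```python
import numpy as np

# Hypothetical candidate fields: (RA [deg], Dec [deg],
# stellar density [arcmin^-2, 13 < R < 16.5], E(B-V) [mag]).
fields = np.array([
    (111.01, -1.45, 0.62, 0.08),   # near the optimal field quoted above
    (270.90, -30.0, 1.10, 0.45),   # star-rich but heavily extincted
    (150.12,  2.21, 0.05, 0.02),   # clean but star-poor (deep-field-like)
], dtype=[("ra", "f8"), ("dec", "f8"), ("density", "f8"), ("ebv", "f8")])

# The cut from the abstract: dense enough for AO guide stars, low extinction.
good = (fields["density"] > 0.5) & (fields["ebv"] < 0.1)
for f in fields[good]:
    print(f"RA={f['ra']:.2f} deg  Dec={f['dec']:.2f} deg  "
          f"density={f['density']:.2f}/arcmin^2  E(B-V)={f['ebv']:.2f}")
```

The paper's figure of merit for real-world AO performance is not reproduced here, since its definition is not given in this abstract; the cut above is only the field pre-selection step.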
We construct the spin flaglet transform, a wavelet transform to analyze spin signals in three dimensions. Spin flaglets can probe signal content localized simultaneously in space and frequency and, moreover, are separable so that their angular and radial properties can be controlled independently. They are particularly suited to the analysis of cosmological observations such as the weak gravitational lensing of galaxies. Such observations have a unique 3D geometrical setting since they are natively made on the sky, have spin angular symmetries, and are extended in the radial direction by additional distance or redshift information. Flaglets are constructed in the harmonic space defined by the Fourier-Laguerre transform, previously defined for scalar functions and extended here to signals with spin symmetries. Thanks to various sampling theorems, both the Fourier-Laguerre and flaglet transforms are theoretically exact when applied to bandlimited signals. In other words, in numerical computations the only loss of information is due to the finite representation of floating point numbers. We develop a 3D framework relating the weak lensing power spectrum to covariances of flaglet coefficients. We suggest that the resulting novel flaglet weak lensing estimator offers a powerful alternative to common 2D and 3D approaches for accurately capturing cosmological information. While standard weak lensing analyses focus on either real or harmonic space representations (i.e., correlation functions or Fourier-Bessel power spectra, respectively), a wavelet approach inherits the advantages of both techniques, where both complicated sky coverage and uncertainties associated with the physical modeling of small scales can be handled effectively. Our codes to compute the Fourier-Laguerre and flaglet transforms are made publicly available.
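The exactness claim rests on an admissibility condition: the squared harmonic-space windows must tile the (ell, p) plane to unity, so that synthesis inverts analysis for bandlimited signals. The sketch below builds a simple separable dyadic tiling and checks that condition numerically; it illustrates the idea only and is not the published flaglet kernel construction.

```python
import numpy as np

def smooth_step(x):
    # C-infinity ramp: 0 for x <= 0, 1 for x >= 1.
    x = np.clip(x, 0.0, 1.0)
    f = lambda t: np.where(t > 0, np.exp(-1.0 / np.where(t > 0, t, 1.0)), 0.0)
    fx, f1x = f(x), f(1.0 - x)
    return fx / (fx + f1x)

def squared_windows(nmax, j0=1):
    """Dyadic squared windows (plus a scaling function) that sum to one."""
    n = np.arange(nmax)
    cum = lambda j: smooth_step(np.log2(np.maximum(n, 1)) - j)
    jmax = int(np.ceil(np.log2(nmax)))
    wins = [1.0 - cum(j0)]                              # scaling function
    wins += [cum(j) - cum(j + 1) for j in range(j0, jmax)]
    return np.array(wins)

ang = squared_windows(256)   # angular multipoles ell
rad = squared_windows(64)    # radial orders p

# Separable tiling: sum over scales of kappa_j^2(ell) * kappa_q^2(p) == 1,
# since each factor telescopes to one independently.
tile = np.einsum('ja,qb->ab', ang, rad)
print("max deviation from exact tiling:", np.abs(tile - 1.0).max())
```

The separability noted in the abstract appears here directly: the angular and radial windows are built independently, and their product inherits exactness from each factor.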
In this Thesis, several results in quantum information theory are collected, most of which use entropy as the main mathematical tool.

* While the von Neumann entropy is a direct generalization of the Shannon entropy to density matrices, it behaves differently. A long-standing open question is whether there are quantum analogues of unconstrained non-Shannon type inequalities. Here, a new constrained non-von-Neumann type inequality is proven, a step towards an unconstrained inequality conjectured by Linden and Winter.

* IID quantum state merging can be optimally achieved using the decoupling technique. The one-shot results by Berta et al. and Anshu et al., however, had to bring in additional mathematical machinery. We introduce a natural generalized decoupling paradigm, catalytic decoupling, that can reproduce the aforementioned results when used analogously to the application of standard decoupling in the asymptotic case.

* Port-based teleportation, a variant of the standard quantum teleportation protocol, cannot be implemented perfectly. We prove several lower bounds on the number of output ports N necessary to achieve port-based teleportation for a given error and input dimension, showing that N diverges uniformly in the dimension of the teleported quantum system as the error vanishes. As a byproduct, a new lower bound on the size of the program register of an approximate universal programmable quantum processor is derived.

* In the last part, we give a new definition of information-theoretic quantum non-malleability, strengthening the previous definition by Ambainis et al. We show that quantum non-malleability implies secrecy, analogous to quantum authentication. Furthermore, non-malleable encryption schemes can be used as a primitive to build authenticating encryption schemes. We also show that the strong notion of authentication recently proposed by Garg et al. can be fulfilled using 2-designs.
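As a small numerical companion to the entropy theme of this summary, the sketch below computes the von Neumann entropy of reduced states and spot-checks strong subadditivity, S(AB) + S(BC) >= S(ABC) + S(B), on a random three-qubit pure state. It illustrates the standard definitions only, not any result of the Thesis.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log2 rho], from the eigenvalues (0 log 0 := 0)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

def partial_trace(rho, keep, dims):
    """Reduced density matrix on the subsystems listed in `keep`."""
    n = len(dims)
    t = rho.reshape(dims + dims)
    for i in sorted(set(range(n)) - set(keep), reverse=True):
        m = t.ndim // 2                  # subsystems still present
        t = np.trace(t, axis1=i, axis2=i + m)
    d = int(np.prod([dims[i] for i in keep]))
    return t.reshape(d, d)

# Random three-qubit pure state rho = |psi><psi| on subsystems A, B, C.
rng = np.random.default_rng(7)
dims = (2, 2, 2)
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
psi /= np.linalg.norm(psi)
rho = np.outer(psi, psi.conj())

S = lambda keep: von_neumann_entropy(partial_trace(rho, keep, dims))
lhs = S([0, 1]) + S([1, 2])              # S(AB) + S(BC)
rhs = S([0, 1, 2]) + S([1])              # S(ABC) + S(B)
print(f"S(AB)+S(BC) = {lhs:.4f} >= S(ABC)+S(B) = {rhs:.4f}: {lhs >= rhs - 1e-9}")
```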
The initial conditions of cosmological simulations are commonly drawn from a Gaussian ensemble. The limited number of modes inside a simulation volume gives rise to statistical fluctuations known as \textit{sample variance}, limiting the accuracy of simulation predictions. Fixed fields offer an alternative initialization strategy; they have the same power spectrum as standard Gaussian fields but no intrinsic amplitude scatter at linear order. Paired fixed fields consist of two fixed fields with opposite phases, which cancel the phase correlations that otherwise induce second-order scatter in the non-linear power spectrum. We study the statistical properties of these fields for 19 different quantities at different redshifts through a large set of 600 N-body and 506 state-of-the-art magneto-hydrodynamic simulations covering a wide range of scales, masses, and spatial resolutions. We find that paired fixed simulations do not introduce a bias in any of the examined quantities. We quantify the statistical improvement brought by these simulations over standard ones for different power spectra, such as those of matter, halos, CDM, gas, stars, black holes, and magnetic fields, finding that they can reduce the variance by factors as large as $10^6$. We quantify the improvement achieved by fixing and by pairing separately, showing that sample variance in some quantities can be highly suppressed by pairing after fixing. Paired fixed simulations do not change the scatter in quantities such as the probability distribution function of the matter density, or the halo, void, or stellar mass functions. We argue that procedures aiming at reducing the sample variance of those quantities are unlikely to work. Our results show that paired fixed simulations affect neither the mean relations nor the scatter of galaxy properties, and suggest that the information embedded in 1-pt statistics is highly complementary to that in clustering.
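The fixing and pairing operations themselves are simple to state in Fourier space: pin every mode amplitude to sqrt(P(k)) instead of drawing it from a Rayleigh distribution, and, for the partner field, shift every phase by pi. A minimal 1D sketch (with an illustrative power spectrum, not a cosmological one):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 512
k = np.fft.rfftfreq(n)
Pk = np.where(k > 0, (k + 1e-3) ** -1.0, 0.0)   # illustrative power spectrum

theta = rng.uniform(0, 2 * np.pi, size=k.size)  # random phases, shared below
amp_gauss = rng.rayleigh(scale=np.sqrt(Pk / 2)) # Gaussian field: random amplitudes
amp_fixed = np.sqrt(Pk)                         # fixed field: amplitudes pinned

delta_gauss   = np.fft.irfft(amp_gauss * np.exp(1j * theta), n=n)
delta_fixed_a = np.fft.irfft(amp_fixed * np.exp(1j * theta), n=n)
delta_fixed_b = np.fft.irfft(amp_fixed * np.exp(1j * (theta + np.pi)), n=n)

# The pair is an exact sign flip of the linear field ...
assert np.allclose(delta_fixed_a, -delta_fixed_b)
# ... and the fixed field's measured power has no scatter by construction:
print(np.abs(np.fft.rfft(delta_fixed_a))[1:5] ** 2 / Pk[1:5])   # all ones
```

At linear order the pair average cancels terms odd in the field, which is why pairing removes the second-order scatter that fixing alone leaves in the non-linear power spectrum.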
The emergence of a complex, large-scale organisation of cosmic matter into the Cosmic Web is a beautiful exemplification of how complexity can be produced by simple initial conditions and simple physical laws. In the epoch of Big Data in astrophysics, connecting the stunning variety of multi-messenger observations to the complex interplay of fundamental physical processes is an open challenge. In this contribution, I discuss a few relevant applications of Information Theory to the task of objectively measuring the complexity of modern numerical simulations of the Universe. When applied to cosmological simulations, complexity analysis makes it possible to measure the total information necessary to model the cosmic web. It also allows us to monitor which physical processes are most responsible for the emergence of complex dynamical behaviour across cosmic epochs and environments, and possibly to improve mesh refinement strategies in the future.
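One way to make "total information" concrete is a compression-based proxy: quantize a simulated field and count the bits a lossless compressor needs per cell. The sketch below is a hedged stand-in for the entropy-style estimators discussed here, not the exact measure used in this contribution; it shows that spatial correlations lower the description cost relative to a structureless field.

```python
import numpy as np, zlib

rng = np.random.default_rng(1)
n = 256
white = rng.normal(size=(n, n))                 # structureless reference field

# Spatially correlated field: low-pass filter the same white noise.
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
corr = np.fft.ifft2(np.fft.fft2(white) * np.exp(-(kx**2 + ky**2) / 0.01)).real

def bits_per_cell(field, levels=64):
    """Quantize to `levels` equal-population bins, then compress losslessly."""
    edges = np.quantile(field, np.linspace(0, 1, levels + 1)[1:-1])
    q = np.digitize(field, edges).astype(np.uint8)
    return 8 * len(zlib.compress(q.tobytes(), 9)) / field.size

print(f"white noise:      {bits_per_cell(white):.2f} bits/cell")
print(f"correlated field: {bits_per_cell(corr):.2f} bits/cell")
```

Applied across snapshots of a simulation, a measure of this kind can track how the information needed to describe the cosmic web grows as structure forms, and which physical modules drive that growth.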