
Gargantuan chaotic gravitational three-body systems and their irreversibility to the Planck length

Added by Tjarda Boekholt
Publication date: 2020
Field: Physics
Language: English





Chaos is present in most stellar dynamical systems and manifests itself through the exponential growth of small perturbations. Exponential divergence drives time irreversibility and increases the entropy of the system. A numerical consequence is that integrations of the N-body problem unavoidably magnify truncation and rounding errors to macroscopic scales. Hitherto, a quantitative relation between chaos in stellar dynamical systems and the level of irreversibility has remained undetermined. In this work we study chaotic three-body systems that start at rest (in free fall), using the accurate and precise N-body code Brutus, which goes beyond standard double-precision arithmetic. We demonstrate that the fraction of irreversible solutions decreases as a power law with numerical accuracy. This can be derived from the distribution of amplification factors of small initial perturbations. Applying this result to systems consisting of three massive black holes with zero total angular momentum, we conclude that up to five percent of such triples would require an accuracy smaller than the Planck length in order to produce a time-reversible solution, thus rendering them fundamentally unpredictable.
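The irreversibility test behind this result can be pictured as: integrate forward, flip the velocities, integrate over the same interval again, and check whether the system returns to its initial state. Below is a minimal double-precision sketch of that idea, not the Brutus code itself; the leapfrog integrator, unit masses, G = 1, the time step and the particular triangle of initial positions are illustrative assumptions.

```python
# Minimal reversibility-test sketch (not Brutus): double-precision leapfrog
# for three unit-mass bodies released from rest, with G = 1.
import numpy as np

def accelerations(pos):
    """Pairwise Newtonian accelerations for unit-mass bodies (G = 1)."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        for j in range(len(pos)):
            if i != j:
                d = pos[j] - pos[i]
                acc[i] += d / np.linalg.norm(d) ** 3
    return acc

def leapfrog(pos, vel, dt, steps):
    """Fixed-step kick-drift-kick integration; time-reversible in exact arithmetic."""
    acc = accelerations(pos)
    for _ in range(steps):
        vel += 0.5 * dt * acc
        pos += dt * vel
        acc = accelerations(pos)
        vel += 0.5 * dt * acc
    return pos, vel

# Three bodies at rest (free fall) on an arbitrary scalene triangle.
pos0 = np.array([[1.0, 0.0, 0.0], [-0.5, 0.8, 0.0], [-0.5, -0.7, 0.0]])
vel0 = np.zeros_like(pos0)

pos, vel = leapfrog(pos0.copy(), vel0.copy(), dt=1e-4, steps=20000)
pos_back, _ = leapfrog(pos.copy(), -vel, dt=1e-4, steps=20000)  # flip velocities, go back

# The fixed-step scheme is reversible in exact arithmetic, so only rounding
# errors break the reversal; chaos decides how strongly they are amplified.
print("return error:", np.max(np.abs(pos_back - pos0)))
```

A solution counts as irreversible when this return error grows to the scale of the system itself; the power law reported in the paper describes how the fraction of such solutions shrinks as the arithmetic precision is increased.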



Related research

We present a new symplectic integrator designed for collisional gravitational $N$-body problems which makes use of Kepler solvers. The integrator is also reversible and conserves 9 integrals of motion of the $N$-body problem to machine precision. The integrator is second order, but the order can easily be increased by the method of Yoshida (1990). We use a fixed time step in all tests studied in this paper to ensure preservation of symplecticity. We study small-$N$ collisional problems and perform comparisons with typically used integrators. In particular, we find comparable or better performance when compared to the 4th-order Hermite method and much better performance than previously introduced adaptive-time-step symplectic integrators. We also find better performance compared to SAKURA, a non-symplectic, non-time-reversible integrator based on a different two-body decomposition of the $N$-body problem. The integrator is a promising tool in collisional gravitational dynamics.
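The order-raising step mentioned above can be illustrated directly. The sketch below is not the paper's Kepler-solver integrator; it shows the triple-composition method of Yoshida (1990) applied to a generic symmetric second-order map, using a kick-drift-kick step for a one-dimensional harmonic oscillator as a stand-in base integrator.

```python
# Sketch: raising a symmetric 2nd-order map to 4th order via Yoshida (1990).
# The base step below is a toy harmonic-oscillator KDK, not the paper's method.
def yoshida4(step2, state, dt):
    """Compose a symmetric 2nd-order step into a 4th-order one (triple jump)."""
    w1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
    w0 = 1.0 - 2.0 * w1          # equals -2**(1/3) / (2 - 2**(1/3))
    state = step2(state, w1 * dt)
    state = step2(state, w0 * dt)
    state = step2(state, w1 * dt)
    return state

def kdk(state, dt):
    """Kick-drift-kick for H = p**2/2 + q**2/2 (force = -q), state = (q, p)."""
    q, p = state
    p -= 0.5 * dt * q
    q += dt * p
    p -= 0.5 * dt * q
    return (q, p)

state = (1.0, 0.0)
for _ in range(1000):
    state = yoshida4(kdk, state, dt=0.01)
print(state)   # after t = 10, close to the exact orbit (cos 10, -sin 10)
```

Because the composition is symmetric, the resulting fourth-order map remains time-reversible, which is the property emphasised in the abstract.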
The gravitational softening length is one of the key parameters for properly setting up a cosmological $N$-body simulation. In this paper, we perform a large suite of high-resolution $N$-body simulations to revise the optimal softening scheme proposed by Power et al. (P03). Our finding is that the P03 optimal scheme works well but is overly conservative. Using softening lengths smaller than those of P03 achieves higher spatial resolution and numerically convergent results for both circular velocity and density profiles. However, using an excessively small softening length overpredicts the matter density in the innermost region of dark matter haloes. We empirically explore a better optimal softening scheme based on the P03 form and find that a small modification works well. This work will be useful for setting up cosmological simulations.
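For background on what the softening length does, the sketch below shows the standard Plummer-softened force law with G = 1; it is generic textbook material, not the specific P03-style criterion revised in the paper, and the positions used are made up.

```python
# Generic Plummer softening: a_i = sum_j m_j (r_j - r_i) / (|r_j - r_i|^2 + eps^2)^(3/2)
import numpy as np

def softened_acceleration(pos, masses, eps):
    """Accelerations with Plummer-softened Newtonian gravity (G = 1)."""
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        d = pos - pos[i]                        # separation vectors to all particles
        r2 = np.sum(d * d, axis=1) + eps ** 2   # softened squared distances
        r2[i] = np.inf                          # exclude self-interaction
        acc[i] = np.sum((masses / r2 ** 1.5)[:, None] * d, axis=0)
    return acc

pair = np.array([[0.0, 0.0, 0.0], [0.001, 0.0, 0.0]])            # a very close pair
print(softened_acceleration(pair, masses=np.ones(2), eps=0.01))  # force stays bounded
```

A larger eps suppresses artificial two-body scattering but blurs structure below roughly that scale, which is the trade-off behind the choice of an optimal softening length.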
We use the latest Planck constraints, in particular the constraints on derived parameters of the local universe (the Hubble constant and the age of the Universe), and compare them with local measurements of the same quantities. We propose a way to quantify whether cosmological parameter constraints from two different experiments are in tension. Our statistic, T, is an evidence ratio and can therefore be interpreted with the widely used Jeffreys scale. We find that, in the framework of the LCDM model, the Planck-inferred two-dimensional joint posterior distribution for the Hubble constant and the age of the Universe is in strong tension with the local measurements, the odds being ~ 1:50. We explore several possibilities for explaining this tension and examine the consequences both in terms of unknown errors and of deviations from the LCDM model. In some one-parameter extensions of the LCDM model the tension is reduced, whereas in others it is increased. In particular, small total neutrino masses are favored, and a total neutrino mass above 0.15 eV makes the tension highly significant (odds ~ 1:150). A consequence of accepting this interpretation of the tension is that the degenerate neutrino hierarchy is highly disfavoured by cosmological data and the direct hierarchy is slightly favored over the inverted one.
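For a concrete feel for tension metrics: the paper's T is a Bayesian evidence ratio read on the Jeffreys scale, which requires the full posteriors. The sketch below is a simpler, commonly used stand-in, a Gaussian difference-of-means tension between two two-dimensional constraints; it is not the paper's statistic, and all numbers are invented for illustration.

```python
# Simplified Gaussian "difference of means" tension between two 2D posteriors
# (e.g. Hubble constant and age of the Universe). NOT the paper's evidence-ratio T;
# all numbers below are hypothetical.
import numpy as np
from scipy import stats

def gaussian_tension(mu1, cov1, mu2, cov2):
    """Chi-square of the mean difference and its Gaussian-equivalent significance."""
    d = np.asarray(mu1, float) - np.asarray(mu2, float)
    chi2 = d @ np.linalg.inv(np.asarray(cov1) + np.asarray(cov2)) @ d
    p_value = stats.chi2.sf(chi2, df=len(d))
    n_sigma = stats.norm.isf(p_value / 2.0)
    return chi2, n_sigma

chi2, n_sigma = gaussian_tension(
    mu1=[67.4, 13.80], cov1=[[0.25, -0.01], [-0.01, 0.0004]],   # hypothetical "global" fit
    mu2=[73.0, 13.60], cov2=[[1.00,  0.00], [ 0.00, 0.0100]],   # hypothetical "local" fit
)
print(f"chi2 = {chi2:.1f}, roughly {n_sigma:.1f} sigma")
```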
The Loschmidt echo (LE) is a measure of the sensitivity of quantum mechanics to perturbations in the evolution operator. It is defined as the overlap of two wave functions evolved from the same initial state but with slightly different Hamiltonians. Thus, it also serves as a quantification of irreversibility in quantum mechanics. In this thesis the LE is studied in systems that have a classical counterpart with dynamical instability, that is, systems that are classically chaotic. An analytical treatment that makes use of the semiclassical approximation is presented. It is shown that, in a certain regime of the parameters, the LE decays exponentially. Furthermore, for strong enough perturbations, the decay rate is given by the Lyapunov exponent of the classical system. Some particularly interesting examples are given. The analytical results are supported by thorough numerical studies. In addition, some regimes not accessible to the theory are explored, showing that the LE and its Lyapunov regime present the same form of universality ascribed to classical chaos. In a sense, this is evidence that the LE is a robust temporal signature of chaos in the quantum realm. Finally, the relation between the LE and the quantum-to-classical transition is explored, in particular in connection with the theory of decoherence. Using two different approaches, a semiclassical approximation to Wigner functions and a master equation for the LE, it is shown that the decoherence rate and the decay rate of the LE are equal. The relationship between these quantities proves mutually beneficial, in terms of the broader resources of decoherence theory and of the possible experimental realization of the LE.
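A minimal numerical sketch of the quantity itself may help: the Loschmidt echo M(t) = |<psi0| e^{iHt} e^{-i(H + eps V)t} |psi0>|^2, computed here for a random-matrix Hamiltonian with hbar = 1. The matrix size, perturbation strength, and random seed are arbitrary choices for illustration, not parameters from the thesis.

```python
# Loschmidt echo sketch: overlap of one state evolved under H and under H + eps*V.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
dim, eps = 200, 0.05

def random_hermitian(n):
    """A dense random Hermitian matrix, standing in for a chaotic Hamiltonian."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (a + a.conj().T) / 2

H, V = random_hermitian(dim), random_hermitian(dim)
psi0 = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi0 /= np.linalg.norm(psi0)

for t in (0.5, 1.0, 2.0):
    psi_perturbed = expm(-1j * (H + eps * V) * t) @ psi0    # evolution with perturbation
    psi_reference = expm(-1j * H * t) @ psi0                # unperturbed evolution
    echo = abs(np.vdot(psi_reference, psi_perturbed)) ** 2  # M(t) = |<psi_H | psi_H'>|^2
    print(f"t = {t}: M(t) = {echo:.3f}")
```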
The European Space Agency's Planck satellite was launched on 14 May 2009 and has been surveying the sky stably and continuously since 13 August 2009. Its performance is well in line with expectations, and it will continue to gather scientific data until the end of its cryogenic lifetime. We give an overview of the history of Planck in its first year of operations and describe some of the key performance aspects of the satellite. This paper is part of a package submitted in conjunction with Planck's Early Release Compact Source Catalogue, the first data product based on Planck to be released publicly. The package describes the scientific performance of the Planck payload and presents results on a variety of astrophysical topics related to the sources included in the Catalogue, as well as selected topics on diffuse emission.
