Curiously, our Universe was born in a low-entropy state, with abundant free energy to power stars and life. The form that this free energy takes is usually thought to be gravitational: the Universe is almost perfectly smooth, and so can produce sources of energy as matter collapses under gravity. It has recently been argued that a more important source of low-entropy energy is nuclear: the Universe expands too fast to remain in nuclear statistical equilibrium (NSE), effectively shutting off nucleosynthesis in the first few minutes and leaving behind hydrogen as fuel for stars. Here, we fill in the astrophysical details of this scenario, and seek the conditions under which a Universe would emerge from early nucleosynthesis as almost pure iron. In so doing, we identify a hitherto-overlooked character in the story of the origin of the second law: matter-antimatter asymmetry.
Understanding the universe is hampered by the elusiveness of its most common constituent, cold dark matter. Almost impossible to observe, dark matter can be studied effectively by means of simulation, and there is probably no other research field where simulation has led to so much progress in the last decade. Cosmological N-body simulations are an essential tool for evolving density perturbations in the nonlinear regime. Simulating the formation of large-scale structure in the universe, however, remains a challenge due to the enormous dynamic range in spatial and temporal coordinates, and due to the enormous computer resources required. The dynamic range is generally dealt with by the hybridization of numerical techniques. We deal with the computational requirements by connecting two supercomputers via an optical network and making them operate as a single machine. This is challenging, if only because the supercomputers of our choice are separated by half the planet: one is located in Amsterdam and the other in Tokyo. The co-scheduling of the two computers and the gridification of the code enable us to achieve 90% efficiency for this distributed intercontinental supercomputer.
In the standard model of cosmology, the Universe began its expansion with an anomalously low entropy, which then grew dramatically to much larger values consistent with the physical conditions at decoupling, roughly 380,000 years after the Big Bang. There does not appear to be a viable explanation for this `unnatural' history, other than via the generalized second law of thermodynamics (GSL), in which the entropy of the bulk, S_bulk, is combined with the entropy of the apparent (or gravitational) horizon, S_h. This is not completely satisfactory either, however, since this approach seems to require an inexplicable equilibrium between the bulk and horizon temperatures. In this paper, we explore the thermodynamics of an alternative cosmology known as the R_h=ct universe, which has thus far been highly successful in resolving many other problems or inconsistencies in LCDM. We find that S_bulk is constant in this model, eliminating the so-called initial entropy problem simply and elegantly. The GSL may still be relevant, however, principally in selecting the arrow of time, given that S_h ~ t^2 in this model.
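The quoted S_h ~ t^2 scaling follows directly from applying the standard Bekenstein-Hawking area law to the apparent horizon; a minimal sketch, assuming the usual form of the horizon entropy:

```latex
S_h \;=\; \frac{k_B c^3}{4 G \hbar}\, A_h,
\qquad A_h \;=\; 4\pi R_h^2 .
```

In the R_h = ct universe the gravitational (apparent) horizon grows linearly with cosmic time, R_h = ct, so A_h \propto t^2 and hence S_h \propto t^2, a monotonically increasing horizon entropy even while S_bulk remains constant.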
In the context of stellar reionization in the standard cold dark matter model, we analyze observations at z~6 and are able to draw three significant conclusions with respect to star formation and the state of the intergalactic medium (IGM) at z~6. (1) An initial stellar mass function (IMF) more efficient, by a factor of 10-20, in producing ionizing photons than the standard Salpeter IMF is required at z~6. This may be achieved by having either (A) a metal-enriched IMF with a lower mass cutoff of >= 30 Msun or (B) 2-4% of stellar mass in Population III massive metal-free stars at z~6. While there is no compelling physical reason or observational evidence to support (A), (B) could plausibly be fulfilled by the continued existence of some pockets of uncontaminated, metal-free gas for star formation. (2) The volume-weighted neutral fraction of the IGM of <f_HI>_V ~ 10^-4 at z=5.8 inferred from the SDSS observations of QSO absorption spectra provides enough information to ascertain that reionization is basically complete, with at most ~0.1-1% of the IGM remaining un-ionized at z=5.8. (3) Barring some extreme evolution of the IMF, the neutral fraction of the IGM is expected to rise quickly toward high redshift from the point of HII bubble percolation, with the mean neutral fraction of the IGM expected to reach 6-12% at z=6.5, 13-27% at z=7.7 and 22-38% at z=8.8.
It is well established that between 380,000 and 1 billion years after the Big Bang the intergalactic medium (IGM) underwent a phase transformation from cold and fully neutral to warm (~10^4 K) and ionized. Whether this phase transformation was fully driven and completed by photoionization by young hot stars is a question of topical interest in cosmology. AIMS. We propose here that, besides the ultraviolet radiation from massive stars, feedback from accreting black holes in high-mass X-ray binaries (BH-HMXBs) was an additional, important source of heating and reionization of the IGM in regions of low gas density at large distances from star-forming galaxies. METHODS. We use current theoretical models of the formation and evolution of primitive massive stars of low metallicity, and observations of compact stellar remnants in the near and distant universe, to infer that a significant fraction of the first generations of massive stars end up as BH-HMXBs. The total number of energetic ionizing photons from an accreting stellar black hole in an HMXB is comparable to the total number of ionizing photons of its progenitor star. However, the X-ray photons emitted by the accreting black hole are capable of producing several secondary ionizations, so the ionizing power of the resulting black hole could be greater than that of its progenitor. Feedback from the large populations of BH-HMXBs heats the IGM to temperatures of ~10^4 K and maintains it ionized on large distance scales. BH-HMXBs determine the early thermal history of the universe and maintain it ionized over large volumes of space in regions of low density. This has a direct impact on the properties of the faintest galaxies at high redshifts, the smallest dwarf galaxies in the local universe, and on existing and future radio-wavelength surveys of atomic hydrogen in the early universe.
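The secondary-ionization argument above can be made quantitative with a back-of-the-envelope estimate. The sketch below is illustrative only: the ~37 eV deposited per ionization in a mostly neutral medium is an assumed round number (of the order found in standard energy-deposition calculations such as Shull & van Steenberg's), not a value taken from this work.

```python
# Rough estimate of secondary ionizations from a single X-ray photon
# absorbed in a mostly neutral IGM, versus a stellar UV photon.
# ENERGY_PER_ION (~37 eV) is an illustrative assumption: the fast
# photoelectron shares its energy between ionization, excitation,
# and heating, so each ionization costs well above 13.6 eV.

E_ION_H = 13.6          # eV, ionization potential of hydrogen
ENERGY_PER_ION = 37.0   # eV per ionization in a nearly neutral gas (assumed)

def total_ionizations(photon_energy_ev: float) -> float:
    """Approximate number of ionizations (primary + secondary)
    produced by one photon of the given energy."""
    if photon_energy_ev < E_ION_H:
        return 0.0  # below threshold: no ionization at all
    return photon_energy_ev / ENERGY_PER_ION

# A ~1 keV X-ray photon from an accreting black hole yields tens of
# ionizations, whereas a stellar UV photon ionizes at most one atom.
print(total_ionizations(1000.0))  # ~27 ionizations
print(total_ionizations(13.6))    # well below 1 extra: effectively one
```

This is why, photon for photon, an accreting black hole can out-ionize its progenitor star even if it emits a comparable number of photons: each hard photon is worth many soft ones.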
We outline the prospects for performing pioneering radio weak gravitational lensing analyses using observations from a potential forthcoming JVLA Sky Survey program. A large-scale survey with the JVLA can offer interesting and unique opportunities for performing weak lensing studies in the radio band, a field which has until now been the preserve of optical telescopes. In particular, the JVLA has the capacity for large, deep radio surveys with relatively high angular resolution, which are the key characteristics required for a successful weak lensing study. We highlight the potential advantages and unique aspects of performing weak lensing in the radio band. In particular, the inclusion of continuum polarisation information can greatly reduce noise in weak lensing reconstructions and can also remove the effects of intrinsic galaxy alignments, the key astrophysical systematic effect that limits weak lensing at all wavelengths. We identify a VLASS deep fields program (total area ~10-20 square degrees), to be conducted at L-band and at high resolution (A-array configuration), as the optimal survey strategy from the point of view of weak lensing science. Such a survey will build on the unique strengths of the JVLA and will remain unsurpassed in terms of its combination of resolution and sensitivity until the advent of the Square Kilometre Array. We identify the best fields on the JVLA-accessible sky from the point of view of overlap with existing deep optical and near-infrared data, which will provide crucial redshift information and facilitate a host of additional compelling multi-wavelength science.