In the present paper, we investigate the cosmographic problem using the bias-variance trade-off. We find that both the $z$-redshift and the $y=z/(1+z)$-redshift expansions can yield a small estimation bias, which indicates that cosmography can describe the supernova data accurately. Minimizing the risk suggests that cosmography up to second order is the best approximation. Forecasting the constraints from future measurements, we find that future supernova and redshift-drift data can significantly improve the constraints and thus have the potential to solve the cosmographic problem. We also examine the cosmographic estimates of the deceleration parameter and the equation of state of dark energy, $w(z)$. We find that supernova cosmography cannot give stable estimates of them. Nevertheless, much useful information is obtained: for example, cosmography favors a complicated dark energy with varying $w(z)$, with derivative $dw/dz<0$ at low redshift. Cosmography is therefore helpful for modeling dark energy.
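For context on the second-order expansion referred to above, the standard cosmographic series for the luminosity distance can be sketched as follows (a textbook form, not taken from this abstract; $H_0$ and $q_0$ denote the present-day Hubble and deceleration parameters):

```latex
% Second-order cosmographic expansion of the luminosity distance
% (standard form; H_0 = Hubble constant, q_0 = deceleration parameter)
d_L(z) = \frac{c\,z}{H_0}\left[1 + \frac{1}{2}\bigl(1 - q_0\bigr)z + \mathcal{O}(z^2)\right],
\qquad
y = \frac{z}{1+z}
% The y-redshift variable maps z in [0, infinity) to y in [0, 1),
% which is commonly used to improve the convergence of the series
% at high redshift.
```

The $y$-redshift variable is the same change of variable quoted in the abstract; which parametrization gives the smaller bias is precisely the question the bias-variance analysis addresses.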
The DNA molecule, apart from carrying the genetic information, plays a crucial role in a variety of biological processes and finds applications in drug design, nanotechnology and nanoelectronics. The molecule undergoes significant structural transitions under the influence of forces due to physiological and non-physiological environments. Here, we summarize the insights gained from simulations and single-molecule experiments on the structural transitions and mechanics of DNA under force, as well as its elastic properties, in various environmental conditions, and discuss appealing future directions.
Mass loss rate formulae are derived from observations or from suites of models. For theoretical models, the following have all been identified as factors greatly influencing the atmospheric structure and mass loss rates: pulsation, with piston amplitude scaling appropriately with stellar luminosity $L$; dust nucleation and growth, with radiation pressure and grain-gas interactions and appropriate dependence on temperature and density; non-grey opacity with at least 51 frequency samples; non-LTE effects and departures from radiative equilibrium in the compressed and expanding flows; and non-equilibrium processes affecting the composition (grain formation; molecular chemistry). No one set of models yet includes all the factors known to be important. In fact, it is very difficult to construct a model that can simultaneously include these factors and be useful for computing spectra. Therefore, although theoretical model grids are needed to separate the effects of $M$, $L$, $R$ and/or $T_{\mathrm{eff}}$ or $Z$ on the mass loss rates, these models must be carefully checked against observations. Getting the right order of magnitude for the mass loss rate is only the first step in such a comparison, and is not sufficient to determine whether the mass loss formula is correct. However, there are observables that do test the validity of mass loss formulae, as they depend directly on $d\log \dot{M}/d\log L$, $d\log \dot{M}/d\log R$, or $d\log \dot{M}/d\log P$.
String theory has transformed our understanding of geometry, topology and spacetime. Thus, for this special issue of Foundations of Physics commemorating Forty Years of String Theory, it seems appropriate to step back and ask what we do not understand. As I will discuss, time remains the least understood concept in physical theory. While we have made significant progress in understanding space, our understanding of time has not progressed much beyond the level of a century ago when Einstein introduced the idea of space-time as a combined entity. Thus, I will raise a series of open questions about time, and will review some of the progress that has been made as a roadmap for the future.
Temperature, the central concept of thermal physics, is one of the most frequently employed physical quantities in common practice. Even though operational methods of temperature measurement are described in detail in various practical instructions and textbooks, a rigorous treatment of this concept is almost entirely lacking in the current literature. As a result, the answer to the simple question of what temperature is turns out to be by no means trivial or unambiguous. There is an especially appreciable gap between temperature as introduced in the framework of statistical theory and the only experimentally observable quantity related to this concept, the phenomenological temperature. The logical and epistemological analysis of the present concept of phenomenological temperature forms the kernel of this contribution.
Polarized foregrounds are going to be a serious challenge for detecting CMB cosmological B-modes. Both diffuse Galactic emission and extragalactic sources contribute significantly to the power spectrum on large angular scales. At low frequencies, Galactic synchrotron emission will dominate, with fractional polarization $\sim 20$--$40\%$ at high latitudes, while radio sources can contribute significantly even on large ($\sim 1^{\circ}$) angular scales. Nevertheless, simulations suggest that a detection at the level of $r=0.001$ might be achievable if the foregrounds are not too complex.