
How is really decelerating the expansion of SN1993J?

Posted by: Eduardo Ros
Publication date: 2002
Research field: Physics
Paper language: English
Authors: J.M. Marcaide





SN1993J is to date the radio supernova whose evolution has been monitored in the greatest detail, and the one that holds the best promise for a comprehensive theoretical-observational analysis. The shell-like radio structure of SN1993J has expanded in general accord with models of shock-excited emission, showing almost circular symmetry for over 8 years, except for a bright feature in the south-eastern region of the shell that has been observed at every epoch. The spectrum of SN1993J has flattened from $\alpha = -1$ to $\alpha = -0.67$ ($S_\nu \propto \nu^{\alpha}$). The decelerated expansion can be modeled well with a single slope, but apparently better with two slopes. There are also intriguing hints of structure in the expansion curve. The results of the two VLBI groups carrying out this research show general agreement, but also some differences. A comparison of the optical and VLBI results on the details of the deceleration shows some discrepancies.
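
To make the single-slope versus two-slope modeling concrete, here is a minimal, illustrative Python sketch that fits an angular expansion curve $\theta(t) \propto t^m$ with one power law and with a broken (two-slope) power law. The data, scatter, break time, and slopes below are invented placeholders, not the published VLBI measurements.

# Illustrative only: compare a single power-law fit with a broken
# (two-slope) power-law fit to a synthetic angular expansion curve.
import numpy as np
from scipy.optimize import curve_fit

def single_power_law(t, theta_1, m):
    # theta(t) = theta_1 * t**m; m = 1 would be free, undecelerated expansion
    return theta_1 * t**m

def broken_power_law(t, theta_1, m1, m2, t_break):
    # slope m1 before t_break, slope m2 after, continuous at the break
    theta_break = theta_1 * t_break**m1
    return np.where(t < t_break, theta_1 * t**m1,
                    theta_break * (t / t_break)**m2)

# Synthetic epochs (years after explosion) and angular radii (mas)
rng = np.random.default_rng(1)
t = np.linspace(0.5, 8.0, 40)
theta = broken_power_law(t, 1.0, 0.92, 0.80, 3.0)
theta *= 1.0 + 0.02 * rng.standard_normal(t.size)  # 2% synthetic scatter

p1, _ = curve_fit(single_power_law, t, theta, p0=[1.0, 0.9])
p2, _ = curve_fit(broken_power_law, t, theta, p0=[1.0, 0.9, 0.8, 3.0])
print("single slope: m = %.3f" % p1[1])
print("two slopes:   m1 = %.3f, m2 = %.3f, t_break = %.2f yr" % tuple(p2[1:]))

Comparing the residuals (or an information criterion) of the two fits is one way to judge whether an expansion curve is "apparently better" described by two slopes.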




Read also

A rarity among supernovae, SN 1993J in M81 can be studied with high spatial resolution. Its radio power and distance permit VLBI observations to monitor the expansion of its angular structure. This radio structure was previously revealed to be shell-like and to be undergoing self-similar expansion at a constant rate. From VLBI observations at wavelengths of 3.6 and 6 cm in the period 6 to 42 months after the explosion, we have discovered that the expansion is decelerating. Our measurement of this deceleration yields estimates of the density profiles of the supernova ejecta and circumstellar material in standard supernova explosion models.
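
For reference, in the standard self-similar interaction model (Chevalier 1982) that abstracts like this one appeal to, the deceleration parameter ties the measured expansion directly to the two density profiles; the relation below is the textbook one, not a result quoted from this paper.

% Self-similar shock expansion for ejecta \rho_{ej} \propto r^{-n}
% running into circumstellar material \rho_{csm} \propto r^{-s}:
R \propto t^{m}, \qquad m = \frac{n - 3}{n - s}
% e.g. n = 10 and s = 2 (a steady wind) give m = 7/8 \approx 0.88,
% so a measured deceleration m < 1 constrains n and s jointly.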
During the past five years the Bayesian deep learning community has developed increasingly accurate and efficient approximate inference procedures that allow for Bayesian inference in deep neural networks. However, despite this algorithmic progress and the promise of improved uncertainty quantification and sample efficiency, there are, as of early 2020, no publicized deployments of Bayesian neural networks in industrial practice. In this work we cast doubt on the current understanding of Bayes posteriors in popular deep neural networks: we demonstrate through careful MCMC sampling that the posterior predictive induced by the Bayes posterior yields systematically worse predictions compared to simpler methods, including point estimates obtained from SGD. Furthermore, we demonstrate that predictive performance is improved significantly through the use of a cold posterior that overcounts evidence. Such cold posteriors sharply deviate from the Bayesian paradigm but are commonly used as a heuristic in Bayesian deep learning papers. We put forward several hypotheses that could explain cold posteriors and evaluate the hypotheses through experiments. Our work questions the goal of accurate posterior approximations in Bayesian deep learning: if the true Bayes posterior is poor, what is the use of more accurate approximations? Instead, we argue that it is timely to focus on understanding the origin of the improved performance of cold posteriors.
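
The "cold posterior" this abstract refers to is the tempered posterior; written out (this is the standard definition used in that line of work):

% Tempered ("cold") posterior at temperature T:
p_T(\theta \mid D) \propto \exp\!\bigl(-U(\theta)/T\bigr), \qquad
U(\theta) = -\sum_{i=1}^{n} \log p(y_i \mid x_i, \theta) - \log p(\theta)
% T = 1 recovers the Bayes posterior; T < 1 ("cold") sharpens it,
% effectively overcounting the evidence relative to the prior.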
130 - John G. Hartnett 2011
The Hubble law, determined from the distance moduli and redshifts of galaxies, has for the past 80 years been used as strong evidence for an expanding universe. This claim is reviewed in light of the claimed lack of necessary evidence for time dilation in quasar and gamma-ray burst luminosity variations and other lines of evidence. It is concluded that the observations could be used to describe either a static universe (where the Hubble law results from some as-yet-unknown mechanism) or an expanding universe described by the standard Lambda cold dark matter model. In the latter case, size evolution of galaxies is necessary for agreement with observations. Yet the simple non-expanding Euclidean universe fits most data with the least number of assumptions. From this review it is apparent that there are still many unanswered questions in cosmology, and the title question of this paper is still far from being answered.
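
As a reminder of the quantities under debate (standard definitions, not results from this review), the Hubble law and the distance modulus it is built from are:

% Hubble law (low redshift) and distance modulus:
cz \approx H_0 d \quad (z \ll 1), \qquad
\mu = m - M = 5 \log_{10}\!\left(\frac{d_L}{10\ \mathrm{pc}}\right)
% The review weighs whether the observed mu-z relation demands
% expansion or could arise in a static universe.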
124 - A. De Angelis 2008
Recent findings by gamma-ray Cherenkov telescopes suggest a higher transparency of the Universe to very-high-energy (VHE) photons than expected from current models of the Extragalactic Background Light. It has been shown that such transparency can be naturally explained by the DARMA scenario, in which the photon mixes with a new, very light, axion-like particle predicted by many extensions of the Standard Model of elementary particles. We discuss the implications of DARMA for the VHE gamma-ray spectra of blazars, and show that it successfully accounts for the observed correlation between spectral slope and redshift by adopting for far-away sources the same emission spectrum characteristic of nearby ones. DARMA also predicts the observed blazar spectral index to become asymptotically independent of redshift for far-away sources. Our prediction can be tested with the satellite-borne Fermi/LAT detector as well as with the ground-based Cherenkov telescopes HESS, MAGIC, CANGAROO-III, VERITAS and the Extensive Air Shower arrays ARGO-YBJ and MILAGRO.
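
The photon-ALP mixing invoked by the DARMA scenario rests on the standard axion-like-particle coupling, stated here for context (textbook form, not a formula quoted from this paper):

% Photon-ALP coupling: a is the ALP field, F the electromagnetic tensor
\mathcal{L}_{a\gamma} = -\tfrac{1}{4}\, g_{a\gamma}\, a\, F_{\mu\nu}\tilde{F}^{\mu\nu}
                      = g_{a\gamma}\, a\, \vec{E}\cdot\vec{B}
% In large-scale magnetic fields, photons oscillate into ALPs, travel
% unabsorbed by the Extragalactic Background Light, and can reconvert,
% raising the effective transparency at very high energies.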
88 - A. Tartaglia 1998
The EPR paradox and the meaning of the Bell inequality are discussed. It is shown that, if quantum objects are regarded as carrying with them instruction kits that tell them what to do when they meet a measurement apparatus, any paradox disappears. In this view the quantum state is characterized by the prescribed behaviour rather than by the specific value a parameter assumes as a result of an interaction.
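
The Bell inequality that any such "instruction kit" (local hidden-variable) account must confront is most often quoted in its CHSH form, stated here for context rather than as a claim from the abstract:

% CHSH correlator for detector settings a, a' and b, b':
S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S| \le 2
% Local instruction-kit models obey |S| <= 2, while quantum mechanics
% reaches |S| = 2\sqrt{2} (Tsirelson's bound); this gap is the crux.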
