
Why Is There Something, Rather Than Nothing?

Author: Sean Carroll
Publication date: 2018
Field: Physics
Language: English





It seems natural to ask why the universe exists at all. Modern physics suggests that the universe can exist all by itself as a self-contained system, without anything external to create or sustain it. But there might not be an absolute answer to why it exists. I argue that any attempt to account for the existence of something rather than nothing must ultimately bottom out in a set of brute facts; the universe simply is, without ultimate cause or explanation.




Read More

165 - A.O.Barvinsky 2007
The path integral over Euclidean geometries for the recently suggested density matrix of the Universe is shown to describe a microcanonical ensemble in quantum cosmology. This ensemble corresponds to a uniform (weight one) distribution in phase space of true physical variables, but in terms of the observable spacetime geometry it is peaked about complex saddle-points of the {\em Lorentzian} path integral. They are represented by the recently obtained cosmological instantons limited to a bounded range of the cosmological constant. Inflationary cosmologies generated by these instantons at late stages of expansion undergo acceleration whose low-energy scale can be attained within the concept of dynamically evolving extra dimensions. Thus, together with the bounded range of the early cosmological constant, this cosmological ensemble suggests a mechanism for constraining the landscape of string vacua and, simultaneously, a possible solution to the dark energy problem in the form of the quasi-equilibrium decay of the microcanonical state of the Universe.
We consider constraints on primordial black holes (PBHs) in the mass range $(10^{-18}\text{-}10^{15})\,M_\odot$ if the dark matter (DM) comprises weakly interacting massive particles (WIMPs) which form halos around them and generate $\gamma$-rays by annihilations. We first study the formation of the halos and find that their density profile prior to WIMP annihilations evolves to a characteristic power-law form. Because of the wide range of PBH masses considered, our analysis forges an interesting link between previous approaches to this problem. We then consider the effect of the WIMP annihilations on the halo profile and the associated generation of $\gamma$-rays. The observed extragalactic $\gamma$-ray background implies that the PBH DM fraction is $f_{\rm PBH} \lesssim 2 \times 10^{-9}\,(m_\chi/{\rm TeV})^{1.1}$ in the mass range $2 \times 10^{-12}\,M_\odot\,(m_\chi/{\rm TeV})^{-3.2} \lesssim M \lesssim 5 \times 10^{12}\,M_\odot\,(m_\chi/{\rm TeV})^{1.1}$, where $m_\chi$ and $M$ are the WIMP and PBH masses, respectively. This limit is independent of $M$ and therefore applies for any PBH mass function. For $M \lesssim 2 \times 10^{-12}\,M_\odot\,(m_\chi/{\rm TeV})^{-3.2}$, the constraint on $f_{\rm PBH}$ is a decreasing function of $M$ and PBHs could still make a significant DM contribution at very low masses. We also consider constraints on WIMPs if the DM is mostly PBHs. If the merging black holes recently discovered by LIGO/Virgo are of primordial origin, this would rule out the standard WIMP DM scenario. More generally, the WIMP DM fraction cannot exceed $10^{-4}$ for $M > 10^{-9}\,M_\odot$ and $m_\chi > 10\,$GeV. There is a region of parameter space, with $M \lesssim 10^{-11}\,M_\odot$ and $m_\chi \lesssim 100\,$GeV, in which WIMPs and PBHs can both provide some but not all of the DM, so that one requires a third DM candidate.
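The scaling of the quoted bound can be made concrete with a short numeric sketch. The function names and the sample WIMP masses below are illustrative choices, not part of the paper; only the two fitting formulas are taken from the abstract.

```python
# Numeric sketch of the PBH dark-matter fraction bound quoted above:
#   f_PBH <~ 2e-9 * (m_chi / TeV)^1.1,
# valid for PBH masses (in solar masses)
#   2e-12 * (m_chi/TeV)^-3.2 <~ M <~ 5e12 * (m_chi/TeV)^1.1.
# Function names and sample masses are illustrative, not from the paper.

def f_pbh_bound(m_chi_tev: float) -> float:
    """Upper limit on the PBH dark-matter fraction (mass-independent)."""
    return 2e-9 * m_chi_tev**1.1

def pbh_mass_range(m_chi_tev: float) -> tuple:
    """PBH mass window (in solar masses) where the flat bound applies."""
    lower = 2e-12 * m_chi_tev**-3.2
    upper = 5e12 * m_chi_tev**1.1
    return lower, upper

for m_chi in (0.1, 1.0, 10.0):
    lo_m, hi_m = pbh_mass_range(m_chi)
    print(f"m_chi = {m_chi:5.1f} TeV: f_PBH <~ {f_pbh_bound(m_chi):.2e}, "
          f"for {lo_m:.2e} <~ M/Msun <~ {hi_m:.2e}")
```

For a 1 TeV WIMP this reproduces the abstract's headline numbers directly, since the $(m_\chi/{\rm TeV})$ factors reduce to unity.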
111 - Hoi-Kwong Lo 2005
We study quantum key distribution with standard weak coherent states and show, rather counter-intuitively, that the detection events originated from vacua can contribute to secure key generation rate, over and above the best prior art result. Our proof is based on a communication complexity/quantum memory argument.
232 - Herbert L. Roitblat 2020
In legal eDiscovery, the parties are required to search through their electronically stored information to find documents that are relevant to a specific case. Negotiations over the scope of these searches are often based on a fear that something will be missed. This paper continues an argument that discovery should be based on identifying the facts of a case. Even if a search process is less than complete (if it has Recall less than 100%), it may still be complete in presenting all of the relevant available topics. In this study, Latent Dirichlet Allocation was used to identify 100 topics from all of the known relevant documents. The documents were then categorized to about 80% Recall (i.e., 80% of the relevant documents were found by the categorizer, designated the hit set, and 20% were missed, designated the missed set). Despite the fact that less than all of the relevant documents were identified by the categorizer, the documents that were identified contained all of the topics derived from the full set of documents. This same pattern held whether the categorizer was a naive Bayes categorizer trained on a random selection of documents or a Support Vector Machine trained with Continuous Active Learning (which focuses evaluation on the most-likely-to-be-relevant documents). No topics were identified in either categorizer's missed set that were not already seen in the hit set. Not only is a computer-assisted search process reasonable (as required by the Federal Rules of Civil Procedure), it is also complete when measured by topics.
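The topic-coverage check at the heart of this argument can be sketched in a few lines. The paper ran LDA over real eDiscovery collections; the document/topic assignments below are hypothetical stand-ins used only to show the computation.

```python
# Minimal sketch of the topic-coverage check described above.
# The paper derived topics with LDA over real documents; the topic labels
# and documents here are hypothetical illustrations.

def topic_coverage(hit_topics, all_topics):
    """Fraction of the full topic set that also appears in the hit set."""
    hit = set().union(*hit_topics) if hit_topics else set()
    full = set().union(*all_topics)
    return len(hit & full) / len(full)

# Each document carries the topics assigned to it (hypothetical labels).
docs = {
    "d1": {"contracts", "pricing"},
    "d2": {"pricing", "shipping"},
    "d3": {"contracts", "shipping"},
    "d4": {"pricing"},  # missed by the categorizer
}
hit_set = [docs[d] for d in ("d1", "d2", "d3")]  # 75% document recall
coverage = topic_coverage(hit_set, docs.values())
print(coverage)  # 1.0: every topic in the missed document also appears in a hit
```

The example mirrors the paper's finding in miniature: document-level Recall is below 100%, yet topic-level coverage is complete because the missed document contributes no topic absent from the hit set.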
Entanglement has long stood as one of the characteristic features of quantum mechanics, yet recent developments have emphasized the importance of quantumness beyond entanglement for quantum foundations and technologies. We demonstrate that entanglement cannot entirely capture the worst-case sensitivity in quantum interferometry, when quantum probes are used to estimate the phase imprinted by a Hamiltonian, with fixed energy levels but variable eigenbasis, acting on one arm of an interferometer. This is shown by defining a bipartite entanglement monotone tailored to this interferometric setting and proving that it never exceeds the so-called interferometric power, a quantity which relies on more general quantum correlations beyond entanglement and captures the relevant resource. We then prove that the interferometric power can never increase when local commutativity-preserving operations are applied to qubit probes, an important step to validate such a quantity as a genuine quantum correlations monotone. These findings are accompanied by a room-temperature nuclear magnetic resonance experimental investigation, in which two-qubit states with extremal (maximal and minimal) interferometric power at fixed entanglement are produced and characterized.
