By combining the jet-quenching Monte Carlo JEWEL with a realistic hydrodynamic model for the background, we investigate the sensitivity of jet observables to details of the medium model and quantify the influence of the energy and momentum lost by jets on the background evolution. At the level of event-averaged source terms, the effects are small and are caused mainly by the momentum transfer.
Energy and momentum loss of jets in heavy-ion collisions can affect the fluid-dynamic evolution of the medium. We determine realistic event-by-event averages and correlation functions of the local energy-momentum transfer from hard particles to the soft sector using the jet-quenching Monte Carlo code JEWEL combined with a hydrodynamic model for the background. The expectation values of the source terms due to jets in a typical (minimum bias) event affect the fluid-dynamic evolution mainly through their momentum transfer, leading to a small increase in flow. The presence of hard jets in the event constitutes only a minor correction.
We extend a recent computation of the dependence of the free energy, $F$, on the noncommutative scale $\theta$ to theories with very different UV sensitivity. The temperature dependence of $F$ strongly suggests that a reduced number of degrees of freedom contributes to the free energy in the non-planar sector, $F_{\rm np}$, at high temperature. This phenomenon seems generic, independent of the UV sensitivity, and can be traced to modes whose thermal wavelengths become smaller than the noncommutativity scale. The temperature dependence of $F_{\rm np}$ can then be calculated at high temperature using classical statistical mechanics, without encountering a UV catastrophe even in a large number of dimensions. This result is a telltale sign of the low number of degrees of freedom contributing to $F$ in the non-planar sector at high temperature. Such behavior is in marked contrast to what would happen in a field theory with a random set of higher-derivative interactions.
Quantum information processing shows advantages in many tasks, including quantum communication and computation, compared to its classical counterpart. The essence of quantum processing lies in the fundamental difference between classical and quantum states. For a physical system, a coherent superposition over a computational basis is different from a statistical mixture of states in the same basis. Such coherent superposition enables the generation of true random numbers, parallel computing, and other classically impossible tasks such as the quantum Bernoulli factory. For a system that consists of multiple parts, coherent superposition that exists nonlocally across different subsystems is called entanglement. By properly manipulating entanglement, it is possible to realize computation and simulation tasks that are intractable by classical means. Investigating quantumness, coherent superposition, and entanglement can shed light on the origin of quantum advantages and lead to the design of new quantum protocols. This thesis mainly focuses on the interplay between quantumness and two information tasks: randomness generation and self-testing quantum information processing. We discuss how quantumness can be used to generate randomness and show that randomness can in turn be used to quantify quantumness. In addition, we introduce the Bernoulli factory problem and present the quantum advantage achievable with coherence alone, in both theory and experiment. Furthermore, we show a method to witness entanglement that is independent of the realization of the measurement. We also investigate randomness requirements in self-testing tasks and propose a random number generation scheme that is independent of the randomness source.
In the classic type I seesaw mechanism with very heavy right-handed (RH) neutrinos, it is possible to account for dark matter via RH neutrino portal couplings to a feebly interacting massive particle (FIMP) dark sector. However, for large RH neutrino masses, gravity can play an important role. We study the interplay between the neutrino portal through the right-handed neutrinos and the gravity portal through the massless spin-2 graviton in producing dark matter particles in the early universe. As a concrete example, we consider the minimal and realistic Littlest Seesaw model with two RH neutrinos, augmented with a dark scalar and a dark fermion charged under a global $U(1)_D$ dark symmetry. In this model, the usual seesaw neutrino Yukawa couplings and the right-handed neutrino masses (the lightest being about $5\times 10^{10}$ GeV) are fixed by neutrino oscillation data and leptogenesis. Hence, we explore the parameter space of the two RH neutrino portal couplings, the two dark particle masses, and the reheating temperature of the universe, where the correct dark matter relic abundance is achieved through the freeze-in mechanism. In particular, we highlight which class of processes dominates the dark matter production. We find that, despite the presence of the gravity portal, the dark matter production relies on the usual seesaw neutrino Yukawa couplings in some regions of the parameter space, thus realising a direct link between dark matter and neutrino phenomenology. Finally, we report the threshold values for the neutrino portal couplings below which the neutrino portal is irrelevant and the Planckian Interacting Dark Matter paradigm is preserved.
The spinor-helicity formalism has proven to be very efficient in the calculation of scattering amplitudes in quantum field theory, while the loop-tree duality (LTD) representation of multi-loop integrals exhibits appealing and interesting advantages with respect to other approaches. In view of the most recent developments in LTD, we exploit the synergies with the spinor-helicity formalism to analyse illustrative one- and two-loop scattering processes. We focus our discussion on the local UV renormalisation of IR- and UV-finite amplitudes and present a fully automated numerical implementation that provides efficient expressions which can be integrated directly in four space-time dimensions.