Runaway Relaxion from Finite Density


Abstract

Finite density effects can destabilize the metastable vacua of relaxion models. Focusing on stars as nucleation seeds, we derive the conditions under which a relaxion bubble containing a minimum of lower energy than the in-vacuum one forms and runs away. The resulting late-time phase transition in the universe allows us to set new constraints on the parameter space of relaxion models. We also find that similar instabilities can be triggered by the large electromagnetic fields around rotating neutron stars.
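As a rough illustration of the kind of condition at stake (a textbook thin-wall estimate for seeded bubble nucleation, not the derivation carried out in the paper; the symbols σ for the bubble-wall tension, Δε for the energy-density gain inside the bubble, and R⋆ for the seed radius are illustrative notation, not taken from the source):

% Schematic thin-wall energy of a bubble of radius R around a dense seed (illustrative only)
\[
  E(R) \simeq 4\pi R^{2}\,\sigma \;-\; \frac{4}{3}\pi R^{3}\,\Delta\varepsilon ,
  \qquad
  R_{c} = \frac{2\sigma}{\Delta\varepsilon} .
\]
% A star of radius R_\star can classically trigger a runaway bubble only if the bubble it
% sources exceeds the critical radius, schematically R_\star \gtrsim R_c; smaller bubbles
% collapse back and the metastable vacuum survives.

In this schematic picture, the paper's in-medium analysis determines when the star-induced bubble is large and deep enough for the expansion to win over the wall tension.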
