(Machine) Learning amplitudes for faster event generation


Abstract in English

We propose to replace the exact amplitudes used in MC event generators with trained Machine Learning regressors, with the aim of speeding up the evaluation of \textit{slow} amplitudes. As a proof of concept, we study the process $gg \to ZZ$, whose LO amplitude is loop induced. We show that gradient boosting machines like \texttt{XGBoost} can predict the fully differential distributions with errors below $0.1\%$, and with prediction times $\mathcal{O}(10^3)$ faster than the evaluation of the exact function. This is achieved with training times of $\sim 7$ minutes and regressors of size $\lesssim 30$~MB. These results suggest a possible new avenue to speed up MC event generators.
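The abstract describes training a gradient-boosted-tree regressor to act as a fast surrogate for a slow amplitude evaluated over phase space. The following is a minimal sketch of that idea, not the authors' code: the phase-space features and the target values are hypothetical placeholders, and in practice the targets would come from the exact loop-induced $gg \to ZZ$ amplitude.

```python
# Minimal sketch: fit an XGBoost regressor to approximate a slow amplitude
# as a function of phase-space variables, then compare speed/accuracy.
# All data below is a synthetic placeholder standing in for exact amplitudes.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder phase-space features (e.g. an invariant mass and a scattering
# angle); replace with the kinematic variables that parametrise the event.
X = rng.uniform(size=(100_000, 2))
# Placeholder target standing in for the exact (expensive) amplitude value.
y = np.exp(-5.0 * X[:, 0]) * (1.0 + X[:, 1] ** 2)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Gradient-boosted trees; these hyperparameters are illustrative, not tuned.
reg = xgb.XGBRegressor(
    n_estimators=500,
    max_depth=8,
    learning_rate=0.1,
    tree_method="hist",
)
reg.fit(X_train, y_train)

# Fast surrogate evaluation and its relative error w.r.t. the exact values.
y_pred = reg.predict(X_test)
rel_err = np.abs(y_pred - y_test) / np.abs(y_test)
print(f"median relative error: {np.median(rel_err):.2e}")
```

In a generator workflow, `reg.predict` would replace the call to the exact amplitude inside the event loop, with the regressor trained once offline on a sample of exactly evaluated phase-space points.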
