Predicting the bulge-to-total luminosity ratio of galaxies using deep learning


Abstract

We present a deep learning model to predict the r-band bulge-to-total light ratio (B/T) of nearby galaxies from their multi-band JPEG images alone. Our Convolutional Neural Network (CNN)-based regression model is trained on a large sample of galaxies with reliable bulge-disk decompositions. Existing approaches to estimating B/T rely on fitting parametric models to the galaxy light profile. This method is computationally expensive, prohibitively so for large samples of galaxies, and requires a significant amount of human intervention. Machine learning models have the potential to overcome these shortcomings. For a test set of 20,000 galaxies, 85.7 per cent of our CNN's predicted B/T values have an absolute error (AE) of less than 0.1. This improves to 87.5 per cent when the test set is restricted to brighter galaxies (r-band apparent magnitude < 17) with no bright neighbours. Our model estimates B/T for the 20,000 test galaxies in less than a minute, a significant improvement in inference time over conventional fitting pipelines, which manage around 2-3 estimates per minute. Thus, the proposed machine learning approach could save a tremendous amount of time, effort and computational resources while predicting B/T reliably, particularly in the era of next-generation sky surveys such as the Legacy Survey of Space and Time (LSST) and the Euclid survey, which will produce extremely large samples of galaxies.
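To make the setup concrete, the sketch below shows how a CNN regression of this kind can be structured and how the AE-based metric from the abstract is computed. This is a minimal illustration, not the paper's actual architecture: the choice of PyTorch, the layer sizes, the 64x64 input resolution, and all names here are assumptions for demonstration only.

```python
# Minimal sketch of a CNN regression model for B/T prediction.
# NOTE: this is NOT the architecture used in the paper; the framework,
# layer sizes, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class BTRegressor(nn.Module):
    """CNN mapping a 3-channel galaxy image to a scalar B/T in [0, 1]."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                        # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                # global average pool
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 1),
            nn.Sigmoid(),                           # B/T is a ratio in [0, 1]
        )

    def forward(self, x):
        return self.head(self.features(x)).squeeze(1)


if __name__ == "__main__":
    model = BTRegressor()
    images = torch.rand(8, 3, 64, 64)   # stand-in for multi-band JPEG cutouts
    true_bt = torch.rand(8)             # stand-in for catalogue B/T values
    pred_bt = model(images)
    # The evaluation metric quoted in the abstract: the fraction of
    # predictions whose absolute error is below 0.1.
    frac_ae_lt_01 = (torch.abs(pred_bt - true_bt) < 0.1).float().mean()
    print(f"fraction with AE < 0.1: {frac_ae_lt_01:.3f}")
```

The sigmoid output layer reflects that B/T is a fraction bounded between 0 and 1; batched inference over a whole test set, as in the abstract's timing comparison, is what makes CNN-based estimation orders of magnitude faster than per-galaxy light-profile fitting.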
