Measuring Galaxy Star Formation Rates From Integrated Photometry: Insights from Color-Magnitude Diagrams of Resolved Stars


Abstract

We use empirical star formation histories (SFHs), measured from color-magnitude diagrams of resolved stars in HST imaging, as input to population synthesis codes to model the broadband spectral energy distributions (SEDs) of ~50 nearby dwarf galaxies (6.5 < log(M_*/M_sun) < 8.5, with metallicities ~10% solar). Using these empirically constrained SFHs, we compare the modeled and observed SEDs from the ultraviolet (UV) through the near-infrared (NIR) and assess the reliability of widely used UV-based star formation rate (SFR) indicators. In the FUV through i bands, we find that the observed and modeled SEDs are in excellent agreement. In the Spitzer 3.6 μm and 4.5 μm bands, we find that the modeled SEDs systematically over-predict the observed luminosities by up to ~0.2 dex, depending on the treatment of thermally pulsing asymptotic giant branch (TP-AGB) stars in the synthesis models. We assess the reliability of UV luminosity as an SFR indicator in light of the independently constrained SFHs. We find that fluctuations in the SFHs alone can cause variations of a factor of ~2 in the UV luminosities relative to the assumption of a constant SFH over the past 100 Myr. These variations are not strongly correlated with UV-optical colors, implying that correcting UV-based SFRs for the effects of realistic SFHs is difficult using only the broadband SED. Additionally, for this diverse sample of galaxies, we find that stars older than 100 Myr can contribute from <5% to 100% of the present-day UV luminosity, highlighting the challenges in defining a characteristic star formation timescale associated with UV emission. We do find a relationship between the UV emission timescale and broadband UV-optical color, though it differs from predictions based on exponentially declining SFH models. Our findings have significant implications for the comparison of UV-based SFRs across low-metallicity populations with diverse SFHs.
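For context on the constant-SFH assumption tested above (a standard external calibration, not a result of this work): widely used UV SFR indicators of the kind examined here, such as the Kennicutt (1998) calibration, convert a UV luminosity into an SFR under the assumptions of star formation held constant over the past ~100 Myr, a Salpeter IMF, and roughly solar metallicity:

    SFR(UV) [M_sun/yr] ≈ 1.4 x 10^-28 L_nu(1500-2800 Å) [erg s^-1 Hz^-1]

The factor-of-~2 variations and the wide range of old-star contributions reported above quantify how strongly realistic, bursty, low-metallicity SFHs depart from the assumptions built into such calibrations.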
