Molecular clouds interacting with supernova remnants may be subject to greatly enhanced irradiation by cosmic rays produced at the shocked interface between the ejecta and the molecular gas. Over the past decade, broad-band observations have provided important clues about these relativistic particles and indicate that they may exceed the locally observed cosmic-ray population by a significant factor. In this paper, we estimate the enhancement and find that the cosmic-ray energy density can be up to $\sim$1000 times larger in the molecular cloud than in the field. This enhancement can last for a few Myr and leads to a corresponding increase in the ionization fraction, which has important consequences for star formation. Ionization fractions in molecular cloud cores determine, in part, the rate of ambipolar diffusion, an important process in core formation and pre-collapse evolution. Ionization fractions in newly formed circumstellar disks affect the magneto-rotational instability mechanism, which in turn affects the rate of disk accretion. As estimated here, the increased ionization acts to increase the ambipolar diffusion time by a factor of $\sim$30 and thereby suppresses star formation. In contrast, the increased ionization fraction reduces the sizes of dead zones in accretion disks (by up to an order of magnitude) and thus increases disk accretion rates (by a comparable factor).
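
The connection between the $\sim$1000-fold enhancement and the factor of $\sim$30 can be sketched from standard ionization balance; the following is an illustrative scaling argument (assuming the ionization fraction $x_e$ is set by cosmic-ray ionization at rate $\zeta$ balanced against recombination with coefficient $\alpha$, and that the ambipolar diffusion time scales linearly with $x_e$):

```latex
% Ionization balance in a cloud of density n:
%   ionization rate = recombination rate
\zeta \, n \;=\; \alpha \, x_e^2 \, n^2
\quad\Longrightarrow\quad
x_e \;\propto\; \sqrt{\zeta / n}\,.
% An enhancement of the cosmic-ray flux (and hence \zeta)
% by a factor of ~1000 at fixed n then gives
x_e \;\to\; \sqrt{1000}\; x_e \;\approx\; 32\, x_e\,,
% and since the ambipolar diffusion time obeys
t_{\rm AD} \;\propto\; x_e\,,
% the diffusion time increases by a factor of ~30,
% consistent with the estimate quoted above.
```
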