With current and upcoming experiments such as WFIRST, Euclid, and LSST, we will be able to observe billions of galaxies. While such surveys cannot obtain spectra for all observed galaxies, they produce galaxy magnitudes in color filters. This data set behaves like a high-dimensional nonlinear surface, an excellent target for machine learning. In this work, we use a lightcone of semianalytic galaxies tuned to match CANDELS observations from Lu et al. (2014) to train a set of neural networks on a set of galaxy physical properties. We add realistic photometric noise and use the trained neural networks to predict stellar masses and average star formation rates for real CANDELS galaxies, comparing our predictions to SED-fitting results. On semianalytic galaxies, we are nearly competitive with template-fitting methods, with biases of $0.01$ dex for stellar mass, $0.09$ dex for star formation rate, and $0.04$ dex for metallicity. For the observed CANDELS data, our results are consistent with template fits to the same data, with a $0.15$ dex bias in $M_{\rm star}$ and a $0.61$ dex bias in star formation rate. Some of the bias is driven by the limitations of SED fitting rather than of the training set, and some is intrinsic to the neural network method. Further errors likely arise from differences in the noise properties of the semianalytic catalogs and the data. Our results show that galaxy physical properties can, in principle, be measured with neural networks at an accuracy and precision competitive with template-fitting methods.
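As a minimal sketch of the workflow described above (not the authors' actual pipeline), the following illustrates the idea: perturb semianalytic catalog magnitudes with Gaussian photometric noise, train a neural network regressor on the noisy photometry, and measure the resulting bias in dex on held-out mock galaxies. The column layout, noise level, and network architecture here are illustrative assumptions, and the labels are random placeholders standing in for catalog values.

```python
# A hedged sketch of the training setup: noisy mock photometry -> neural
# network -> log stellar mass, with bias measured as a median offset in dex.
# All catalog values below are synthetic placeholders, NOT real data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical semianalytic lightcone: magnitudes in several filters plus a
# "true" log10 stellar mass for each mock galaxy (placeholder values).
n_gal, n_bands = 50_000, 8
mags = rng.uniform(20.0, 27.0, size=(n_gal, n_bands))
log_mstar = rng.normal(9.5, 0.7, size=n_gal)

# Add realistic photometric noise before training (assumed 0.05 mag scatter).
noisy_mags = mags + rng.normal(0.0, 0.05, size=mags.shape)

X_train, X_test, y_train, y_test = train_test_split(
    noisy_mags, log_mstar, test_size=0.2, random_state=0
)

# Small fully connected network; hidden-layer sizes are an assumption.
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
net.fit(X_train, y_train)

# Bias = median(predicted - true) in dex, the statistic quoted above.
pred = net.predict(X_test)
bias_dex = np.median(pred - y_test)
print(f"stellar-mass bias: {bias_dex:+.3f} dex")
```

The same loop applies to star formation rate or metallicity by swapping the label array; in practice, one network per property (or a multi-output network) would be trained on the full mock catalog before being applied to real photometry.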