A promising approach to improving climate-model simulations is to replace traditional subgrid parameterizations, which are based on simplified physical models, with data-driven machine-learning algorithms. However, neural networks (NNs) often lead to instabilities and climate drift when coupled to an atmospheric model. Here we learn an NN parameterization from a high-resolution atmospheric simulation in an idealized domain by coarse-graining the model equations and output. The NN parameterization has a structure that ensures physical constraints are respected, and it leads to stable simulations that replicate the climate of the high-resolution simulation with accuracy similar to that of a successful random-forest parameterization while requiring far less memory. We find that the simulations are stable for a variety of NN architectures and horizontal resolutions, and that an NN with substantially reduced numerical precision could decrease computational costs without affecting simulation quality.
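The abstract does not spell out how the NN structure enforces physical constraints. A common way to do this, and a plausible reading of the claim, is to have the network predict subgrid vertical fluxes at level interfaces and diagnose tendencies as the vertical flux divergence, so that column-integrated quantities are conserved by construction. The sketch below illustrates that idea only; it is not the authors' code, and all names (e.g., `flux_divergence_tendency`, the toy weight matrix `W`) are hypothetical.

```python
# Illustrative sketch (assumed mechanism, not the paper's implementation):
# conservation is enforced structurally by predicting interface fluxes and
# converting them to level tendencies via a flux divergence.
import numpy as np

def flux_divergence_tendency(flux_interfaces, dp, g=9.81):
    """Map fluxes at nlev+1 interfaces to tendencies on nlev levels.
    The mass-weighted column integral of the tendency depends only on the
    top and bottom fluxes, so zero boundary fluxes conserve the column total."""
    dflux = np.diff(flux_interfaces)      # flux difference across each layer
    return -g * dflux / dp                # per-level tendency

# Toy stand-in for the trained network: any regressor from a column state
# vector to interior interface fluxes would play the same role.
rng = np.random.default_rng(0)
nlev = 48
state = rng.standard_normal(2 * nlev)     # e.g., temperature + humidity profile
W = rng.standard_normal((nlev - 1, 2 * nlev)) * 0.01
interior_fluxes = W @ state               # placeholder for the NN prediction

# Zero flux at the top and bottom interfaces -> column conservation by construction.
fluxes = np.concatenate(([0.0], interior_fluxes, [0.0]))
dp = np.full(nlev, 2000.0)                # Pa, uniform layer thickness for the toy
tendency = flux_divergence_tendency(fluxes, dp)

# Conservation check: the mass-weighted column integral of the tendency vanishes.
print(np.isclose(np.sum(tendency * dp / 9.81), 0.0))
```

Because conservation here comes from the output structure rather than from the learned weights, it holds regardless of the network architecture or numerical precision, which is consistent with the abstract's observation that reduced-precision NNs do not degrade the simulated climate.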