One of the largest sources of uncertainty in stellar models is the treatment of convection in stellar envelopes. One-dimensional stellar models often rely on the mixing-length or equivalent approximations to describe convection, all of which depend on various free parameters. There have been attempts to rectify this by using 3D radiative-hydrodynamic simulations of stellar convection and by extracting an equivalent mixing length from those simulations. In this paper we show that the entropy of the deeper, adiabatic layers in these simulations can be expressed as a simple function of $\log g$ and $\log T_{\rm eff}$, which holds potential for calibrating stellar models in a simpler and more general manner.
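To make the idea concrete, the sketch below shows what such a calibration might look like in practice: a function returning the entropy of the adiabat given $\log g$ and $\log T_{\rm eff}$. The linear functional form and the coefficients here are purely illustrative placeholders, not the fit derived in the paper, which obtains the actual relation from the 3D simulations.

```python
import numpy as np

def entropy_adiabat(log_g, log_teff, coeffs=(1.0, -0.5, 2.0)):
    """Hypothetical fit for the entropy of the deep adiabat.

    Assumes a linear form s_ad = a + b*log_g + c*log_teff; both the
    form and the coefficients are placeholders for whatever relation
    is calibrated against the 3D radiative-hydrodynamic simulations.
    """
    a, b, c = coeffs
    return a + b * log_g + c * log_teff

# Example: evaluate at roughly solar surface parameters
# (log g ~ 4.44, T_eff ~ 5772 K).
s_sun = entropy_adiabat(4.44, np.log10(5772.0))
print(s_sun)
```

In a real calibration, a 1D model's mixing-length parameter would be adjusted until the entropy of its deep convective layers matched the value returned by such a fit at the model's $\log g$ and $T_{\rm eff}$.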