From calibration to parameter learning: Harnessing the scaling effects of big data in geoscientific modeling


Abstract

The behaviors and skills of models in many geosciences, e.g., hydrology and ecosystem sciences, strongly depend on spatially varying parameters that need calibration. Here we propose a novel differentiable parameter learning (dPL) framework that frames calibration as a pattern recognition problem and learns a more robust, universal mapping from input data to model parameters. Crucially, dPL exhibits virtuous scaling curves not previously demonstrated to geoscientists: as the training data collectively increases, dPL achieves better performance, more physical coherence, and better generalization, all at orders-of-magnitude lower computational cost. We demonstrate examples of calibrating models to soil moisture and streamflow, where dPL drastically outperformed state-of-the-art evolutionary and regionalization methods, or required only ~12.5% of the training data to achieve similar performance. This generic scheme promotes the integration of deep learning and process-based models without requiring their reimplementation.
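
The core idea can be illustrated with a minimal conceptual sketch, assuming PyTorch: a neural network maps site attributes to the physical parameters of a differentiable process-based model, and the whole pipeline is trained end-to-end against observations across many sites at once. The toy "bucket" process model, the network architecture, and all names and dimensions below are hypothetical illustrations for exposition, not the paper's actual models or configuration.

```python
import torch
import torch.nn as nn

def bucket_model(forcing, params):
    """Toy differentiable process model (hypothetical linear reservoir).

    forcing: (batch, time) precipitation-like input
    params:  (batch, 2) physical parameters [storage capacity, recession rate]
    returns: (batch, time) simulated observable (e.g., a streamflow proxy)
    """
    capacity = params[:, 0:1]
    recession = params[:, 1:2]
    storage = torch.zeros_like(capacity)
    outputs = []
    for t in range(forcing.shape[1]):
        # Fill the store, cap at capacity, then release a fraction as flux.
        storage = torch.minimum(storage + forcing[:, t:t + 1], capacity)
        flux = recession * storage
        storage = storage - flux
        outputs.append(flux)
    return torch.cat(outputs, dim=1)

class ParameterNetwork(nn.Module):
    """Maps static site attributes to bounded physical parameters."""
    def __init__(self, n_attributes, n_params=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_attributes, 64), nn.ReLU(),
            nn.Linear(64, n_params), nn.Sigmoid(),  # outputs in (0, 1)
        )

    def forward(self, attributes):
        raw = self.net(attributes)
        capacity = 1.0 + 99.0 * raw[:, 0:1]    # rescale to a plausible range
        recession = 0.01 + 0.98 * raw[:, 1:2]  # rescale to (0.01, 0.99)
        return torch.cat([capacity, recession], dim=1)

# End-to-end training: gradients flow through the differentiable process model
# back into the parameter network, so a single network is fit jointly over all
# sites (the collective "big data" the abstract refers to) rather than
# calibrating each site separately.
n_sites, n_steps, n_attr = 32, 100, 8
attributes = torch.rand(n_sites, n_attr)     # synthetic site attributes
forcing = torch.rand(n_sites, n_steps)       # synthetic forcing
observations = torch.rand(n_sites, n_steps)  # synthetic observations

g = ParameterNetwork(n_attr)
optimizer = torch.optim.Adam(g.parameters(), lr=1e-3)
for epoch in range(200):
    optimizer.zero_grad()
    simulated = bucket_model(forcing, g(attributes))
    loss = torch.mean((simulated - observations) ** 2)
    loss.backward()
    optimizer.step()
```

Because the learned mapping is shared across sites, it can also be applied to ungauged or uncalibrated locations by simply feeding in their attributes, which is where the generalization benefits described above come from.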
