Over the last few years, machine learning parameterizations have emerged as a potential way to improve the representation of sub-grid processes in Earth System Models (ESMs). So far, all studies have been based on the same three-step approach: first, a training dataset was created from a high-resolution simulation; then, a machine learning algorithm was fitted to this dataset; and finally, the trained algorithm was implemented in the ESM. The resulting online simulations were frequently plagued by instabilities and biases. Here, coupled online learning is proposed as a way to combat these issues. Coupled learning can be seen as a second training stage in which the pretrained machine learning parameterization, specifically a neural network, is run in parallel with a high-resolution simulation. The high-resolution simulation is kept in sync with the neural network-driven ESM through constant nudging. This enables the neural network to learn from the tendencies that the high-resolution simulation would produce if it experienced the states the neural network creates. The concept is illustrated using the Lorenz 96 model, where coupled learning is able to recover the true parameterizations. Further, detailed algorithms for the implementation of coupled learning in 3D cloud-resolving models and the superparameterization framework are presented. Finally, outstanding challenges and issues not resolved by this approach are discussed.
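To make the coupled learning loop concrete, the following is a minimal, hedged sketch in Python using the two-level Lorenz 96 system and scikit-learn's MLPRegressor as a stand-in for the pretrained neural network parameterization. All parameter values (K, J, F, h, b, c, the time step, and the nudging timescale tau_nudge), the warm-start step, and the forward-Euler integration are illustrative assumptions, not the exact configuration used in the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Two-level Lorenz 96 configuration (illustrative values, not the paper's exact setup).
K, J = 8, 32                  # number of large-scale X variables and small-scale Y per X
F, h, b, c = 20.0, 1.0, 10.0, 10.0
dt, tau_nudge = 0.001, 0.1    # Euler time step and nudging relaxation time (assumed)

def dXdt(X, coupling):
    """Large-scale tendency; `coupling` is the sub-grid forcing acting on X."""
    return np.roll(X, 1) * (np.roll(X, -1) - np.roll(X, 2)) - X + F + coupling

def dYdt(Y, X):
    """Small-scale tendency, forced by the parent X variables."""
    return (-c * b * np.roll(Y, -1) * (np.roll(Y, -2) - np.roll(Y, 1))
            - c * Y + (h * c / b) * np.repeat(X, J))

def true_subgrid(Y):
    """Sub-grid tendency the high-resolution (two-level) model exerts on X."""
    return -(h * c / b) * Y.reshape(K, J).sum(axis=1)

# Stand-in for the offline-pretrained neural network parameterization.
rng = np.random.default_rng(0)
nn = MLPRegressor(hidden_layer_sizes=(32,))
nn.partial_fit(rng.standard_normal((64, K)), -np.ones((64, K)))  # crude warm start

# Initial states: NN-driven low-resolution model and high-resolution reference.
X_nn = F + rng.standard_normal(K)
X_hr = X_nn.copy()
Y_hr = 0.1 * rng.standard_normal(K * J)

for step in range(10_000):
    # 1) Step the NN-driven model with the learned sub-grid term.
    sub_nn = nn.predict(X_nn[None, :])[0]
    X_nn = X_nn + dt * dXdt(X_nn, sub_nn)

    # 2) Step the high-resolution model while nudging its X state towards the
    #    NN-driven state, keeping the two trajectories in sync.
    sub_hr = true_subgrid(Y_hr)
    X_hr = X_hr + dt * (dXdt(X_hr, sub_hr) + (X_nn - X_hr) / tau_nudge)
    Y_hr = Y_hr + dt * dYdt(Y_hr, X_hr)

    # 3) Coupled (online) learning step: train on the state produced by the NN
    #    and the sub-grid tendency the nudged high-resolution model generates.
    nn.partial_fit(X_nn[None, :], sub_hr[None, :])
```

The essential element is step 2: because the high-resolution model is continuously relaxed towards the neural-network-driven state, the sub-grid tendencies it produces become valid training targets for the very states the network itself generates, which is what the abstract describes as learning from the tendencies the high-resolution simulation would produce if it experienced the neural network's states.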