
Towards Physically-consistent, Data-driven Models of Convection

Added by Tom Beucler
Publication date: 2020
Language: English





Data-driven algorithms, in particular neural networks, can emulate the effect of sub-grid scale processes in coarse-resolution climate models if trained on high-resolution climate simulations. However, they may violate key physical constraints and lack the ability to generalize outside of their training set. Here, we show that physical constraints can be enforced in neural networks, either approximately by adapting the loss function or to within machine precision by adapting the architecture. As these physical constraints are insufficient to guarantee generalizability, we additionally propose to physically rescale the training and validation data to improve the ability of neural networks to generalize to unseen climates.
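
The first option described above, enforcing constraints approximately by adapting the loss function, can be sketched as a soft penalty term. The constraint matrix `C`, the weight `alpha`, and the toy two-output constraint below are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def constrained_loss(y_pred, y_true, C, alpha=1.0):
    """Mean-squared error plus a soft penalty on violations of the
    linear physical constraint C @ y = 0 (e.g. conservation of a
    column-integrated quantity). alpha weights the penalty term."""
    mse = np.mean((y_pred - y_true) ** 2)
    violation = C @ y_pred.T              # constraint residual per sample
    return mse + alpha * np.mean(violation ** 2)

# Toy constraint: the two outputs must sum to zero (C = [[1, 1]]).
C = np.array([[1.0, 1.0]])
y_true = np.array([[0.5, -0.5]])
y_ok = np.array([[0.4, -0.4]])   # small error, constraint respected
y_bad = np.array([[0.4, 0.4]])   # constraint violated, so loss is penalized
```

Because the penalty only discourages violations rather than forbidding them, this route enforces the constraint approximately, which is why the abstract contrasts it with the architecture route.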



Related research

A promising approach to improve climate-model simulations is to replace traditional subgrid parameterizations based on simplified physical models by machine learning algorithms that are data-driven. However, neural networks (NNs) often lead to instabilities and climate drift when coupled to an atmospheric model. Here we learn an NN parameterization from a high-resolution atmospheric simulation in an idealized domain by coarse graining the model equations and output. The NN parameterization has a structure that ensures physical constraints are respected, and it leads to stable simulations that replicate the climate of the high-resolution simulation with similar accuracy to a successful random-forest parameterization while needing far less memory. We find that the simulations are stable for a variety of NN architectures and horizontal resolutions, and that an NN with substantially reduced numerical precision could decrease computational costs without affecting the quality of simulations.
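
The abstract's closing point, that substantially reduced numerical precision need not degrade the emulator, can be illustrated by running a toy forward pass at float64 and float16 and comparing outputs. The layer sizes and random weights are arbitrary stand-ins, not the paper's network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny dense network standing in for the NN parameterization.
W1 = rng.normal(0, 0.1, (16, 8))
W2 = rng.normal(0, 0.1, (4, 16))

def net(x, dtype=np.float64):
    """Forward pass of the toy network at a chosen floating-point precision."""
    h = np.maximum(W1.astype(dtype) @ x.astype(dtype), 0)  # ReLU hidden layer
    return (W2.astype(dtype) @ h).astype(np.float64)

x = rng.normal(0, 1, 8)
full = net(x)                 # float64 reference output
half = net(x, np.float16)     # substantially reduced precision
rel_err = np.max(np.abs(full - half)) / np.max(np.abs(full))
```

For a network this small the relative discrepancy stays at the sub-percent level, which is the kind of observation that motivates trading precision for memory and compute.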
S. Raia, M. Alvioli, M. Rossi (2013)
Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering, and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the Transient Rainfall Infiltration and Grid-Based Regional Slope-Stability Analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. [..]
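
The random-sampling idea behind TRIGRS-P can be sketched with a simplified, dry-case infinite-slope factor-of-safety formula. The distributions, parameter values, and the formula itself are illustrative assumptions for this sketch, not the actual TRIGRS-P equations, which include transient pore-pressure effects:

```python
import numpy as np

rng = np.random.default_rng(42)

def factor_of_safety(c, phi, gamma=19e3, z=2.0, beta=np.radians(35)):
    """Infinite-slope factor of safety, dry case for simplicity:
    a cohesion term plus a friction term."""
    return c / (gamma * z * np.sin(beta) * np.cos(beta)) + np.tan(phi) / np.tan(beta)

# Instead of one fixed value per soil property, draw many realizations
# from assumed probability distributions (the Monte Carlo idea).
n = 10_000
c = rng.normal(5e3, 1e3, n)             # cohesion [Pa], assumed distribution
phi = np.radians(rng.normal(30, 3, n))  # friction angle [rad], assumed distribution
fs = factor_of_safety(c, phi)
p_failure = np.mean(fs < 1.0)           # fraction of unstable realizations
```

Applied cell by cell over a grid, the fraction of realizations with FS < 1 turns a single deterministic stability verdict into a probability of failure.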
Artificial neural-networks have the potential to emulate cloud processes with higher accuracy than the semi-empirical emulators currently used in climate models. However, neural-network models do not intrinsically conserve energy and mass, which is an obstacle to using them for long-term climate predictions. Here, we propose two methods to enforce linear conservation laws in neural-network emulators of physical models: Constraining (1) the loss function or (2) the architecture of the network itself. Applied to the emulation of explicitly-resolved cloud processes in a prototype multi-scale climate model, we show that architecture constraints can enforce conservation laws to satisfactory numerical precision, while all constraints help the neural-network better generalize to conditions outside of its training set, such as global warming.
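
The architecture option, enforcing linear conservation laws within the network itself, can be sketched as a final layer that projects the raw output onto the null space of the constraint matrix. The specific three-output constraint below is a toy assumption, and projection is only one way to build such a layer:

```python
import numpy as np

def conservation_layer(y, C):
    """Project the raw network output y onto the null space of C, so the
    linear conservation laws C @ y' = 0 hold to machine precision
    regardless of the weights upstream."""
    correction = C.T @ np.linalg.solve(C @ C.T, C @ y)
    return y - correction

C = np.array([[1.0, 1.0, 1.0]])      # toy law: the three outputs must sum to zero
y_raw = np.array([0.2, 0.5, -0.1])   # raw output violates the constraint
y = conservation_layer(y_raw, C)     # C @ y is now ~0 to machine precision
```

Unlike a loss penalty, the projection holds for any input, including conditions outside the training set, which is why hard architecture constraints pair naturally with the generalization concern raised above.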
In recent years, there has been growing interest in using Precipitable Water Vapor (PWV) derived from Global Positioning System (GPS) signal delays to predict rainfall. However, the occurrence of rainfall is dependent on a myriad of atmospheric parameters. This paper proposes a systematic approach to analyze various parameters that affect precipitation in the atmosphere. Different ground-based weather features like Temperature, Relative Humidity, Dew Point, Solar Radiation, PWV along with Seasonal and Diurnal variables are identified, and a detailed feature correlation study is presented. While all features play a significant role in rainfall classification, only a few of them, such as PWV, Solar Radiation, Seasonal and Diurnal features, stand out for rainfall prediction. Based on these findings, an optimal set of features is used in a data-driven machine learning algorithm for rainfall prediction. The experimental evaluation using a four-year (2012-2015) database shows a true detection rate of 80.4%, a false alarm rate of 20.3%, and an overall accuracy of 79.6%. Compared to the existing literature, our method significantly reduces the false alarm rate.
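
The feature-based classification step can be sketched with synthetic stand-ins for two of the selected features (PWV and solar radiation) and plain logistic regression. The data-generating rule and every value below are invented for illustration and bear no relation to the paper's four-year database or its actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two selected features and a binary rain label.
n = 500
pwv = rng.normal(40, 10, n)        # precipitable water vapor [mm], invented
solar = rng.normal(400, 150, n)    # solar radiation [W/m^2], invented
rain = (pwv + rng.normal(0, 5, n) > 45).astype(float)  # toy labeling rule

def standardize(v):
    """Zero-mean, unit-variance scaling so both features train comparably."""
    return (v - v.mean()) / v.std()

X = np.column_stack([np.ones(n), standardize(pwv), standardize(solar)])
w = np.zeros(3)

# Plain logistic regression fit by gradient descent.
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - rain) / n

pred = (1 / (1 + np.exp(-X @ w)) > 0.5).astype(float)
accuracy = np.mean(pred == rain)
```

Since the toy labels depend mostly on PWV, the fitted weights end up dominated by that feature, mirroring the paper's finding that a small subset of features carries most of the predictive signal.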
A formulation is developed to assimilate ocean-wave data into the Numerical Flow Analysis (NFA) code. NFA is a Cartesian-based implicit Large-Eddy Simulation (LES) code with Volume of Fluid (VOF) interface capturing. The sequential assimilation of data into NFA permits detailed analysis of ocean-wave physics with higher bandwidths than is possible using either other formulations, such as High-Order Spectral (HOS) methods, or field measurements. A framework is provided for assimilating the wavy and vortical portions of the flow. Nudging is used to assimilate wave data at low wavenumbers, and the wave data at high wavenumbers form naturally through nonlinear interactions, wave breaking, and wind forcing. Similarly, the vertical profiles of the mean vortical flow in the wind and the wind drift are nudged, and the turbulent fluctuations are allowed to form naturally. As a demonstration, the results of a HOS of a JONSWAP wave spectrum are assimilated to study short-crested seas in equilibrium with the wind. Log profiles are assimilated for the mean wind and the mean wind drift. The results of the data assimilations are (1) Windrows form under the action of breaking waves and the formation of swirling jets; (2) The crosswind and cross drift meander; (3) Swirling jets are organized into Langmuir cells in the upper oceanic boundary layer; (4) Swirling jets are organized into wind streaks in the lower atmospheric boundary layer; (5) The length and time scales of the Langmuir cells and the wind streaks increase away from the free surface; (6) Wave growth is very dynamic especially for breaking waves; (7) The effects of the turbulent fluctuations in the upper ocean on wave growth need to be considered together with the turbulent fluctuations in the lower atmosphere; and (8) Extreme events are most likely when waves are not in equilibrium.
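
The low-wavenumber nudging described above can be sketched in one dimension with an FFT filter: only wavenumbers below a cutoff are relaxed toward the observations, while higher wavenumbers are left free. The cutoff, nudging strength, and wave amplitudes are arbitrary illustrative choices, not values from the NFA setup:

```python
import numpy as np

def nudge_low_wavenumbers(state, obs, k_cut, strength=0.1):
    """Relax only the wavenumbers k <= k_cut of `state` toward `obs`;
    higher wavenumbers are untouched and free to evolve on their own."""
    s_hat = np.fft.rfft(state)
    o_hat = np.fft.rfft(obs)
    mask = np.arange(s_hat.size) <= k_cut
    s_hat[mask] += strength * (o_hat[mask] - s_hat[mask])
    return np.fft.irfft(s_hat, n=state.size)

x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
obs = np.sin(x)                                  # observed long wave (k = 1)
state = 0.5 * np.sin(x) + 0.2 * np.sin(10 * x)   # model: weak long wave + short wave
nudged = nudge_low_wavenumbers(state, obs, k_cut=4)
```

After one nudging step the k = 1 amplitude moves from 0.5 toward the observed 1.0 (to 0.55 at strength 0.1), while the k = 10 component is unchanged, which is the intended division of labor between assimilated and naturally forming scales.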
