
A data-driven method for the stochastic parametrisation of subgrid-scale tropical convective area fraction

Posted by: Georg Gottwald A.
Publication date: 2015
Research field: Physics
Paper language: English





Observations of tropical convection from precipitation radar and the concurrent large-scale atmospheric state at two locations (Darwin and Kwajalein) are used to establish effective stochastic models to parameterise subgrid-scale tropical convective activity. Two approaches are presented which rely on the assumption that tropical convection induces a stationary equilibrium distribution. In the first approach we parameterise convection variables such as convective area fraction as an instantaneous random realisation conditioned on the large-scale vertical velocities, according to a probability density function estimated from the observations. In the second approach convection variables are generated by a Markov process conditioned on the large-scale vertical velocity, allowing for non-trivial temporal correlations. Despite the different prevalent atmospheric and oceanic regimes at the two locations, with Kwajalein being exposed to a purely oceanic weather regime and Darwin exhibiting land-sea interaction, we establish that the empirical measures for the convective variables conditioned on large-scale mid-level vertical velocities at the two locations are close. This allows us to train the stochastic models at one location and then generate time series of convective activity at the other location. The proposed stochastic subgrid-scale models adequately reproduce the statistics of the observed convective variables, and we discuss how they may be used in future scale-independent mass-flux convection parameterisations.
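The first approach described in the abstract amounts to binning the large-scale vertical velocity, collecting the empirical distribution of convective area fraction in each bin, and drawing instantaneous realisations from it. A minimal sketch of that idea, with synthetic data standing in for the radar observations (the variable names and the toy relation between omega and sigma are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: large-scale mid-level vertical velocity (omega)
# and observed convective area fraction (sigma), standing in for the radar data.
omega = rng.normal(0.0, 1.0, 10_000)
sigma = np.clip(0.1 * np.exp(omega) * rng.lognormal(0.0, 0.3, omega.size), 0.0, 1.0)

# Discretise omega into decile bins and collect the empirical distribution
# of sigma conditioned on each bin (approach 1 in the abstract).
edges = np.quantile(omega, np.linspace(0, 1, 11))
bin_of = np.clip(np.searchsorted(edges, omega, side="right") - 1, 0, 9)
conditional = {b: sigma[bin_of == b] for b in range(10)}

def sample_area_fraction(omega_now: float) -> float:
    """Draw an instantaneous realisation of sigma conditioned on omega."""
    b = int(np.clip(np.searchsorted(edges, omega_now, side="right") - 1, 0, 9))
    return float(rng.choice(conditional[b]))

# Usage: generate a short time series of subgrid convective area fraction
series = [sample_area_fraction(w) for w in rng.normal(0.0, 1.0, 5)]
```

The second approach would replace the independent draw inside `sample_area_fraction` with a transition matrix estimated per velocity bin, so that consecutive samples carry temporal correlation.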



قيم البحث

Read also

Stochastic parametrisations are used in weather and climate models to improve the representation of unpredictable unresolved processes. When compared to a deterministic model, a stochastic model represents 'model uncertainty', i.e., sources of error in the forecast due to the limitations of the forecast model. We present a technique for systematically deriving new stochastic parametrisations or for constraining existing stochastic approaches. A high-resolution model simulation is coarse-grained to the desired forecast model resolution. This provides the initial conditions and forcing data needed to drive a Single Column Model (SCM). By comparing the SCM parametrised tendencies with the evolution of the high-resolution model, we can estimate the error in the SCM tendencies that a stochastic parametrisation seeks to represent. We use this approach to assess the physical basis of the widely used Stochastically Perturbed Parametrisation Tendencies (SPPT) scheme. We find justification for the multiplicative nature of SPPT, and for the use of spatio-temporally correlated stochastic perturbations. We find evidence that the stochastic perturbation should be positively skewed, indicating that occasional large-magnitude positive perturbations are physically realistic. However, other key assumptions of SPPT are less well justified, including coherency of the stochastic perturbations with height, coherency of the perturbations for different physical parametrisation schemes, and coherency for different prognostic variables. Relaxing these SPPT assumptions allows for an error model that explains a larger fractional variance than traditional SPPT. In particular, we suggest that independently perturbing the tendencies associated with different parametrisation schemes is justifiable, and would improve the realism of the SPPT approach.
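The multiplicative, temporally correlated perturbation that SPPT applies can be illustrated with a toy one-point sketch: the parametrised tendency is scaled by (1 + e), where e evolves as a first-order autoregressive process. This is an assumed minimal setup for illustration, not the operational ECMWF implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

phi = 0.95          # temporal autocorrelation of e over one time step
std = 0.2           # target standard deviation of the perturbation field e
e = 0.0
tendency = 1.5      # parametrised tendency at one grid point (arbitrary units)

perturbed = []
for _ in range(1000):
    # AR(1) update keeps Var(e) ~ std**2 in equilibrium, giving the
    # temporally correlated perturbations the abstract finds justified
    e = phi * e + np.sqrt(1.0 - phi**2) * std * rng.normal()
    perturbed.append((1.0 + e) * tendency)   # multiplicative perturbation

print(np.mean(perturbed))  # close to the unperturbed tendency on average
```

A positively skewed distribution for e, as the abstract suggests, could be obtained by replacing the Gaussian increment with, e.g., a shifted lognormal draw.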
Production in an economy is a set of firms' activities as suppliers and customers: a firm buys goods from other firms, adds value, and sells products to others in a giant network of production. Empirical study has been lacking, despite the fact that the structure of the production network is important for understanding and modelling many aspects of economic dynamics. We study a nation-wide production network comprising a million firms and millions of supplier-customer links, using recent statistical methods developed in physics. Our empirical analysis shows a scale-free degree distribution, disassortativity, correlation of degree with firm size, and a community structure with sectoral and regional modules. Since suppliers usually provide credit to their customers, who supply it to theirs in turn, each link is actually a creditor-debtor relationship. We also study chains of failures or bankruptcies that take place along those links in the network, and the corresponding avalanche-size distribution.
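A scale-free degree distribution of the kind reported here can be illustrated with a toy supplier-customer network grown by preferential attachment; this is a hypothetical stand-in for the nation-wide firm data, not the study's dataset:

```python
from collections import Counter
import random

random.seed(42)

# Grow a directed supplier->customer graph: each new firm attaches to an
# existing firm with probability proportional to its degree, which produces
# a heavy-tailed (scale-free-like) in-degree distribution.
edges = [(0, 1)]
targets = [0, 1]                        # nodes listed once per incident edge
for new_firm in range(2, 2000):
    customer = random.choice(targets)   # preferential attachment by degree
    edges.append((new_firm, customer))  # new firm supplies an existing one
    targets.extend([new_firm, customer])

in_degree = Counter(customer for _, customer in edges)
# Heavy tail: a few hub firms accumulate many suppliers
hubs = [n for n, d in in_degree.items() if d >= 20]
print(len(hubs), max(in_degree.values()))
```

On the empirical data one would instead measure the in/out-degree histogram directly and fit its tail, rather than assuming a growth mechanism.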
A stochastic subgrid-scale parameterization based on Ruelle's response theory and proposed in Wouters and Lucarini [2012] is tested in the context of a low-order coupled ocean-atmosphere model for which a part of the atmospheric modes are considered as unresolved. A natural separation of the phase space into an invariant set and its complement allows for an analytical derivation of the different terms involved in the parameterization, namely the average, the fluctuation and the long-memory terms. In this case, the fluctuation term is an additive stochastic noise. Its application to the low-order system reveals that a considerable correction of the low-frequency variability along the invariant subset can be obtained, provided that the coupling is sufficiently weak. This new approach to scale separation opens new avenues for subgrid-scale parameterizations in multiscale systems used for climate forecasts.
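The structure of such a parameterization can be sketched on a scalar toy model: the coupling to the unresolved modes is replaced by its average (the average term) plus additive red noise (the fluctuation term). This is a hypothetical illustration of the decomposition, with the long-memory term omitted, which is admissible only in the weak-coupling regime the abstract discusses:

```python
import numpy as np

rng = np.random.default_rng(2)

eps = 0.1            # coupling strength (weak)
dt = 0.01
mean_coupling = 0.0  # average term, assumed known from the unresolved dynamics
phi = 0.9            # red-noise memory of the fluctuation term

x, eta = 1.0, 0.0
traj = []
for _ in range(5000):
    # fluctuation term: additive AR(1) (red) noise with unit variance
    eta = phi * eta + np.sqrt(1.0 - phi**2) * rng.normal()
    # parametrised resolved equation: damping + average + fluctuation terms
    drift = -x + eps * (mean_coupling + eta)
    x += dt * drift
    traj.append(x)

print(np.mean(traj[1000:]))  # relaxes near zero with a small stochastic spread
```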
This paper demonstrates the efficacy of data-driven localization mappings for assimilating satellite-like observations in a dynamical system of intermediate complexity. In particular, a sparse network of synthetic brightness temperature measurements is simulated using an idealized radiative transfer model and assimilated to the monsoon-Hadley multicloud model, a nonlinear stochastic model containing several thousands of model coordinates. A serial ensemble Kalman filter is implemented in which the empirical correlation statistics are improved using localization maps obtained from a supervised learning algorithm. The impact of the localization mappings is assessed in perfect model observing system simulation experiments (OSSEs) as well as in the presence of model errors resulting from the misspecification of key convective closure parameters. In perfect model OSSEs, the localization mappings that use adjacent correlations to improve the correlation estimated from small ensemble sizes produce robust, accurate analysis estimates. In the presence of model error, the filter skills of the localization maps trained on perfect and imperfect model data are comparable.
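The serial ensemble Kalman update with a localisation weight on the ensemble covariance can be sketched as follows; the distance-based taper here is an assumed stand-in for the supervised-learning localisation map the paper trains, and the 40-variable state is a toy:

```python
import numpy as np

rng = np.random.default_rng(3)

n_state, n_ens = 40, 10
truth = np.sin(np.linspace(0, 2 * np.pi, n_state))
ens = truth[:, None] + rng.normal(0, 0.5, (n_state, n_ens))  # prior ensemble

obs_idx, obs_err = 20, 0.1
y = truth[obs_idx] + rng.normal(0, obs_err)   # one scalar observation

# Distance-based taper stands in for the learned localisation map
dist = np.abs(np.arange(n_state) - obs_idx)
loc = np.exp(-((dist / 5.0) ** 2))

hx = ens[obs_idx].copy()                      # prior observed-variable ensemble
prior_var = hx.var(ddof=1)
cov_xy = (ens - ens.mean(1, keepdims=True)) @ (hx - hx.mean()) / (n_ens - 1)
gain = loc * cov_xy / (prior_var + obs_err**2)   # localised Kalman gain
ens += gain[:, None] * (y - hx)               # serial update, one observation
```

With several observations, the same update is applied one observation at a time, which is what makes the filter "serial".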
We study the cluster dynamics of multichannel (multivariate) time series by representing their correlations as time-dependent networks and investigating the evolution of network communities. We employ a node-centric approach that allows us to track the effects of the community evolution on the functional roles of individual nodes without having to track entire communities. As an example, we consider a foreign exchange market network in which each node represents an exchange rate and each edge represents a time-dependent correlation between the rates. We study the period 2005-2008, which includes the recent credit and liquidity crisis. Using dynamical community detection, we find that exchange rates that are strongly attached to their community are persistently grouped with the same set of rates, whereas exchange rates that are important for the transfer of information tend to be positioned on the edges of communities. Our analysis successfully uncovers major trading changes that occurred in the market during the credit crisis.
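The first step of such an analysis, building a sequence of correlation networks from rolling windows of a multivariate series, can be sketched with synthetic "exchange rate" returns; the window length, step, and threshold are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic returns for 6 series sharing a common factor, so that
# pairwise correlations are nonzero on average.
n_series, n_steps, window = 6, 500, 100
common = rng.normal(0, 1, n_steps)
returns = 0.5 * common[:, None] + rng.normal(0, 1, (n_steps, n_series))

# Slide a window along the series; in each window, connect pairs whose
# sample correlation exceeds a threshold. The result is a time-dependent
# network on which community detection can then be run.
networks = []
for t in range(window, n_steps, 50):
    corr = np.corrcoef(returns[t - window:t].T)   # window correlation matrix
    edges = {(i, j) for i in range(n_series)
             for j in range(i + 1, n_series) if abs(corr[i, j]) > 0.15}
    networks.append(edges)

print(len(networks), len(networks[0]))
```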