
Resolution dependency of sinking Lagrangian particles in ocean general circulation models

Added by Peter Nooteboom
Publication date: 2020
Field: Physics
Language: English





Any type of non-buoyant material in the ocean is transported horizontally by currents during its sinking journey. This lateral transport can be far from negligible for small sinking velocities. To estimate its magnitude and direction, the material is often modelled as a set of Lagrangian particles advected by current velocities obtained from Ocean General Circulation Models (OGCMs). State-of-the-art OGCMs are strongly eddying, similar to the real ocean, providing results with a spatial resolution on the order of 10 km at daily frequency. While the importance of eddies in OGCMs is well appreciated in the physical oceanographic community, this awareness may be less widespread in other marine research communities. To demonstrate how much the absence of mesoscale features in low-resolution models influences Lagrangian particle transport, we simulate the transport of sinking Lagrangian particles using low- and high-resolution global OGCMs, and assess the lateral transport differences resulting from the difference in spatial and temporal model resolution. We find major differences between the transport in the non-eddying OGCM and in the eddying OGCM. Adding stochastic noise to the particle trajectories in the non-eddying OGCM parameterises the effect of eddies well in some cases. The effect of a coarser temporal resolution (5-daily) is smaller than that of a coarser spatial resolution (0.1$^\circ$ versus 1$^\circ$ horizontally). We recommend using sinking Lagrangian particles, representing e.g. marine snow, microplankton or sinking plastic, only with velocity fields from eddying OGCMs, which requires high-resolution models in e.g. paleoceanographic studies. To increase the accessibility of our particle trace simulations, we have launched planktondrift.science.uu.nl, an online tool to reconstruct the surface origin of sedimentary particles at a specific location.
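For readers unfamiliar with the method, the sketch below illustrates the two ingredients discussed above: particles that sink at a fixed speed while being advected horizontally, and an optional random-walk term that parameterises unresolved eddies as stochastic noise. This is a minimal, self-contained illustration in plain NumPy, not the authors' simulation code; all function names and parameter values are illustrative.

```python
import numpy as np

def advect_sinking_particles(u, v, w_sink, x0, y0, z0, dt, n_steps,
                             kh=0.0, rng=None):
    """Forward-Euler advection of sinking Lagrangian particles.

    u, v   : callables (x, y, z, t) -> horizontal velocity [m/s]
    w_sink : constant sinking speed [m/s], positive downward
    kh     : horizontal eddy diffusivity [m^2/s]; kh > 0 switches on the
             random-walk parameterisation of unresolved eddies
    """
    if rng is None:
        rng = np.random.default_rng(0)
    x = np.atleast_1d(np.array(x0, dtype=float))
    y = np.atleast_1d(np.array(y0, dtype=float))
    z = np.atleast_1d(np.array(z0, dtype=float))
    traj = [np.stack([x, y, z], axis=-1)]
    for step in range(n_steps):
        t = step * dt
        ui, vi = u(x, y, z, t), v(x, y, z, t)
        x = x + ui * dt
        y = y + vi * dt
        z = z + w_sink * dt                  # particles sink while drifting
        if kh > 0.0:
            sigma = np.sqrt(2.0 * kh * dt)   # Brownian-walk displacement std
            x = x + sigma * rng.standard_normal(x.shape)
            y = y + sigma * rng.standard_normal(y.shape)
        traj.append(np.stack([x, y, z], axis=-1))
    return np.array(traj)                    # shape (n_steps+1, n_particles, 3)

# usage: a solid-body gyre with a 200 m/day sinking speed, 30-day run
gyre_u = lambda x, y, z, t: -1e-5 * y
gyre_v = lambda x, y, z, t: 1e-5 * x
paths = advect_sinking_particles(gyre_u, gyre_v, w_sink=200.0 / 86400.0,
                                 x0=[1e4], y0=[0.0], z0=[0.0],
                                 dt=3600.0, n_steps=24 * 30, kh=100.0)
```

With small `w_sink`, particles spend long enough in the water column for the horizontal displacement, and for the spread added by `kh`, to become substantial, which is the effect the study quantifies.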



Related research

Dissolved manganese (Mn) is a biologically essential element, and its oxidised form is involved in the removal of trace elements from ocean waters. Recently, a large number of highly accurate Mn measurements have been obtained in the Atlantic, Indian and Arctic Oceans as part of the GEOTRACES programme. The goal of this study is to combine these new observations with state-of-the-art modelling to give new insights into the main sources and redistribution of Mn throughout the ocean. To this end, we simulate the distribution of dissolved Mn using a global-scale circulation model. This first model includes simple parameterisations to account realistically for the sources, processes and sinks of Mn in the ocean. Oxidation, (photo)reduction, aggregation and settling are parameterised in the model, but biological uptake is not yet taken into account. Our model reproduces the observations accurately and provides the following insights:
- The high surface concentrations of manganese are caused by the combination of photoreduction and sources to the upper ocean. The most important source is dust, followed by sediments and, more locally, rivers.
- Surface Mn in the Atlantic Ocean moves downwards into the North Atlantic Deep Water, but because of strong removal rates the Mn does not propagate southwards.
- A largely homogeneous background concentration of dissolved Mn of about 0.10 to 0.15 nM exists throughout most of the deep ocean. The model reproduces this by means of a threshold on manganese oxides of 25 pM, suggesting that a minimal concentration of Mn is needed before aggregation and removal become efficient.
- The observed sharp hydrothermal signals are reproduced by assuming both a high source and strong removal of Mn near hydrothermal vents.
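The threshold mechanism from the third insight can be sketched as a two-box model: dissolved Mn is oxidised to oxides, oxides are photoreduced back, and aggregation/settling only removes oxides above 25 pM. The rate constants and source strength below are illustrative placeholders, not the tuned parameters of the paper's model.

```python
def mn_box_model(n_steps=20000, dt=3600.0):
    """Toy box model of dissolved Mn (dmn) and Mn oxides (mnox), in nM.

    Aggregation and settling act only on the oxide concentration in
    excess of the 25 pM (= 0.025 nM) threshold, so a background of
    dissolved Mn survives in the deep ocean.
    """
    k_ox, k_red, k_agg = 1e-7, 5e-7, 2e-6   # rate constants [1/s], illustrative
    source = 1e-9                           # dust + sediment + river input [nM/s]
    threshold = 0.025                       # oxide threshold [nM]
    dmn, mnox = 0.0, 0.0
    for _ in range(n_steps):
        oxidation = k_ox * dmn
        reduction = k_red * mnox            # (photo)reduction back to dissolved Mn
        settling = k_agg * max(mnox - threshold, 0.0)
        dmn += (source - oxidation + reduction) * dt
        mnox += (oxidation - reduction - settling) * dt
    return dmn, mnox

print(mn_box_model())  # approaches a nonzero dissolved-Mn background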
S.V. Prants (2015)
The dynamical systems theory approach has been used successfully in physical oceanography for the last two decades to study mixing and transport of water masses in the ocean. The basic theoretical ideas have been borrowed from the phenomenon of chaotic advection in fluids, an analogue of dynamical Hamiltonian chaos in mechanics. The starting point for the analysis is a velocity field, however it is obtained. Motivated by successful applications of this approach to simplified analytic models of geophysical fluid flows, researchers now work with satellite-derived velocity fields and with the output of sophisticated numerical models of ocean circulation. This review gives an introduction to some of the basic concepts and methods used to study chaotic mixing and transport in the ocean, and a brief overview of recent results, including practical applications of Lagrangian tools to monitor the spreading of Fukushima-derived radionuclides in the ocean.
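The periodically forced double gyre is a standard analytic flow for seeing chaotic advection at work: tracers released almost on top of each other separate rapidly. The sketch below uses conventional parameter values for this benchmark flow; it is an illustration of the concept, not code from the review.

```python
import numpy as np

def double_gyre_velocity(x, y, t, A=0.1, eps=0.25, omega=2 * np.pi / 10):
    """Velocity of the time-periodic double-gyre flow on [0,2]x[0,1]."""
    s = eps * np.sin(omega * t)
    f = s * x**2 + (1 - 2 * s) * x
    dfdx = 2 * s * x + (1 - 2 * s)
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

def advect(x, y, t0, t1, dt=0.01):
    """Euler integration of one tracer (an RK4 scheme is used in practice)."""
    t = t0
    while t < t1:
        u, v = double_gyre_velocity(x, y, t)
        x, y = x + u * dt, y + v * dt
        t += dt
    return x, y

# two tracers released 1e-6 apart separate by orders of magnitude more:
# exponential divergence of nearby trajectories, the hallmark of chaos
x1, y1 = advect(1.0, 0.5, 0.0, 20.0)
x2, y2 = advect(1.0 + 1e-6, 0.5, 0.0, 20.0)
print(np.hypot(x2 - x1, y2 - y1))
```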
As Ocean General Circulation Models (OGCMs) move into the petascale age, where the output from global high-resolution model runs can be of the order of hundreds of terabytes in size, tools to analyse the output of these models will need to scale up too. Lagrangian Ocean Analysis, where virtual particles are tracked through hydrodynamic fields, is an increasingly popular way to analyse OGCM output, by mapping pathways and connectivity of biotic and abiotic particulates. However, the current software stack of Lagrangian Ocean Analysis codes is not dynamic enough to cope with the increasing complexity, scale and need for customisation of use-cases. Furthermore, most community codes are developed for stand-alone use, making it a nontrivial task to integrate virtual particles at runtime of the OGCM. Here, we introduce the new Parcels code, which was designed from the ground up to be sufficiently scalable to cope with petascale computing. We highlight its API design that combines flexibility and customisation with the ability to optimise for HPC workflows, following the paradigm of domain-specific languages. Parcels is primarily written in Python, utilising the wide range of tools available in the scientific Python ecosystem, while generating low-level C-code and using Just-In-Time compilation for performance-critical computation. We show a worked-out example of its API, and validate the accuracy of the code against seven idealised test cases. This version 0.9 of Parcels is focussed on laying out the API, with future work concentrating on optimisation, efficiency and at-runtime coupling with OGCMs.
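The core design idea, in the domain-specific-language spirit the abstract describes, is that users write a small per-particle "kernel" and the framework applies it to every particle each timestep (Parcels itself JIT-compiles such kernels to C). The sketch below illustrates that pattern in plain Python; the names are illustrative and do not reproduce the actual Parcels API.

```python
# Kernel pattern: a user function acting on one particle at a time.
def advection_euler(p, fields, dt):
    """User kernel: forward-Euler advection of a single particle."""
    u = fields["u"](p["lon"], p["lat"])
    v = fields["v"](p["lon"], p["lat"])
    p["lon"] += u * dt
    p["lat"] += v * dt

def execute(particles, fields, kernel, runtime, dt):
    """Framework loop: apply the kernel to every particle each timestep."""
    for _ in range(int(runtime / dt)):
        for p in particles:
            kernel(p, fields, dt)

# usage: a uniform 0.1 m/s eastward current acting on two particles
fields = {"u": lambda lon, lat: 0.1, "v": lambda lon, lat: 0.0}
particles = [{"lon": 0.0, "lat": 0.0}, {"lon": 1.0, "lat": 0.0}]
execute(particles, fields, advection_euler, runtime=100.0, dt=1.0)
print(particles)
```

Keeping the per-particle physics in a plain function is what makes the approach customisable: users add behaviours (sinking, beaching, decay) by writing new kernels rather than modifying the framework.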
In the past decades, boreal summers have been characterized by an increasing number of extreme weather events in the Northern Hemisphere extratropics, including persistent heat waves, droughts and heavy rainfall events with significant social, economic and environmental impacts. Many of these events have been associated with the presence of anomalous large-scale atmospheric circulation patterns, in particular persistent blocking situations, i.e., nearly stationary spatial patterns of air pressure. To contribute to a better understanding of the emergence and dynamical properties of such situations, we construct complex networks representing the atmospheric circulation based on Lagrangian trajectory data of passive tracers advected within the atmospheric flow. For these Lagrangian flow networks, we study the spatial patterns of selected node properties prior to, during and after different atmospheric blocking events in Northern Hemisphere summer. We highlight the specific network characteristics associated with the sequence of strong blocking episodes over Europe during summer 2010 as an illustrative example. Our results demonstrate the ability of the node degree, entropy and harmonic closeness centrality based on outgoing links to trace important spatio-temporal characteristics of atmospheric blocking events. In particular, all three measures capture the effective separation of the stationary pressure cell forming the blocking high from the normal westerly flow and the deviation of the main atmospheric currents around it. Our results suggest the utility of further exploiting the Lagrangian flow network approach to atmospheric circulation in future targeted diagnostic and prognostic studies.
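The basic construction of such a Lagrangian flow network can be sketched as follows: grid cells are nodes, edge weights are the transition probabilities of tracers between cells over a fixed advection time, and node measures such as out-degree and entropy are read directly off the transition matrix. The toy example below (it omits the harmonic closeness computation and is not the paper's pipeline) shows how a "blocked" cell whose tracers stay put stands out from the background flow.

```python
import numpy as np

def flow_network(start_cells, end_cells, n_cells):
    """Transition matrix of a Lagrangian flow network: P[i, j] is the
    fraction of tracers starting in cell i that end in cell j."""
    P = np.zeros((n_cells, n_cells))
    for i, j in zip(start_cells, end_cells):
        P[i, j] += 1.0
    rowsum = P.sum(axis=1, keepdims=True)
    return np.divide(P, rowsum, out=np.zeros_like(P), where=rowsum > 0)

def out_degree(P):
    """Number of distinct cells reachable from each node."""
    return (P > 0).sum(axis=1)

def out_entropy(P):
    """Shannon entropy of outgoing links: how evenly tracers disperse."""
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    return -(P * logs).sum(axis=1)

# usage: 4 cells; cell 0 only maps to itself, mimicking the effective
# separation of a stationary blocking cell from the westerly flow
start = [0, 0, 1, 1, 2, 2, 3, 3]
end   = [0, 0, 2, 3, 1, 3, 1, 2]
P = flow_network(start, end, n_cells=4)
print(out_degree(P), out_entropy(P))  # cell 0: degree 1, entropy 0
```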
Nicola Scafetta (2013)
Power spectra of global surface temperature (GST) records reveal major periodicities at about 9.1, 10-11, 19-22 and 59-62 years. The Coupled Model Intercomparison Project 5 (CMIP5) general circulation models (GCMs), to be used in the IPCC (2013) report, are analysed and found unable to reconstruct this variability. From 2000 to 2013.5 a GST plateau is observed, while the GCMs predicted a warming rate of about 2 K/century. In contrast, the hypothesis that the climate is regulated by specific natural oscillations fits the GST records more accurately at multiple time scales. The climate sensitivity to CO2 doubling should be reduced by half, e.g. from the IPCC-2007 range of 2.0-4.5 K to 1.0-2.3 K, with a median of 1.5 K. Modern paleoclimatic temperature reconstructions yield the same conclusion. The observed natural oscillations could be driven by astronomical forcings. Herein I propose a semi-empirical climate model made of six specific astronomical oscillations as constructors of the natural climate variability, spanning the decadal to the millennial scales, plus a 50% attenuated radiative warming component deduced from the GCM mean simulation as a measure of the anthropogenic and volcanic contributions to climatic changes. The semi-empirical model reconstructs the 1850-2013 GST patterns significantly better than any CMIP5 GCM simulation. The model projects a possible 2000-2100 average warming ranging from about 0.3 K to 1.8 K, significantly below the original CMIP5 GCM ensemble mean range (1 K to 4 K).
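The construction of such a semi-empirical model amounts to harmonic regression: subtract the fixed 50%-attenuated GCM-mean component, then fit sinusoids at the stated periodicities to the residual by least squares. The sketch below uses only the four decadal-to-secular periods named in the abstract (the full model has six oscillations reaching millennial scales) and runs on synthetic data, so all numbers are placeholders.

```python
import numpy as np

def fit_oscillations(t, resid, periods=(9.1, 10.5, 20.0, 60.0)):
    """Least-squares fit of cosine/sine pairs at fixed periods [years]."""
    cols = [np.ones_like(t)]
    for P in periods:
        cols += [np.cos(2 * np.pi * t / P), np.sin(2 * np.pi * t / P)]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, resid, rcond=None)
    return X @ coef

# usage with synthetic data standing in for GST and the GCM-mean trend
t = np.arange(1850, 2014, dtype=float)
gcm = 0.005 * (t - 1850)                                  # stand-in GCM trend [K]
gst = 0.5 * gcm + 0.1 * np.cos(2 * np.pi * t / 60) \
      + 0.02 * np.random.randn(t.size)                    # toy GST record [K]
recon = 0.5 * gcm + fit_oscillations(t, gst - 0.5 * gcm)  # semi-empirical fit
```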