
HadISD: a quality-controlled global synoptic report database for selected variables at long-term stations from 1973--2011

Published by Robert Dunn
Publication date: 2012
Research field: Physics
Paper language: English





[Abridged] This paper describes the creation of HadISD: an automatically quality-controlled synoptic resolution dataset of temperature, dewpoint temperature, sea-level pressure, wind speed, wind direction and cloud cover from global weather stations for 1973--2011. The full dataset consists of over 6000 stations, with 3427 long-term stations deemed to have sufficient sampling and quality for climate applications requiring sub-daily resolution. As with other surface datasets, coverage is heavily skewed towards Northern Hemisphere mid-latitudes. The dataset is constructed from a large pre-existing ASCII flatfile data bank that represents over a decade of substantial effort at data retrieval, reformatting and provision. These raw data have had varying levels of quality control applied to them by individual data providers. The work proceeded in several steps: merging stations with multiple reporting identifiers; reformatting to netCDF; quality control; and then filtering to form a final dataset. Particular attention has been paid to maintaining true extreme values where possible within an automated, objective process. Detailed validation has been performed on a subset of global stations and also on UK data using known extreme events to help finalise the QC tests. Further validation was performed on a selection of extreme events world-wide (Hurricane Katrina in 2005, the cold snap in Alaska in 1989 and heat waves in SE Australia in 2009). Although the filtering has removed the poorest station records, no attempt has been made to homogenise the data thus far. Hence non-climatic, time-varying errors may still exist in many of the individual station records and care is needed in inferring long-term trends from these data. A version-control system has been constructed for this dataset to allow for the clear documentation of any updates and corrections in the future.
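The abstract emphasises automated, objective QC tests that flag bad reports while retaining true extremes. As a loose illustration of that idea (not HadISD's actual test suite; the threshold and logic below are assumptions for the sketch), a spike check flags an isolated value that jumps sharply and immediately reverses, while a genuine extreme that rises and persists is left untouched:

```python
import numpy as np

def spike_check(values, times_hr, max_rate=10.0):
    """Flag isolated spikes in a sub-daily series.

    A report is flagged when the rate of change into it and out of it
    both exceed `max_rate` (units per hour) with opposite signs, i.e.
    a sharp excursion that immediately reverses. `max_rate` is an
    illustrative threshold, not a value taken from HadISD.
    Returns a boolean array, True where the report is flagged.
    """
    flags = np.zeros(len(values), dtype=bool)
    for i in range(1, len(values) - 1):
        dt_in = times_hr[i] - times_hr[i - 1]
        dt_out = times_hr[i + 1] - times_hr[i]
        if dt_in <= 0 or dt_out <= 0:
            continue  # skip duplicate or out-of-order timestamps
        rate_in = (values[i] - values[i - 1]) / dt_in
        rate_out = (values[i + 1] - values[i]) / dt_out
        # A spike climbs (or falls) sharply, then reverses sharply;
        # a real extreme that persists fails the reversal condition.
        if (abs(rate_in) > max_rate and abs(rate_out) > max_rate
                and np.sign(rate_in) == -np.sign(rate_out)):
            flags[i] = True
    return flags
```

Because the reversal condition must also be met, a heat-wave maximum that is approached and left gradually is preserved, which mirrors the paper's stated goal of keeping true extremes within an automated process.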




Read also

L. Lopes, A. B. Alves, P. Assis (2018)
Large-area arrays composed of dispersed stations are of major importance in experiments where Extensive Air Shower (EAS) sampling is necessary. At those dispersed stations it is mandatory to have detectors that require very low maintenance and show good resilience to environmental conditions. In 2012 our group started to work on RPCs that could become acceptable candidates to operate under these conditions. Since then, more than 30 complete detectors have been produced, tested and installed in different places, both indoor and outdoor. The data and analysis to be presented are mainly related to the tests made at the Auger site, where two RPCs have been under test in real conditions for more than two years. The results confirm the capability of operating such RPCs over long periods under harsh conditions at stable efficiency. In recent years LIP and USP - São Carlos have started a collaboration that aims to install an engineering array at the BATATA (Auger) site to better study and improve the resilience and performance of RPCs in outdoor experiments. The organization of this collaboration and the work done so far will be presented.
Chao Xu, Yiping Xie, Xijun Wang (2021)
Recently, we struck a balance between the information freshness, in terms of age of information (AoI), experienced by users and the energy consumed by sensors, by appropriately activating sensors to update their current status in caching-enabled Internet of Things (IoT) networks [1]. To solve this problem, we cast the corresponding status update procedure as a continuing Markov Decision Process (MDP) (i.e., without termination states), where the number of state-action pairs increases exponentially with the number of considered sensors and users. To circumvent the curse of dimensionality, we established a methodology for designing deep reinforcement learning (DRL) algorithms that maximize (resp. minimize) the average reward (resp. cost), by integrating R-learning, a tabular reinforcement learning (RL) algorithm tailored for maximizing the long-term average reward, with traditional DRL algorithms, which were initially developed to optimize the discounted long-term cumulative reward rather than the average one. In this technical report, we present a detailed discussion of the technical contributions of this methodology.
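The tabular R-learning rule the abstract refers to maintains, alongside the action-value table, a running estimate of the long-run average reward and uses it to form relative (undiscounted) values. A minimal sketch of one such update, with illustrative step sizes not taken from the report, is:

```python
import numpy as np

def r_learning_update(Q, rho, s, a, r, s_next, alpha=0.1, beta=0.01):
    """One tabular R-learning step for an average-reward MDP.

    Q     : (n_states, n_actions) table of relative action values
    rho   : current estimate of the long-run average reward per step
    alpha : step size for Q; beta : step size for rho (both are
            illustrative defaults, assumptions of this sketch).
    Returns the updated rho; Q is modified in place.
    """
    # Average-reward TD error: reward relative to rho, plus the
    # greedy value of the next state, minus the current estimate.
    td = r - rho + Q[s_next].max() - Q[s, a]
    Q[s, a] += alpha * td
    # Update the average-reward estimate only when the taken action
    # is greedy, as in Schwartz's original R-learning rule.
    if Q[s, a] == Q[s].max():
        rho += beta * (r - rho + Q[s_next].max() - Q[s].max())
    return rho
```

The report's contribution, as summarized above, is to carry this average-reward structure over from the tabular setting into DRL algorithms that would otherwise optimize a discounted return.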
Fuel moisture has a major influence on the behavior of wildland fires and is an important underlying factor in fire risk assessment. We propose a method to assimilate dead fuel moisture content observations from remote automated weather stations (RAWS) into a time-lag fuel moisture model. RAWS are spatially sparse, so a mechanism is needed to estimate fuel moisture content at locations potentially distant from observational stations. This is arranged using a trend surface model (TSM), which allows us to account for the effects of topography and atmospheric state on the spatial variability of fuel moisture content. At each location of interest, the TSM provides a pseudo-observation, which is assimilated via Kalman filtering. The method is tested with the time-lag fuel moisture model in the coupled weather-fire code WRF-SFIRE on 10-hr fuel moisture content observations from Colorado RAWS in 2013. We show using leave-one-out testing that the TSM compares favorably with inverse squared distance interpolation as used in the Wildland Fire Assessment System. Finally, we demonstrate that the data assimilation method is able to improve fuel moisture content estimates in unobserved fuel classes.
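The two ingredients named in the abstract, a time-lag fuel moisture model and a Kalman update of the model state toward a TSM pseudo-observation, can be sketched in scalar form as below. This is a simplified, assumed formulation for illustration; the paper's actual filter in WRF-SFIRE may differ in state dimension and error modeling:

```python
import math

def timelag_forecast(m, equilibrium, dt_hr, lag_hr=10.0):
    """Advance fuel moisture m toward its equilibrium value with an
    exponential time-lag response; lag_hr = 10 corresponds to the
    10-hr fuel class mentioned in the abstract."""
    return equilibrium + (m - equilibrium) * math.exp(-dt_hr / lag_hr)

def kalman_assimilate(m_fcst, P_fcst, obs, obs_var):
    """Scalar Kalman update blending the model forecast with one
    TSM pseudo-observation.

    m_fcst  : forecast fuel moisture from the time-lag model
    P_fcst  : forecast error variance
    obs     : pseudo-observation at this location
    obs_var : pseudo-observation error variance
    The variance values fed in are illustrative assumptions.
    """
    K = P_fcst / (P_fcst + obs_var)        # Kalman gain in [0, 1]
    m_anal = m_fcst + K * (obs - m_fcst)   # analysis state
    P_anal = (1.0 - K) * P_fcst            # analysis variance
    return m_anal, P_anal
```

When the pseudo-observation is as uncertain as the forecast (equal variances), the gain is 0.5 and the analysis splits the difference; a confident observation pulls the state almost all the way toward it.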
We present the activities of the New Physics working group for the Physics at TeV Colliders workshop (Les Houches, France, 30 May-17 June, 2011). Our report includes new agreements on formats for interfaces between computational tools, new tool developments, important signatures for searches at the LHC, recommendations for the presentation of LHC search results, as well as additional phenomenological studies.
Spatially and temporally explicit canopy water content (CWC) data are important for monitoring vegetation status, and constitute essential information for studying ecosystem-climate interactions. Despite many efforts, there is currently no operational CWC product available to users. In the context of the Satellite Application Facility for Land Surface Analysis (LSA-SAF), we have developed an algorithm to produce a global dataset of CWC based on data from the Advanced Very High Resolution Radiometer (AVHRR) sensor on board Meteorological Operational (MetOp) satellites forming the EUMETSAT Polar System (EPS). CWC reflects the water conditions at the leaf level and information related to canopy structure. An accuracy assessment of the EPS/AVHRR CWC indicated close agreement with multi-temporal ground data from SMAPVEX16 in Canada and Dahra in Senegal. The present study further evaluates the consistency of the LSA-SAF product with respect to the Simplified Level 2 Product Prototype Processor (SL2P) product, and demonstrates its applicability at different spatio-temporal resolutions using optical data from MSI/Sentinel-2 and MODIS/Terra and Aqua. We conclude that the EPS/AVHRR CWC product is a promising tool for monitoring vegetation water status at regional and global scales.