
Long term Experience in Autonomous Stations and production quality control

Published by: Luis Lopes
Publication date: 2018
Research field: Physics
Paper language: English





Large-area arrays composed of dispersed stations are of major importance in experiments where Extensive Air Shower (EAS) sampling is necessary. In such dispersed stations it is mandatory to have detectors that require very low maintenance and show good resilience to environmental conditions. In 2012 our group started to work on RPCs that could become acceptable candidates to operate under these conditions. Since that time, more than 30 complete detectors have been produced, tested and installed in different places, both indoors and outdoors. The data and analysis presented here are mainly related to the tests made at the Auger site, where two RPCs have been under test in real conditions for more than two years. The results confirm the capability to operate such RPCs for long periods under harsh conditions at a stable efficiency. In recent years, LIP and USP - São Carlos have started a collaboration that aims to install an Engineering Array at the BATATA (Auger) site to better study and improve the resilience and performance of the RPCs in outdoor experiments. The organization of this collaboration and the work done so far will be presented.
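The stability claim above rests on monitoring detector efficiency over long periods. As an illustrative sketch only (not the authors' analysis chain), the snippet below shows how a daily RPC efficiency and its binomial uncertainty could be tracked from coincidence counts; the day labels, counts, and the simple binomial error treatment are all assumptions made for the example.

```python
# Illustrative sketch: daily RPC efficiency from coincidence counts.
# "triggers" = events selected by an external reference telescope crossing the RPC,
# "hits"     = those events in which the RPC also responded.
# All numbers below are hypothetical, not taken from the Auger test data.
import math

daily_counts = [
    # (day, triggers, hits)
    ("day 1", 12840, 11940),
    ("day 2", 13102, 12187),
    ("day 3", 12511, 11610),
]

for day, triggers, hits in daily_counts:
    eff = hits / triggers
    err = math.sqrt(eff * (1.0 - eff) / triggers)   # binomial (statistical) error
    print(f"{day}: efficiency = {eff:.3f} +/- {err:.3f}")
```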


Read also

The Kilo Degree Survey (KiDS) is a 1500 square degree optical imaging survey with the recently commissioned OmegaCAM wide-field imager on the VLT Survey Telescope (VST). A suite of data products will be delivered to ESO and the community by the KiDS survey team. Spread over Europe, the KiDS team uses Astro-WISE to collaborate efficiently and pool hardware resources. In Astro-WISE the team shares, calibrates and archives all survey data. The data-centric architectural design realizes a dynamic live archive in which new KiDS survey products of improved quality can be shared with the team and eventually the full astronomical community in a flexible and controllable manner.
[Abridged] This paper describes the creation of HadISD: an automatically quality-controlled synoptic resolution dataset of temperature, dewpoint temperature, sea-level pressure, wind speed, wind direction and cloud cover from global weather stations for 1973-2011. The full dataset consists of over 6000 stations, with 3427 long-term stations deemed to have sufficient sampling and quality for climate applications requiring sub-daily resolution. As with other surface datasets, coverage is heavily skewed towards Northern Hemisphere mid-latitudes. The dataset is constructed from a large pre-existing ASCII flatfile data bank that represents over a decade of substantial effort at data retrieval, reformatting and provision. These raw data have had varying levels of quality control applied to them by individual data providers. The work proceeded in several steps: merging stations with multiple reporting identifiers; reformatting to netCDF; quality control; and then filtering to form a final dataset. Particular attention has been paid to maintaining true extreme values where possible within an automated, objective process. Detailed validation has been performed on a subset of global stations and also on UK data using known extreme events to help finalise the QC tests. Further validation was performed on a selection of extreme events world-wide (Hurricane Katrina in 2005, the cold snap in Alaska in 1989 and heat waves in SE Australia in 2009). Although the filtering has removed the poorest station records, no attempt has been made to homogenise the data thus far. Hence non-climatic, time-varying errors may still exist in many of the individual station records and care is needed in inferring long-term trends from these data. A version-control system has been constructed for this dataset to allow for the clear documentation of any updates and corrections in the future.
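As a hedged illustration of the kind of automated, objective check the abstract above refers to (not the actual HadISD test suite), the sketch below flags sub-daily temperature reports that lie far from a station's own climatology, using a robust median/MAD spread so that a single bad value does not mask itself by inflating the spread estimate; the threshold, the data and the function name are invented for the example.

```python
# Illustrative sketch of one automated QC step: flag reports far from the
# station's own climatology, using a robust (median/MAD) spread estimate.
# Thresholds and data are invented; the real HadISD suite applies many more
# tests (spike, streak, cross-variable and neighbour checks, ...).
import statistics

def flag_climatological_outliers(values, n_sigma=5.0):
    """Indices of values more than n_sigma robust sigmas from the station median."""
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values])
    robust_sigma = 1.4826 * mad or 1.0   # MAD -> sigma for Gaussian data; guard zero
    return [i for i, v in enumerate(values)
            if abs(v - med) > n_sigma * robust_sigma]

# Hypothetical sub-daily temperatures (deg C) for one station, with one bad report.
obs = [11.2, 11.8, 12.4, 13.0, 12.1, 11.5, 99.9, 11.0]
print("flagged indices:", flag_climatological_outliers(obs))   # -> [6]
```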
Environmental monitoring of marine environments presents several challenges: the harshness of the environment, the often remote location, and most importantly, the vast area it covers. Manual operations are time consuming, often dangerous, and labor intensive. Operations from oceanographic vessels are costly and limited to open seas and generally deeper bodies of water. In addition, with lake, river, and ocean shoreline being a finite resource, waterfront property presents an ever increasing valued commodity, requiring exploration and continued monitoring of remote waterways. In order to efficiently explore and monitor currently known marine environments as well as reach and explore remote areas of interest, we present a design of an autonomous surface vehicle (ASV) with the power to cover large areas, the payload capacity to carry sufficient power and sensor equipment, and enough fuel to remain on task for extended periods. An analysis of the design and a discussion on lessons learned during deployments is presented in this paper.
NASA regards data handling and archiving as an integral part of space missions, and has a strong track record of serving astrophysics data to the public, beginning with the IRAS satellite in 1983. Archives enable a major science return on the significant investment required to develop a space mission. In fact, the presence and accessibility of an archive can more than double the number of papers resulting from the data. In order for the community to be able to use the data, they have to be able to find the data (ease of access) and interpret the data (ease of use). Funding of archival research (e.g., the ADAP program) is also important not only for making scientific progress, but also for encouraging authors to deliver data products back to the archives to be used in future studies. NASA has also enabled a robust system that can be maintained over the long term, through technical innovation and careful attention to resource allocation. This article provides a brief overview of some of NASA's major astrophysics archive systems, including IRSA, MAST, HEASARC, KOA, NED, the Exoplanet Archive, and ADS.
A summary of the long-term data taking related to the location of one of the proposed next-generation ground-based gravitational-wave detectors is presented here. Results of seismic and infrasound noise, electromagnetic attenuation and cosmic muon radiation measurements are reported from the underground Mátra Gravitational and Geophysical Laboratory near Gyöngyösoroszi, Hungary. More than two years of collected seismic data are evaluated from the point of view of the Einstein Telescope, a proposed third-generation underground gravitational-wave observatory. Applying our results to the site selection will significantly improve the signal-to-noise ratio in the multi-messenger astrophysics era, especially in the low-frequency regime.