
NASA's Long-Term Astrophysics Data Archives

Posted by Luisa Rebull
Publication date: 2017
Research field: Physics
Paper language: English





NASA regards data handling and archiving as an integral part of space missions, and has a strong track record of serving astrophysics data to the public, beginning with the IRAS satellite in 1983. Archives enable a major science return on the significant investment required to develop a space mission. In fact, the presence and accessibility of an archive can more than double the number of papers resulting from the data. In order for the community to be able to use the data, they have to be able to find the data (ease of access) and interpret the data (ease of use). Funding of archival research (e.g., the ADAP program) is also important, not only for making scientific progress but also for encouraging authors to deliver data products back to the archives to be used in future studies. NASA has also enabled a robust system that can be maintained over the long term, through technical innovation and careful attention to resource allocation. This article provides a brief overview of some of NASA's major astrophysics archive systems, including IRSA, MAST, HEASARC, KOA, NED, the Exoplanet Archive, and ADS.
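As a concrete illustration of the "ease of access" point above, the sketch below shows how one might query two of the listed archives programmatically. It is a minimal sketch assuming the third-party astroquery package is installed; module paths, method signatures, and the returned columns differ between astroquery versions, so treat it as illustrative rather than as the archives' official interface.

```python
# Minimal sketch of programmatic access to two NASA astrophysics archives,
# assuming the third-party `astroquery` package (interfaces vary by version).
from astroquery.ipac.ned import Ned        # NASA/IPAC Extragalactic Database
from astroquery.mast import Observations   # Mikulski Archive for Space Telescopes

# Look up a well-known object in NED and show the returned table.
ned_result = Ned.query_object("M31")
print(ned_result)

# List MAST observations of the same target within a small search radius.
obs = Observations.query_object("M31", radius="0.02 deg")
print(len(obs), "MAST observations returned")
```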




Read also

M. Wertz, D. Horns, D. Groote (2016)
At the Hamburger Sternwarte an effort was started in 2010 with the aim of digitizing its more than 45000 photographic plates and films stored in its plate archives. At the time of writing, more than 31000 plates have already been made available on the Internet for researchers, historians, as well as for the interested public. The digitization process and the Internet presentation of the plates and accompanying handwritten material (plate envelopes, logbooks, observer notes) are presented here. To fully exploit the unique photometric and astrometric data stored on the plates, further processing steps are required, including registering the plates to celestial coordinates, masking of the plates, and a calibration of the photo-emulsion darkening curve. To demonstrate the correct functioning of these procedures, historical light curves of two bright BL Lac type active galactic nuclei are extracted. The resulting light curve of the blazar 1ES 1215+303 exhibits a large decrease in magnitude from $14.25^{+0.07}_{-0.12}$ to $15.94^{+0.09}_{-0.13}$ in about 300 days, which demonstrates the variability in the optical band. Furthermore, we compare the measured magnitudes for the quasar 3C 273 with contemporaneous measurements.
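To put that magnitude change in physical terms, a dimming from about 14.25 to 15.94 mag corresponds, via the standard Pogson relation, to the flux dropping by roughly a factor of five. A short sketch of that arithmetic (the two magnitudes are taken from the abstract above; only the conversion itself is added here):

```python
# Convert the reported magnitude change of 1ES 1215+303 into a flux ratio
# using the standard Pogson relation: f_bright / f_faint = 10 ** (0.4 * dm).
m_bright, m_faint = 14.25, 15.94     # magnitudes quoted in the abstract

delta_m = m_faint - m_bright         # ~1.69 mag decrease in brightness
flux_ratio = 10 ** (0.4 * delta_m)   # how much brighter the earlier state was

print(f"dm = {delta_m:.2f} mag -> flux dropped by a factor of {flux_ratio:.1f}")
```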
Laboratory astrophysics and complementary theoretical calculations are the foundations of astronomy and astrophysics and will remain so into the foreseeable future. The mission-enabling impact of laboratory astrophysics ranges from the scientific conception stage for airborne and space-based observatories all the way through to the scientific return of these missions. It is our understanding of the underlying physical processes and the measurements of critical physical parameters that allow us to address fundamental questions in astronomy and astrophysics. In this regard, laboratory astrophysics is much like detector and instrument development at NASA. These efforts are necessary for the success of astronomical research being funded by NASA. Without concomitant efforts in all three directions (observational facilities, detector/instrument development, and laboratory astrophysics), the future progress of astronomy and astrophysics is imperiled. In addition, new developments in experimental technologies have allowed laboratory studies to take on a new role, as some questions which previously could only be studied theoretically can now be addressed directly in the lab. With this in mind we, the members of the AAS Working Group on Laboratory Astrophysics (WGLA), have prepared this White Paper on the laboratory astrophysics infrastructure needed to maximize the scientific return from NASA's space and Earth sciences program.
Chao Sun, Ce Yu, Chenzhou Cui (2020)
Astronomical observation data require long-term preservation, and the rapid accumulation of observation data makes it necessary to consider the cost of long-term archive storage. In addition to low-speed disk-based online storage, optical disk or tape-based offline storage can be used to save costs. However, for astronomical research that requires historical data (particularly time-domain astronomy), the performance and energy consumption of data-accessing techniques cause problems because the requested data (which are organized according to observation time) may be located across multiple storage devices. In this study, we design and develop a tool referred to as AstroLayout to redistribute the observation data using spatial aggregation. The core algorithm uses graph partitioning to generate an optimized data placement according to the original observation data statistics and the target storage system. For the given observation data, AstroLayout can copy the long-term archive into the target storage system in accordance with this placement. An efficiency evaluation shows that AstroLayout can reduce the number of devices activated when responding to data-access requests in time-domain astronomy research. In addition to improving the performance of data-accessing techniques, AstroLayout can also reduce the storage system's power consumption. For enhanced adaptability, it supports storage systems of any media, including optical disks, tapes, and hard disks.
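The abstract does not spell out the placement algorithm, so the sketch below only illustrates the general idea of spatial aggregation: observations of the same sky region are grouped onto as few devices as possible so that a time-domain query for that region activates fewer devices. The sky_key tiling and the greedy device-filling loop are illustrative stand-ins of my own, not AstroLayout's actual graph-partitioning method.

```python
from collections import defaultdict

def sky_key(ra_deg, dec_deg, cell=5.0):
    """Coarse sky-region key: bin coordinates into cell x cell degree tiles
    (an illustrative stand-in for a proper spatial index such as HEALPix)."""
    return (int(ra_deg // cell), int(dec_deg // cell))

def plan_layout(observations, device_capacity):
    """Greedily assign observations to devices so that each sky region ends up
    on as few devices as possible (a simplification of the graph-partitioning
    placement described in the abstract)."""
    by_region = defaultdict(list)
    for obs in observations:                  # obs: dict with ra, dec, size
        by_region[sky_key(obs["ra"], obs["dec"])].append(obs)

    layout, device, free = [], [], device_capacity
    for region in sorted(by_region):          # keep neighbouring tiles together
        for obs in by_region[region]:
            if obs["size"] > free and device: # current device full: start a new one
                layout.append(device)
                device, free = [], device_capacity
            device.append(obs)
            free -= obs["size"]
    if device:
        layout.append(device)
    return layout                             # list of per-device observation lists

# Tiny usage example with made-up observations (ra/dec in degrees, size in GB).
obs_list = [
    {"id": "a", "ra": 10.2, "dec": -5.1, "size": 40},
    {"id": "b", "ra": 10.9, "dec": -4.8, "size": 35},
    {"id": "c", "ra": 200.4, "dec": 33.0, "size": 60},
]
for i, dev in enumerate(plan_layout(obs_list, device_capacity=100)):
    print(f"device {i}: {[o['id'] for o in dev]}")
```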
We review some aspects of the current state of data-intensive astronomy, its methods, and some outstanding data analysis challenges. Astronomy is at the forefront of big data science, with exponentially growing data volumes and data rates, and an ever-increasing complexity, now entering the Petascale regime. Telescopes and observatories from both ground and space, covering a full range of wavelengths, feed the data via processing pipelines into dedicated archives, where they can be accessed for scientific analysis. Most of the large archives are connected through the Virtual Observatory framework, which provides interoperability standards and services, and effectively constitutes a global data grid for astronomy. Making discoveries in this overabundance of data requires the application of novel machine learning tools. We describe some recent examples of such applications.
The past year has witnessed the discovery of the first identified counterparts to a gravitational wave transient (GW 170817A) and a very high-energy neutrino (IceCube-170922A). These source identifications, and the ensuing detailed studies, have realized longstanding dreams of astronomers and physicists to routinely carry out observations of cosmic sources by other than electromagnetic means, and have inaugurated the era of multi-messenger astronomy. While this new era promises extraordinary physical insights into the universe, it brings with it new challenges, including: highly heterogeneous, high-volume, high-velocity datasets; globe-spanning cross-disciplinary teams of researchers, regularly brought together into transient collaborations; an extraordinary breadth and depth of domain-specific knowledge and computing resources required to anticipate, model, and interpret observations; and the routine need for adaptive, distributed, rapid-response observing campaigns to fully exploit the scientific potential of each source. We argue, therefore, that the time is ripe for the community to conceive and propose an Institute for Multi-Messenger Astrophysics that would coordinate its resources in a sustained and strategic fashion to efficiently address these challenges, while simultaneously serving as a center for education and key supporting activities. In this fashion, we can prepare now to realize the bright future that we see, beyond, through these newly opened windows onto the universe.