
German-Russian Astroparticle Data Life Cycle Initiative

Added by Andreas Haungs
Publication date: 2019
Fields: Physics
Language: English





A data life cycle (DLC) is a high-level data processing pipeline that involves data acquisition, event reconstruction, data analysis, publication, archiving, and sharing. For astroparticle physics a DLC is particularly important due to the geographical and content diversity of the research field. A dedicated, experiment-spanning analysis and data centre would ensure that multi-messenger analyses can be carried out with state-of-the-art methods. The German-Russian Astroparticle Data Life Cycle Initiative (GRADLCI) is a joint project of the KASCADE-Grande and TAIGA collaborations, aimed at developing a concept and creating a DLC prototype that takes into account the data processing features specific to the research field. An open science system based on the KASCADE Cosmic Ray Data Centre (KCDC), a web-based platform that provides astroparticle physics data to the general public, must also include effective methods and algorithms for distributed data storage, as well as techniques that allow the community to perform simulations and analyses with sophisticated machine learning methods. The aim is to achieve more efficient analyses of the data collected in different, globally dispersed observatories, as well as a modern education of Big Data Scientists in the synergy between basic research and the information society. The contribution covers the status and future plans of the initiative.
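To make the DLC stages named above concrete, the following is a minimal, self-contained Python sketch of how such a pipeline could be chained. The stage names follow the abstract; the class, function, and data-field names are purely illustrative assumptions and do not correspond to GRADLCI's actual software.

```python
"""Illustrative sketch of a data life cycle (DLC) pipeline.

Stage names follow the abstract (acquisition, reconstruction, analysis,
publication, ...); all identifiers below are hypothetical, not GRADLCI code.
"""
from dataclasses import dataclass, field
from typing import Any, Callable


@dataclass
class DataLifeCycle:
    """Chains DLC stages; each stage consumes the previous stage's output."""
    stages: list[tuple[str, Callable[[Any], Any]]] = field(default_factory=list)

    def add_stage(self, name: str, step: Callable[[Any], Any]) -> "DataLifeCycle":
        self.stages.append((name, step))
        return self

    def run(self, raw_events: Any) -> Any:
        data = raw_events
        for name, step in self.stages:
            data = step(data)
            print(f"stage '{name}' done")
        return data


# Hypothetical stage implementations; real stages would call the
# experiments' reconstruction and analysis software.
pipeline = (
    DataLifeCycle()
    .add_stage("acquisition",    lambda _: [{"adc": 42}, {"adc": 17}])
    .add_stage("reconstruction", lambda evts: [{"energy_eV": e["adc"] * 1e14} for e in evts])
    .add_stage("analysis",       lambda evts: {"n_events": len(evts)})
    .add_stage("publication",    lambda summary: {**summary, "doi": "<to be assigned>"})
)

if __name__ == "__main__":
    print(pipeline.run(raw_events=None))
```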



Related research

Dust offers a unique probe of the interstellar medium (ISM) across multiple size, density, and temperature scales. Dust is detected in outflows of evolved stars, star-forming molecular clouds, planet-forming disks, and even in galaxies at the dawn of the Universe. These grains also have a profound effect on various astrophysical phenomena from thermal balance and extinction in galaxies to the building blocks for planets, and changes in dust grain properties will affect all of these phenomena. A full understanding of dust in all of its forms and stages requires a multi-disciplinary investigation of the dust life cycle. Such an investigation can be achieved with a statistical study of dust properties across stellar evolution, star and planet formation, and redshift. Current and future instrumentation will enable this investigation through fast and sensitive observations in dust continuum, polarization, and spectroscopy from near-infrared to millimeter wavelengths.
V. Tokareva, A. Haungs, D. Kang 2019
Nowadays astroparticle physics faces a rapid increase in data volume. Meanwhile, challenges remain in testing theoretical models of the origin of cosmic rays by applying a multi-messenger approach and machine learning, and in investigating phenomena affected by the low statistics of the detected incoming particles. The problems concern accurate data mapping and data management, as well as distributed storage and high-performance data processing. In particular, such solutions are of interest for the study of air showers induced by ultra-high-energy cosmic and gamma rays, for testing new hypotheses of hadronic interaction, or for the cross-calibration of different experiments. KASCADE (Karlsruhe, Germany) and TAIGA (Tunka valley, Russia) are astroparticle physics experiments aiming at the detection of cosmic-ray air showers induced by primaries in the energy range from about hundreds of TeV to hundreds of PeV. They are located at the same latitude and have an overlap in operation runs. These factors motivate a joint analysis of their data. In the German-Russian Astroparticle Data Life Cycle Initiative (GRADLCI), modern technologies of distributed data management are being employed to establish reliable open access to the experimental cosmic-ray physics data collected by KASCADE and the Tunka-133 setup of TAIGA.
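As an illustration of what a joint analysis could look like in practice, the sketch below histograms the reconstructed energies of two air-shower datasets over the common energy range quoted above. The file names, column names, and CSV format are assumptions made for the example and do not reflect the actual KCDC or TAIGA data formats.

```python
"""Hedged sketch: comparing energy spectra from two air-shower datasets.

Assumes two CSV exports with a reconstructed primary-energy column named
'energy_eV'; file and column names are illustrative only.
"""
import numpy as np
import pandas as pd


def energy_spectrum(df: pd.DataFrame, bins: np.ndarray) -> np.ndarray:
    """Histogram reconstructed energies into the given logarithmic bins."""
    counts, _ = np.histogram(df["energy_eV"], bins=bins)
    return counts


# Common logarithmic binning from ~100 TeV to ~100 PeV (energies in eV),
# matching the energy range quoted in the abstract.
bins = np.logspace(14, 17, 31)

kascade = pd.read_csv("kascade_events.csv")   # assumed local export of KASCADE data
tunka = pd.read_csv("tunka133_events.csv")    # assumed local export of Tunka-133 data

for name, df in [("KASCADE", kascade), ("Tunka-133", tunka)]:
    counts = energy_spectrum(df, bins)
    print(name, counts.sum(), "events in the common energy range")
```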
A. Haungs, D. Kang, S. Schoo 2018
The KASCADE Cosmic Ray Data Centre (KCDC) is a web portal (https://kcdc.ikp.kit.edu) where the data of the astroparticle physics experiment KASCADE-Grande are made available to the interested public. The KASCADE experiment was a large-area detector for the measurement of high-energy cosmic rays via the detection of extensive air showers. The multi-detector installations KASCADE and its extension KASCADE-Grande stopped the active data acquisition of all their components at the end of 2012, after more than 20 years of data taking. In several updates since the first release of KCDC in 2013, we have made available to the public the measured and reconstructed parameters of more than 433 million air showers. In addition, KCDC provides metadata and documentation that enable users outside the community of experts to perform their own data analysis. Simulation data from three different high-energy interaction models have been made available, as well as a compilation of measured and published spectra from various experiments. Furthermore, detailed educational examples shall encourage high-school students and early-stage researchers to learn about astroparticle physics and cosmic radiation, as well as about the handling of Big Data and the sustainable, public provision of scientific data.
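For a first look at such a download, a short sketch along the following lines could be used. It assumes the selected shower parameters have been exported to a CSV table with an energy column 'E' (in eV) and a zenith-angle column 'Ze' (in degrees); these names and the file format are assumptions for the example, not the guaranteed KCDC export layout.

```python
"""Hedged sketch of a first look at a downloaded air-shower selection.

Assumes a CSV export with columns 'E' (reconstructed energy, eV) and
'Ze' (zenith angle, deg); file and column names are assumptions.
"""
import numpy as np
import pandas as pd

showers = pd.read_csv("kcdc_selection.csv")

# Typical quality cut: restrict to near-vertical showers, where the
# reconstruction is usually most reliable.
vertical = showers[showers["Ze"] < 30.0]

print(f"{len(vertical)} of {len(showers)} showers pass the zenith cut")
print("median reconstructed energy:", np.median(vertical["E"]), "eV")
```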
The almost universal availability of electronic connectivity, web software, and portable devices is bringing about a major revolution: information of all kinds is rapidly becoming accessible to everyone, transforming social, economic, and cultural life practically everywhere in the world. Internet technologies represent an unprecedented and extraordinary two-way channel of communication between producers and users of data. For this reason the web is widely recognized as an asset capable of achieving the fundamental goal of transparency of information and of data products, in line with the growing demand for transparency of all goods that are produced with public money. This paper describes Open Universe, an initiative proposed to the United Nations Committee on the Peaceful Uses of Outer Space (COPUOS) with the objective of stimulating a dramatic increase in the availability and usability of space science data, extending the potential of scientific discovery to new participants in all parts of the world.
Recent years have seen rapid deployment of mobile computing and Internet of Things (IoT) networks, which can be mostly attributed to the increasing communication and sensing capabilities of wireless systems. Big data analysis, pervasive computing, and eventually artificial intelligence (AI) are envisaged to be deployed on top of the IoT and create a new world featured by data-driven AI. In this context, a novel paradigm of merging AI and wireless communications, called Wireless AI, which pushes AI frontiers to the network edge, is widely regarded as a key enabler for future intelligent network evolution. To this end, we present a comprehensive survey of the latest studies in wireless AI from the data-driven perspective. Specifically, we first propose a novel Wireless AI architecture that covers five key data-driven AI themes in wireless networks: Sensing AI, Network Device AI, Access AI, User Device AI, and Data-provenance AI. Then, for each data-driven AI theme, we present an overview of the use of AI approaches to solve the emerging data-related problems and show how AI can empower wireless network functionalities. In particular, compared to other related survey papers, we provide an in-depth discussion of Wireless AI applications in various data-driven domains wherein AI proves extremely useful for wireless network design and optimization. Finally, research challenges and future visions are also discussed to spur further research in this promising area.
