
Experiment Software and Projects on the Web with VISPA

Added by Marcel Rieger
Publication date: 2017
Fields: Physics
Language: English





The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that makes it possible to interface a wide range of applications. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed a data inspector that can browse interactively through the event content of several data formats, e.g., MiniAOD, which is utilized by the CMS collaboration. The VISPA extension mechanism can also be used to embed external web-based applications that benefit from the dynamic allocation of user-defined computing resources via SSH. For example, by wrapping the JSROOT project, ROOT files located on any remote machine can be inspected directly through a VISPA server instance. We introduced domains that combine groups of users with role-based permissions. This enables tailored projects, e.g., for teaching, where access to students' homework is restricted to a team of tutors, or for experiment-specific data that may only be accessible to members of the collaboration. We present the extension mechanism, including the corresponding applications, and give an outlook on the new permission system.




Read More

128 - O. Actis, M. Erdmann, R. Fischer 2008
VISPA is a novel development environment for high energy physics analyses, based on a combination of graphical and textual steering. The primary aim of VISPA is to support physicists in prototyping, performing, and verifying a data analysis of any complexity. We present example screenshots, and describe the underlying software concepts.
The objective of the Karlsruhe Tritium Neutrino (KATRIN) experiment is to determine the effective electron neutrino mass $m(\nu_\text{e})$ with an unprecedented sensitivity of $0.2\,\text{eV}$ (90% C.L.) by precision electron spectroscopy close to the endpoint of the $\beta$ decay of tritium. We present a consistent theoretical description of the $\beta$ electron energy spectrum in the endpoint region, an accurate model of the apparatus response function, and the statistical approaches suited to interpret and analyze tritium $\beta$ decay data observed with KATRIN with the envisaged precision. In addition to providing detailed analytical expressions for all formulae used in the presented model framework with the necessary detail of derivation, we discuss and quantify the impact of theoretical and experimental corrections on the measured $m(\nu_\text{e})$. Finally, we outline the statistical methods for parameter inference and the construction of confidence intervals that are appropriate for a neutrino mass measurement with KATRIN. In this context, we briefly discuss the choice of the $\beta$ energy analysis interval and the distribution of measuring time within that range.
51 - A. Rimoldi, A. Dell'Acqua 2003
The simulation of the ATLAS detector is a major challenge, given the complexity of the detector and the demanding environment of the LHC. The apparatus, one of the biggest and most complex ever designed, requires a detailed, flexible and, if possible, fast simulation, which is already needed today to address questions of design optimization, issues raised by staging scenarios, and of course to enable the detailed physics studies that lay the basis for the first physics discoveries. Scalability and robustness stand out as the most critical issues to be faced in the implementation of such a simulation. In this paper we present the status of the present simulation and the adopted solutions in terms of speed optimization, centralization of services, framework facilities, and persistency solutions. Emphasis is put on the global performance when the different detector components are combined in a full and detailed simulation. The reference tool adopted is Geant4.
154 - G.Vianello 2017
Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (on measurement) is contrasted with a background-only observation free of the effect (off measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely-used formula from [LiMa], which assumes that both measurements are Poisson random variables. In this paper we study three other cases: i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve; ii) the case where the background estimate $b$ in the off measurement has an additional systematic uncertainty; and iii) the case where $b$ is a Gaussian random variable instead of a Poisson random variable. The latter case applies when $b$ comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. In this case practitioners typically use a formula which is valid only when $b$ is large and its uncertainty very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how sensitive an estimate of significance is to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short Gamma-Ray Bursts and of new X-ray or $\gamma$-ray sources.
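The standard [LiMa] significance for the Poisson on/off case (their Eq. 17) can be written as a short Python function. This is a minimal sketch; the sign convention for deficits (returning a negative value when the on counts fall below the background expectation) is an added convenience, not part of the original formula.

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma significance for an on/off counting measurement.

    n_on  : counts in the on (source) region
    n_off : counts in the off (background-only) region
    alpha : ratio of on to off exposures (t_on / t_off)
    """
    if n_on == 0 and n_off == 0:
        return 0.0
    n_tot = n_on + n_off
    # The two log terms of Li & Ma Eq. 17; each vanishes when its count is zero.
    term_on = n_on * math.log((1.0 + alpha) / alpha * (n_on / n_tot)) if n_on > 0 else 0.0
    term_off = n_off * math.log((1.0 + alpha) * (n_off / n_tot)) if n_off > 0 else 0.0
    s = math.sqrt(2.0 * (term_on + term_off))
    # Report a deficit (n_on below alpha * n_off) as a negative significance.
    return s if n_on >= alpha * n_off else -s
```

For example, with equal exposures (alpha = 1) and equal counts in both regions the significance is zero, as expected for a pure background fluctuation, and it grows as the on-region excess grows.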
The high energy physics community is discussing where investment is needed to prepare software for the HL-LHC and its unprecedented challenges. The ROOT project has been one of the central software players in high energy physics for decades. From its experience and expectations, the ROOT team has distilled a comprehensive set of areas that should see research and development in the context of data analysis software, in order to make the best use of the HL-LHC's physics potential. This work shows what these areas could be, why the ROOT team believes investing in them is needed, which gains are expected, and where related work is ongoing. It can serve as an indication for future research proposals and collaborations.
