Fitting a constant to a set of measured values is a long-debated problem when the data do not conform to the hypothesis of a known sampling variance. Given the data, fitting requires finding which measurand value is the most trustworthy. Bayesian inference is reviewed here as a way to assign probabilities to the possible measurand values. Different hypotheses about the data variance are tested by Bayesian model comparison. Finally, model selection is exemplified by deriving an estimate of the Planck constant.
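As a hedged sketch of the machinery referred to above (the notation is ours, not necessarily the paper's: $x = (x_1, \dots, x_N)$ for the data, $\mu$ for the measurand, $M_k$ for a hypothesis about the data variance), Bayesian inference assigns
% assumed notation, not taken from the paper
\[
  p(\mu \mid x, M_k) = \frac{p(x \mid \mu, M_k)\, p(\mu \mid M_k)}{p(x \mid M_k)},
  \qquad
  p(x \mid M_k) = \int p(x \mid \mu, M_k)\, p(\mu \mid M_k)\, \mathrm{d}\mu ,
\]
and model comparison weighs the evidences, e.g. through the Bayes factor $B_{12} = p(x \mid M_1)/p(x \mid M_2)$ or the posterior model probabilities $p(M_k \mid x) \propto p(x \mid M_k)\, p(M_k)$.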
We discuss the problem of extending data mining approaches to cases in which data points arise in the form of individual graphs. Being able to find the intrinsic low-dimensionality of ensembles of graphs can be useful in a variety of modeling contexts.
The Collaborative Analysis Versioning Environment System (CAVES) project concentrates on the interactions between users performing data- and/or computing-intensive analyses on large data sets, as encountered in many contemporary scientific disciplines.
We review methods for combining several measurements, expressed either as parameter values or as $p$-values.
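As a hedged illustration of such combinations (standard textbook formulas, not necessarily the specific methods reviewed; $x_i$, $u_i$ and $p_i$ are our notation), the inverse-variance weighted average of parameter values and Fisher's method for $p$-values read
% assumed notation; standard formulas, not quoted from the paper
\[
  \bar{x} = \frac{\sum_i x_i / u_i^2}{\sum_i 1/u_i^2}, \qquad
  u^2(\bar{x}) = \Big( \sum_i 1/u_i^2 \Big)^{-1}, \qquad
  X^2 = -2 \sum_{i=1}^{n} \ln p_i \sim \chi^2_{2n} \ \text{under the null hypothesis}.
\]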
A new data analysis method is developed for the angle-resolving silicon telescope introduced at the neutron time-of-flight facility n_TOF at CERN. The telescope has already been used in measurements of several neutron-induced reactions with charged particles.
The location-scale model usually appears in physics and chemistry in connection with the Birge ratio method for the adjustment of fundamental physical constants such as the Planck constant or the Newtonian constant of gravitation, while the random effects model is more commonly used for meta-analysis in other fields.
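For reference, the Birge ratio mentioned above is conventionally computed from the weighted mean $\bar{x}$ of measurements $x_i$ with standard uncertainties $u_i$ (a textbook sketch in our notation, not a quotation from the paper):
% standard definition; notation assumed
\[
  R_B^2 = \frac{1}{N-1} \sum_{i=1}^{N} \frac{(x_i - \bar{x})^2}{u_i^2},
  \qquad
  \bar{x} = \frac{\sum_i x_i / u_i^2}{\sum_i 1/u_i^2}.
\]
When $R_B$ noticeably exceeds 1, the stated uncertainties are regarded as too optimistic, and a common adjustment is to inflate them by the factor $R_B$.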