We present a conclusive answer to Bertrand's paradox, a long-standing open issue in the basic physical interpretation of probability. The paradox concerns the mutually inconsistent results obtained when one asks for the probability that a chord, drawn at random in a circle, is longer than the side of an inscribed equilateral triangle. We obtain a unique solution by replacing abstract chord drawing with the throwing of a straw of finite length L onto a circle of radius R, thus providing a satisfactory operative definition of the associated experiment. The resulting probability turns out to be a function of the ratio L/R, as intuitively expected.
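The inconsistency referred to above is easy to reproduce: the three classical "random chord" prescriptions give three different answers (1/3, 1/2 and 1/4). A minimal Monte Carlo sketch of those classical setups for a unit circle (this is the paradox itself, not the paper's straw-throwing construction):

```python
import math
import random

def bertrand_probabilities(n=200_000, seed=1):
    """Estimate P(chord > side of inscribed equilateral triangle)
    under the three classical 'random chord' prescriptions
    for a unit circle."""
    rng = random.Random(seed)
    side = math.sqrt(3.0)  # side length of the inscribed equilateral triangle
    hits = [0, 0, 0]
    for _ in range(n):
        # 1) random endpoints: two uniform angles on the circumference
        a, b = rng.uniform(0, 2 * math.pi), rng.uniform(0, 2 * math.pi)
        chord = 2.0 * abs(math.sin((a - b) / 2.0))
        hits[0] += chord > side
        # 2) random radius: chord midpoint uniform along a fixed radius
        d = rng.uniform(0, 1)
        hits[1] += 2.0 * math.sqrt(1.0 - d * d) > side
        # 3) random midpoint: chord midpoint uniform over the disc area
        r = math.sqrt(rng.uniform(0, 1))  # area-uniform radial coordinate
        hits[2] += 2.0 * math.sqrt(1.0 - r * r) > side
    return [h / n for h in hits]
```

The three estimates cluster near 1/3, 1/2 and 1/4 respectively, which is exactly the ambiguity that an operative definition of the chord-drawing experiment is meant to remove.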
We consider whether the asymptotic distribution of the log-likelihood ratio test statistic is expected to be Gaussian or chi-squared. Two straightforward examples provide insight into the difference.
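As one concrete illustration of the distinction (my own construction, not necessarily one of the paper's two examples): for n Gaussian observations with known unit variance and H0: mu = 0, the statistic -2 ln(lambda) = n * xbar^2 follows a chi-squared distribution with one degree of freedom, while its signed square root sqrt(n) * xbar is standard normal. A toy Monte Carlo sketch:

```python
import math
import random

def llr_toys(n_obs=50, n_toys=20_000, seed=2):
    """Toy MC under H0: x_i ~ N(0, 1) with known unit variance.
    Returns the -2 ln(lambda) statistics (chi-squared, 1 dof)
    and their signed square roots (standard normal)."""
    rng = random.Random(seed)
    q, z = [], []
    for _ in range(n_toys):
        xbar = sum(rng.gauss(0.0, 1.0) for _ in range(n_obs)) / n_obs
        z.append(math.sqrt(n_obs) * xbar)   # ~ N(0, 1)
        q.append(n_obs * xbar * xbar)       # ~ chi^2 with 1 dof
    return q, z
```

The sample mean of the q values is close to 1 (the mean of a chi-squared with one degree of freedom), while the z values have mean near 0 and variance near 1, so the same test statistic is "Gaussian or chi-squared" depending on which form one tabulates.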
Statistical modeling of experimental physical laws is based on the probability density function of the measured variables, which is expressed from experimental data via a kernel estimator. The kernel is determined objectively by the scattering of the data during
A physical law is represented by the probability distribution of a measured variable. The probability density is estimated from the measured data using an estimator whose kernel is the instrument scattering function. The experimental information and the data redundancy are defined in terms of information entropy. The model cost function, comprising the data redundancy and the estimation error, is minimized by a creation-annihilation process.
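A minimal sketch of the kind of kernel estimator described here, with a Gaussian kernel standing in for the instrument scattering function (the width sigma is an assumed instrument parameter, not the paper's prescription):

```python
import math

def kernel_density(data, sigma):
    """Return f(x): a kernel density estimate built from the measured
    data, using a Gaussian kernel of width sigma as a stand-in for
    the instrument scattering function."""
    norm = 1.0 / (len(data) * sigma * math.sqrt(2.0 * math.pi))
    def f(x):
        return norm * sum(math.exp(-0.5 * ((x - d) / sigma) ** 2) for d in data)
    return f
```

Each measurement contributes one kernel centered on its value, so the estimate integrates to unity by construction; the entropy-based cost function in the abstract then governs how many such data are worth keeping.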
A method for including multiplicative systematic uncertainties in branching-ratio limits was proposed by M. Convery. That solution relied on approximations that are not necessarily valid. This note provides a solution without approximations and compares the results.
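This is not Convery's prescription, but one generic approximation-free way to fold a multiplicative (efficiency-like) uncertainty into a limit is direct numerical marginalization. A sketch for the simplest case of zero observed events, a flat prior on the signal, and a Gaussian efficiency (all of these assumptions are mine, chosen for illustration):

```python
import math

def upper_limit_zero_events(cl=0.9, eff_sigma=0.0):
    """Bayesian upper limit on a Poisson signal s with 0 observed events:
    likelihood exp(-eff * s), flat prior on s, and a multiplicative
    uncertainty eff ~ N(1, eff_sigma) marginalized numerically
    (eff truncated to positive values)."""
    ds = 0.005
    s_grid = [ds * i for i in range(6001)]             # s in [0, 30]
    if eff_sigma == 0.0:
        effs, weights = [1.0], [1.0]                   # no systematic
    else:
        de = 0.01
        effs = [0.05 + de * i for i in range(200)]     # eff in (0, ~2]
        weights = [math.exp(-0.5 * ((e - 1.0) / eff_sigma) ** 2)
                   for e in effs]
    post = [sum(w * math.exp(-e * s) for e, w in zip(effs, weights))
            for s in s_grid]
    total = sum(post)
    acc = 0.0
    for s, p in zip(s_grid, post):
        acc += p
        if acc >= cl * total:
            return s
    return s_grid[-1]
```

Without systematics the 90% CL limit reproduces the textbook value ln(10) = 2.30 events; marginalizing a sizable efficiency uncertainty fattens the posterior tail and loosens the limit, which is the qualitative effect the note quantifies exactly.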
The extraction of a physical law y = y0(x) from joint experimental data on x and y is treated. The joint, marginal, and conditional probability density functions (PDFs) are expressed from the given data through an estimator whose kernel is the instrument scattering function. The conditional average is proposed as an optimal estimator of y0(x), and the analysis of its properties rests on a new definition of prediction quality. The joint experimental information and the redundancy of joint measurements are expressed by the relative entropy. As the number of experiments grows, the redundancy on average increases, while the experimental information converges to a limit value. The difference between this limit value and the experimental information at a finite number of data represents the discrepancy between the experimentally determined and the true properties of the phenomenon. The sum of this discrepancy measure and the redundancy is used as a cost function, whose minimum specifies a reasonable number of data for the extraction of the law y0(x). The mutual information is defined via the marginal and conditional PDFs of the variables, and the ratio of mutual information to marginal information indicates which variable is the independent one. The properties of the introduced statistics are demonstrated on deterministically and randomly related variables.
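With a kernel-based joint PDF estimate, the conditional average takes the form of a kernel-weighted mean of the measured y values. A sketch with a Gaussian kernel in x standing in for the instrument scattering function (the width sigma is an assumed parameter, and this is only the generic form, not the paper's full construction):

```python
import math

def conditional_average(pairs, sigma):
    """Return yhat(x): the conditional average of y given x, estimated
    from the joint data 'pairs' of (x, y) measurements with a Gaussian
    kernel of width sigma in x (a stand-in for the instrument
    scattering function)."""
    def yhat(x):
        w = [math.exp(-0.5 * ((x - xi) / sigma) ** 2) for xi, _ in pairs]
        return sum(wi * yi for wi, (_, yi) in zip(w, pairs)) / sum(w)
    return yhat
```

Applied to data scattered around a deterministic law such as y = x^2, the estimator recovers the underlying curve up to a small smoothing bias, which is the sense in which it extracts y0(x) from the joint sample.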