
Bertrand's paradox: a physical solution

Added by Alessandro Ciattoni
Publication date: 2010
Field: Physics
Language: English

We present a conclusive answer to Bertrand's paradox, a long-standing open issue in the basic physical interpretation of probability. The paradox deals with the existence of mutually inconsistent results when looking for the probability that a chord, drawn at random in a circle, is longer than the side of an inscribed equilateral triangle. We obtain a unique solution by substituting chord drawing with the throwing of a straw of finite length L onto a circle of radius R, thus providing a satisfactory operational definition of the associated experiment. The obtained probability turns out to be a function of the ratio L/R, as intuitively expected.
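As a rough illustration of the experiment described above (not taken from the paper itself), the straw-throwing procedure can be simulated numerically. The sketch below assumes the straw's centre is placed uniformly over a disk large enough to contain every throw that can touch the circle, its orientation is uniform, and only throws whose straw crosses the circle boundary at two points are counted as defining a chord; the function name and these sampling choices are illustrative.

```python
import numpy as np

def straw_throw_probability(L, R=1.0, n_throws=1_000_000, rng=None):
    """Monte Carlo estimate of P(chord > sqrt(3)*R) for a straw of length L
    thrown at random onto a circle of radius R. A throw is kept only if the
    straw crosses the circle boundary twice, so that it defines a full chord."""
    rng = np.random.default_rng(rng)

    # Straw centre: uniform over a disk that contains every position from
    # which the straw could still touch the circle (radius R + L/2).
    r_max = R + L / 2.0
    r = r_max * np.sqrt(rng.random(n_throws))
    phi = 2.0 * np.pi * rng.random(n_throws)
    cx, cy = r * np.cos(phi), r * np.sin(phi)

    # Straw orientation: uniform in [0, pi).
    theta = np.pi * rng.random(n_throws)
    dx, dy = L * np.cos(theta), L * np.sin(theta)
    p1x, p1y = cx - dx / 2.0, cy - dy / 2.0

    # Intersections of the straw segment p(t) = p1 + t*d, t in [0, 1],
    # with the circle x^2 + y^2 = R^2 (quadratic in t).
    a = dx**2 + dy**2
    b = 2.0 * (p1x * dx + p1y * dy)
    c = p1x**2 + p1y**2 - R**2
    disc = b**2 - 4.0 * a * c

    valid = disc > 0.0
    sq = np.sqrt(np.where(valid, disc, 0.0))
    t1 = (-b - sq) / (2.0 * a)
    t2 = (-b + sq) / (2.0 * a)

    # Keep throws whose straw crosses the boundary at two points.
    crosses = valid & (t1 >= 0.0) & (t1 <= 1.0) & (t2 >= 0.0) & (t2 <= 1.0)
    chord = np.abs(t2 - t1) * L  # |p(t2) - p(t1)| = |t2 - t1| * |d|, |d| = L

    side = np.sqrt(3.0) * R  # side of the inscribed equilateral triangle
    return np.mean(chord[crosses] > side)

if __name__ == "__main__":
    for ratio in (0.5, 1.0, 2.0, 10.0):
        print(f"L/R = {ratio:5.1f}  ->  P = {straw_throw_probability(ratio):.4f}")
```

Since the chord defined by a straw can never exceed the straw's own length, the estimate vanishes for L < sqrt(3) R and otherwise depends on L/R, in line with the abstract.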

Related research

61 - Louis Lyons 2017
We consider whether the asymptotic distributions for the log-likelihood ratio test statistic are expected to be Gaussian or chi-squared. Two straightforward examples provide insight on the difference.
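A toy numerical check of the two limiting forms mentioned in this abstract, using the textbook case of a Gaussian mean with known variance (the sample sizes and seed are illustrative, not the paper's examples):

```python
import numpy as np
from scipy import stats

# For n Gaussian observations with unit variance and unknown mean mu,
# testing H0: mu = 0, the statistic q = -2 ln[L(0)/L(mu_hat)] = n * xbar^2.
rng = np.random.default_rng(0)
n_obs, n_toys = 50, 200_000
xbar = rng.normal(0.0, 1.0, size=(n_toys, n_obs)).mean(axis=1)
q = n_obs * xbar**2                      # log-likelihood ratio statistic under H0

# q itself follows a chi-squared distribution with 1 degree of freedom ...
print("P(q > 3.84) =", np.mean(q > stats.chi2.ppf(0.95, df=1)))   # ~0.05

# ... while its signed square root, sign(mu_hat)*sqrt(q), is standard normal.
z = np.sign(xbar) * np.sqrt(q)
print("mean, std of z:", z.mean().round(3), z.std().round(3))     # ~0, ~1
```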
153 - I. Grabec 2007
Statistical modeling of experimental physical laws is based on the probability density function of the measured variables. It is expressed from experimental data via a kernel estimator, whose kernel is determined objectively by the scattering of data during calibration of the experimental setup. A physical law relating the measured variables is optimally extracted from the experimental data by the conditional average estimator, which is derived directly from the kernel estimator and corresponds to a general nonparametric regression. The proposed method is demonstrated by modeling the return map of noisy chaotic data. In this example, the nonparametric regression is used to predict a future value of a chaotic time series from the present one. The mean predictor error is used in the definition of the predictor quality, while the redundancy is expressed by the mean square distance between data points. Both statistics enter a new definition of the predictor cost function, and from its minimum an appropriate number of data points for the model is estimated.
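The conditional average estimator described here amounts to a Gaussian-kernel (Nadaraya-Watson-type) regression. A minimal sketch, with the logistic map, noise level, and kernel width chosen for illustration rather than taken from the paper:

```python
import numpy as np

# Predict x_{n+1} from x_n for noisy chaotic data via a conditional-average
# (kernel regression) estimator.
rng = np.random.default_rng(1)

# Generate a noisy chaotic time series from the logistic map.
n = 500
x = np.empty(n)
x[0] = 0.3
for i in range(n - 1):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i]) + rng.normal(0.0, 0.02)
    x[i + 1] = np.clip(x[i + 1], 0.0, 1.0)

x_now, x_next = x[:-1], x[1:]          # data pairs (x_n, x_{n+1})

def conditional_average(x_query, x_data, y_data, h=0.05):
    """Gaussian-kernel conditional average E[y | x]: a weighted mean of the
    observed y values, with weights given by the kernel in x."""
    w = np.exp(-0.5 * ((x_query[:, None] - x_data[None, :]) / h) ** 2)
    return (w @ y_data) / w.sum(axis=1)

grid = np.linspace(0.01, 0.99, 50)
pred = conditional_average(grid, x_now, x_next)
true = 4.0 * grid * (1.0 - grid)       # noiseless return map, for comparison
print("max |prediction - true map| on grid:", np.abs(pred - true).max().round(3))
```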
452 - I. Grabec 2007
A physical law is represented by the probability distribution of a measured variable. The probability density is described from the measured data by an estimator whose kernel is the instrument scattering function. The experimental information and the data redundancy are defined in terms of information entropy. The model cost function, composed of the data redundancy and the estimation error, is minimized by a creation-annihilation process.
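A minimal sketch of the first two ingredients mentioned here, assuming a Gaussian instrument scattering function of width sigma_instr and computing the information entropy of the resulting kernel density estimate (all values illustrative):

```python
import numpy as np

# Kernel density estimate whose kernel width is set by the instrument
# resolution, followed by the entropy of the estimated density.
rng = np.random.default_rng(2)
sigma_instr = 0.1                       # assumed instrument resolution
data = rng.normal(0.0, 1.0, size=200)   # measured samples

x = np.linspace(-5.0, 5.0, 1001)
kernels = np.exp(-0.5 * ((x[:, None] - data[None, :]) / sigma_instr) ** 2)
pdf = kernels.sum(axis=1) / (len(data) * sigma_instr * np.sqrt(2.0 * np.pi))

dx = x[1] - x[0]
entropy = -np.sum(np.where(pdf > 0, pdf * np.log(pdf), 0.0)) * dx
print("estimated differential entropy:", entropy.round(3))
```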
94 - K. Stenson 2006
A method to include multiplicative systematic uncertainties into branching ratio limits was proposed by M. Convery. That solution used approximations which are not necessarily valid. This note provides a solution without approximations and compares the results.
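One way to fold a multiplicative systematic into a limit without approximations is direct numerical marginalization; the Bayesian setup below, with three observed events, a 10% Gaussian efficiency uncertainty, and a flat signal prior, is an illustrative stand-in rather than the specific prescription of Convery's method or of this note:

```python
import numpy as np
from scipy import stats

n_obs = 3                      # observed events
eps0, sig_eps = 1.0, 0.10      # efficiency scale and its absolute uncertainty

s = np.linspace(0.0, 20.0, 4001)                       # signal-strength grid
ds = s[1] - s[0]
eps = np.linspace(eps0 - 5 * sig_eps, eps0 + 5 * sig_eps, 801)
deps = eps[1] - eps[0]
w = stats.norm.pdf(eps, eps0, sig_eps)                 # systematic's weight

# Likelihood marginalized over the efficiency:
#   L(s) = integral of Poisson(n_obs | eps * s) * p(eps) d(eps)
mu = eps[None, :] * s[:, None]
like = (stats.poisson.pmf(n_obs, mu) * w[None, :]).sum(axis=1) * deps

post = like / (like.sum() * ds)                        # flat prior on s >= 0
limit_90 = s[np.searchsorted(np.cumsum(post) * ds, 0.90)]
print("90% CL upper limit with systematic:", round(float(limit_90), 2))

# Delta-function efficiency (systematic switched off), for comparison.
like0 = stats.poisson.pmf(n_obs, eps0 * s)
post0 = like0 / (like0.sum() * ds)
print("90% CL upper limit without it:     ",
      round(float(s[np.searchsorted(np.cumsum(post0) * ds, 0.90)]), 2))
```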
173 - I. Grabec 2007
The extraction of a physical law y = y_o(x) from joint experimental data about x and y is treated. The joint, marginal, and conditional probability density functions (PDFs) are expressed from the given data by an estimator whose kernel is the instrument scattering function. The conditional average is proposed as an optimal estimator of y_o(x), and the analysis of its properties is based on a new definition of prediction quality. The joint experimental information and the redundancy of joint measurements are expressed by the relative entropy. With the number of experiments the redundancy on average increases, while the experimental information converges to a certain limit value. The difference between this limit value and the experimental information at a finite number of data represents the discrepancy between the experimentally determined and the true properties of the phenomenon. The sum of the discrepancy measure and the redundancy is utilized as a cost function, whose minimum specifies a reasonable number of data points for the extraction of the law y_o(x). The mutual information is defined by the marginal and the conditional PDFs of the variables, and the ratio between the mutual information and the marginal information is used to indicate which variable is the independent one. The properties of the introduced statistics are demonstrated on deterministically and randomly related variables.
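A rough histogram-based computation of the statistics mentioned at the end of this abstract (the mutual information and its ratios to the marginal informations); the kernel-based PDFs and the paper's exact decision rule are not reproduced, and the test relation y = sin(2*pi*x) + noise with 40 bins is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.random(20_000)
y = np.sin(2.0 * np.pi * x) + rng.normal(0.0, 0.05, size=x.size)

# Joint and marginal distributions from a 2-D histogram.
pxy, _, _ = np.histogram2d(x, y, bins=40, density=False)
pxy = pxy / pxy.sum()
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

def H(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

I = H(px) + H(py) - H(pxy.ravel())        # mutual information
print("I/H(x) =", round(I / H(px), 2), "  I/H(y) =", round(I / H(py), 2))
# The abstract compares such ratios for the two variables to indicate
# which one is the independent variable.
```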