
Experimental modeling of physical laws

Published by Igor Grabec
Publication date: 2007
Research field: Physics
Paper language: English
Author: I. Grabec





A physical law is represented by the probability distribution of a measured variable. The probability density is described by measured data using an estimator whose kernel is the instrument scattering function. The experimental information and data redundancy are defined in terms of information entropy. The model cost function, comprised of data redundancy and estimation error, is minimized by the creation-annihilation process.
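As a rough illustration of the estimator described above, the sketch below uses a Gaussian kernel in place of the instrument scattering function; the kernel width `sigma` and the example data are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def kernel_density(samples, x, sigma):
    """Kernel density estimate of a measured variable: an average of
    kernels centred on the measured data. A Gaussian kernel stands in
    for the instrument scattering function (an assumption here)."""
    samples = np.asarray(samples, dtype=float)
    z = (x - samples[:, None]) / sigma
    return np.mean(np.exp(-0.5 * z**2), axis=0) / (sigma * np.sqrt(2 * np.pi))

# Example: density of noisy measurements of a quantity near 1.0.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=0.2, size=500)
grid = np.linspace(0.0, 2.0, 201)
pdf = kernel_density(data, grid, sigma=0.1)
```

The estimate is a proper density: it is non-negative and integrates to one over the support of the data.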


Read also

140 - I. Grabec 2007
The extraction of a physical law y=yo(x) from joint experimental data about x and y is treated. The joint, the marginal and the conditional probability density functions (PDF) are expressed by given data over an estimator whose kernel is the instrument scattering function. As an optimal estimator of yo(x) the conditional average is proposed. The analysis of its properties is based upon a new definition of prediction quality. The joint experimental information and the redundancy of joint measurements are expressed by the relative entropy. With the number of experiments the redundancy on average increases, while the experimental information converges to a certain limit value. The difference between this limit value and the experimental information at a finite number of data represents the discrepancy between the experimentally determined and the true properties of the phenomenon. The sum of the discrepancy measure and the redundancy is utilized as a cost function. By its minimum a reasonable number of data for the extraction of the law yo(x) is specified. The mutual information is defined by the marginal and the conditional PDFs of the variables. The ratio between mutual information and marginal information is used to indicate which variable is the independent one. The properties of the introduced statistics are demonstrated on deterministically and randomly related variables.
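The conditional average proposed above as the optimal estimator of yo(x) coincides with Nadaraya-Watson kernel regression. A minimal sketch, assuming a Gaussian kernel of width `sigma` in place of the instrument scattering function (both are assumptions for illustration):

```python
import numpy as np

def conditional_average(x_data, y_data, x, sigma):
    """Conditional average estimator of the law y = yo(x): each measured
    pair is weighted by a Gaussian kernel centred on its x value, which
    is equivalent to Nadaraya-Watson nonparametric regression."""
    x_data = np.asarray(x_data, dtype=float)
    y_data = np.asarray(y_data, dtype=float)
    w = np.exp(-0.5 * ((x - x_data[:, None]) / sigma) ** 2)
    return (w * y_data[:, None]).sum(axis=0) / w.sum(axis=0)

# Recover the law y = x^2 from noisy joint measurements of (x, y).
rng = np.random.default_rng(1)
xd = rng.uniform(-1.0, 1.0, 400)
yd = xd**2 + rng.normal(0.0, 0.05, 400)
xq = np.array([-0.5, 0.0, 0.5])
est = conditional_average(xd, yd, xq, sigma=0.1)
```

At the query points the estimate tracks the underlying law up to the kernel-smoothing bias, which shrinks with `sigma`.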
125 - I. Grabec 2007
Statistical modeling of experimental physical laws is based on the probability density function of measured variables. It is expressed by experimental data via a kernel estimator. The kernel is determined objectively by the scattering of data during calibration of experimental setup. A physical law, which relates measured variables, is optimally extracted from experimental data by the conditional average estimator. It is derived directly from the kernel estimator and corresponds to a general nonparametric regression. The proposed method is demonstrated by the modeling of a return map of noisy chaotic data. In this example, the nonparametric regression is used to predict a future value of chaotic time series from the present one. The mean predictor error is used in the definition of predictor quality, while the redundancy is expressed by the mean square distance between data points. Both statistics are used in a new definition of predictor cost function. From the minimum of the predictor cost function, a proper number of data in the model is estimated.
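The return-map prediction described above can be illustrated on a noisy chaotic series; the sketch below predicts the next value from the present one via the conditional average. The logistic map, the noise level, and the kernel width are illustrative choices, not taken from the paper:

```python
import numpy as np

def cond_avg(xd, yd, x, sigma):
    # Conditional average (Nadaraya-Watson) with a Gaussian kernel.
    w = np.exp(-0.5 * ((x - xd[:, None]) / sigma) ** 2)
    return (w * yd[:, None]).sum(axis=0) / w.sum(axis=0)

# Noisy chaotic series from the logistic map x_{n+1} = 4 x_n (1 - x_n).
rng = np.random.default_rng(2)
x = np.empty(600)
x[0] = 0.3
for n in range(599):
    x[n + 1] = 4 * x[n] * (1 - x[n])
noisy = x + rng.normal(0.0, 0.01, 600)

# Return-map pairs (present value, next value) form the model data.
xd, yd = noisy[:-1], noisy[1:]
pred = cond_avg(xd, yd, noisy[:-1], sigma=0.05)
rmse = np.sqrt(np.mean((pred - noisy[1:]) ** 2))  # mean predictor error
```

The in-sample root-mean-square predictor error stays close to the noise floor, which is the statistic the abstract feeds into the predictor quality and cost function.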
We present a conclusive answer to Bertrand's paradox, a long-standing open issue in the basic physical interpretation of probability. The paradox deals with the existence of mutually inconsistent results when looking for the probability that a chord, drawn at random in a circle, is longer than the side of an inscribed equilateral triangle. We obtain a unique solution by substituting chord drawing with the throwing of a straw of finite length L on a circle of radius R, thus providing a satisfactory operative definition of the associated experiment. The obtained probability turns out to be a function of the ratio L/R, as intuitively expected.
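The mutually inconsistent results mentioned above are easy to reproduce by Monte Carlo. The sketch below simulates the three classical chord-sampling schemes, which yield probabilities 1/3, 1/2, and 1/4 respectively; it does not reproduce the paper's straw-throwing experiment:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000
R = 1.0
side = np.sqrt(3) * R  # side of the inscribed equilateral triangle

# Scheme 1: two random endpoints on the circle -> P = 1/3.
a, b = rng.uniform(0.0, 2 * np.pi, (2, N))
chord1 = 2 * R * np.abs(np.sin((a - b) / 2))

# Scheme 2: random distance of the midpoint along a radius -> P = 1/2.
d = rng.uniform(0.0, R, N)
chord2 = 2 * np.sqrt(R**2 - d**2)

# Scheme 3: midpoint uniform over the disk -> P = 1/4.
r = R * np.sqrt(rng.uniform(0.0, 1.0, N))
chord3 = 2 * np.sqrt(R**2 - r**2)

p = [np.mean(c > side) for c in (chord1, chord2, chord3)]
```

Three different operative definitions of "a chord drawn at random" give three different answers, which is exactly the ambiguity the straw-throwing experiment is designed to remove.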
328 - I. Grabec 2007
Redundancy of experimental data is the basic statistic from which the complexity of a natural phenomenon and the proper number of experiments needed for its exploration can be estimated. The redundancy is expressed by the entropy of information pertaining to the probability density function of experimental variables. Since the calculation of entropy is inconvenient due to integration over a range of variables, an approximate expression for redundancy is derived that includes only a sum over the set of experimental data about these variables. The approximation makes feasible an efficient estimation of the redundancy of data along with the related experimental information and information cost function. From the experimental information the complexity of the phenomenon can be simply estimated, while the proper number of experiments needed for its exploration can be determined from the minimum of the cost function. The performance of the approximate estimation of these statistics is demonstrated on two-dimensional normally distributed random data.
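An entropy approximation that "includes only a sum over the set of experimental data" resembles the standard resubstitution estimate H ≈ -(1/N) Σ log f̂(x_i), with f̂ a kernel density estimate evaluated at the data themselves. A sketch under that assumption (the paper's exact expression may differ):

```python
import numpy as np

def entropy_resubstitution(data, sigma):
    """Approximate differential entropy without numerical integration:
    evaluate a Gaussian kernel density estimate at the data points and
    average the negative log-density over the data set."""
    data = np.asarray(data, dtype=float)
    z = (data[None, :] - data[:, None]) / sigma
    f = np.mean(np.exp(-0.5 * z**2), axis=0) / (sigma * np.sqrt(2 * np.pi))
    return -np.mean(np.log(f))

# Check against the known entropy of N(0,1): 0.5*log(2*pi*e) ~ 1.4189.
rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, 2000)
h = entropy_resubstitution(x, sigma=0.3)
```

The pairwise-distance sum replaces the integral over the variable's range, which is what makes the redundancy cheap to estimate from data alone.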
Amplitude analysis is a powerful technique to study hadron decays. A significant complication in these analyses is the treatment of instrumental effects, such as background and selection efficiency variations, in the multidimensional kinematic phase space. This paper reviews conventional methods to estimate efficiency and background distributions and outlines the methods of density estimation using Gaussian processes and artificial neural networks. Such techniques see widespread use elsewhere, but have not gained popularity in use for amplitude analyses. Finally, novel applications of these models are proposed, to estimate background density in the signal region from the sidebands in multiple dimensions, and a more general method for model-assisted density estimation using artificial neural networks.