We review the information geometry of linear systems and its application to Bayesian inference, as well as the simplification available in the Kähler manifold case. We find conditions for the information geometry of linear systems to be Kähler, and the relation of the Kähler potential to information-geometric quantities such as the $\alpha$-divergence, the information distance and the dual $\alpha$-connection structure. The Kähler structure simplifies the calculation of the metric tensor, connection, Ricci tensor and scalar curvature, as well as the $\alpha$-generalization of these geometric objects. The Laplace–Beltrami operator is also simplified in Kähler geometry. One of the goals in information geometry is the construction of Bayesian priors outperforming the Jeffreys prior, which we use to demonstrate the utility of the Kähler structure.
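The simplifications mentioned above can be sketched with the standard Kähler-geometry identities (stated here for illustration, in a common sign and normalization convention, not quoted from the text): once a Kähler potential $K$ is known, the metric, connection, Ricci tensor and Laplace–Beltrami operator all follow by differentiating scalar functions,

$$ g_{i\bar{j}} = \partial_i \partial_{\bar{j}} K, \qquad \Gamma^{k}_{ij} = g^{k\bar{l}}\,\partial_i g_{j\bar{l}}, \qquad R_{i\bar{j}} = -\,\partial_i \partial_{\bar{j}} \log \det g, \qquad \Delta f = 2\, g^{i\bar{j}}\, \partial_i \partial_{\bar{j}} f. $$

In particular, the Ricci tensor needs no Christoffel symbols and the Laplacian contains no first-derivative terms, which is what makes a superharmonicity condition such as $\Delta \psi < 0$ comparatively easy to check on a Kähler manifold.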
We prove the correspondence between the information geometry of a signal filter and a Kähler manifold. The information geometry of a minimum-phase linear system with a finite complex cepstrum norm is a Kähler manifold. The square of the complex cepstrum norm of the signal filter corresponds to the Kähler potential. The Hermitian structure of the Kähler manifold is explicitly emergent if and only if the impulse response function of the highest degree in $z$ is constant in the model parameters. Kählerian information geometry takes advantage of more efficient calculation steps for the metric tensor and the Ricci tensor. Moreover, the $\alpha$-generalization of the geometric tensors is linear in $\alpha$. Finding Bayesian predictive priors, such as superharmonic priors, is also more robust because the Laplace–Beltrami operator on a Kähler manifold takes a much simpler form than on a non-Kähler manifold. Several time series models are studied within Kählerian information geometry.
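The potential-to-metric correspondence can be checked numerically on the simplest minimum-phase example. The sketch below (an illustration under our own assumptions, not code from the paper) takes an AR(1) filter $h(z) = 1/(1 - a z^{-1})$ with $|a| < 1$, whose complex cepstrum is $\log h(z) = \sum_{n \ge 1} (a^n/n) z^{-n}$; the squared cepstrum norm then plays the role of the Kähler potential, and differentiating it should reproduce the closed-form metric $1/(1-|a|^2)$.

```python
import numpy as np

# Illustrative sketch: for the AR(1) filter h(z) = 1 / (1 - a z^{-1}),
# the complex cepstrum coefficients are a^n / n, so the squared cepstrum
# norm (taken here as the Kahler potential) is K(a) = sum_n |a|^{2n} / n^2.
# The Kahler metric g_{a abar} = d^2 K / (da dabar) should equal 1/(1-|a|^2).

def kahler_potential(a, terms=2000):
    """Truncated squared complex-cepstrum norm of the AR(1) filter."""
    n = np.arange(1, terms + 1)
    return np.sum(np.abs(a) ** (2 * n) / n ** 2)

def metric_numeric(a, h=1e-4, terms=2000):
    """g_{a abar} via the identity d_a d_abar = (d_xx + d_yy) / 4
    for the holomorphic coordinate a = x + i y (finite differences)."""
    K = lambda x, y: kahler_potential(x + 1j * y, terms)
    x, y = a.real, a.imag
    Kxx = (K(x + h, y) - 2 * K(x, y) + K(x - h, y)) / h ** 2
    Kyy = (K(x, y + h) - 2 * K(x, y) + K(x, y - h)) / h ** 2
    return (Kxx + Kyy) / 4

a = 0.5 + 0j
g_num = metric_numeric(a)
g_closed = 1 / (1 - abs(a) ** 2)  # closed-form metric, here 4/3
print(g_num, g_closed)
```

The closed form follows by summing the series: $\partial_a \partial_{\bar a} \sum_n a^n \bar{a}^n / n^2 = \sum_n |a|^{2(n-1)} = 1/(1-|a|^2)$, so the numerical and analytic values should agree to finite-difference accuracy.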