We review the information geometry of linear systems and its application to Bayesian inference, together with the simplifications available in the Kähler manifold case. We find conditions under which the information geometry of a linear system is Kähler, and we relate the Kähler potential to information-geometric quantities such as the $\alpha$-divergence, the information distance, and the dual $\alpha$-connection structure. The Kähler structure simplifies the calculation of the metric tensor, connection, Ricci tensor, and scalar curvature, as well as the $\alpha$-generalizations of these geometric objects. The Laplace--Beltrami operator is also simplified in Kähler geometry. To demonstrate the utility of the Kähler structure, we apply it to one of the goals of information geometry: the construction of Bayesian priors outperforming the Jeffreys prior.
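For orientation, the simplifications alluded to above are the textbook Kähler identities: every geometric object is generated by mixed holomorphic--antiholomorphic derivatives of a single scalar function, the Kähler potential, written here as $\mathcal{K}$ (our notation; sign and factor conventions vary across references):
\[
g_{i\bar{j}} = \partial_i \partial_{\bar{j}} \mathcal{K}, \qquad
\Gamma^{k}_{ij} = g^{k\bar{l}}\, \partial_i g_{j\bar{l}}, \qquad
R_{i\bar{j}} = -\,\partial_i \partial_{\bar{j}} \log \det g, \qquad
\Delta f = 2\, g^{i\bar{j}}\, \partial_i \partial_{\bar{j}} f.
\]
The computational payoff is that the non-mixed Christoffel symbols vanish on a Kähler manifold, so the connection, the Ricci tensor, and the Laplace--Beltrami operator require only mixed second derivatives rather than the full Levi-Civita machinery.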