Corpus ID: 237142612

# Kähler information manifolds of signal processing filters in weighted Hardy spaces

@inproceedings{Choi2021KahlerIM,
title={K\"ahler information manifolds of signal processing filters in weighted Hardy spaces},
author={Jaehyung Choi},
year={2021}
}
We generalize Kähler information manifolds of complex-valued signal processing filters by introducing weighted Hardy spaces and generic composite functions of transfer functions. We prove that the Riemannian geometry induced from weighted Hardy norms for composite functions of a filter's transfer function is a Kähler manifold. Additionally, the Kähler potential of the linear system geometry corresponds to the square of the weighted Hardy norm for composite functions of the transfer function. By…
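The abstract's central claim can be sketched in formulas. This is a hedged reconstruction from the abstract alone: the weight sequence $w_r$, the parametrized transfer function $h(z;\xi)$, and the composite map $\varphi$ are notational assumptions, not taken from the paper's body.

```latex
% Weighted Hardy norm of f(z) = \sum_{r \ge 0} f_r z^r with weights w_r > 0:
\| f \|_{H^2_w}^2 = \sum_{r \ge 0} w_r \, |f_r|^2 .

% For a filter with transfer function h(z;\xi), \xi = (\xi^1,\dots,\xi^n),
% and a composite function \varphi \circ h, the stated correspondence is that
% the Kähler potential is the squared weighted Hardy norm:
\mathcal{K}(\xi,\bar\xi) = \bigl\| \varphi\bigl(h(\cdot\,;\xi)\bigr) \bigr\|_{H^2_w}^2 ,

% from which the Kähler metric follows in the usual way:
g_{i\bar\jmath} = \partial_{\xi^i} \partial_{\bar\xi^{\jmath}} \, \mathcal{K}(\xi,\bar\xi) .
```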
