• Corpus ID: 237142612

Kähler information manifolds of signal processing filters in weighted Hardy spaces

  • Jaehyung Choi
We generalize Kähler information manifolds of complex-valued signal processing filters by introducing weighted Hardy spaces and generic composite functions of transfer functions. We prove that the Riemannian geometry induced from weighted Hardy norms for composite functions of a filter's transfer function is a Kähler manifold. Additionally, the Kähler potential of the linear system geometry corresponds to the square of the weighted Hardy norm for the composite function of the transfer function. By… 
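As a sketch of the stated correspondence, in notation assumed here rather than taken from the abstract (transfer function \(h(z;\xi)\) with filter parameters \(\xi\), composite function \(f\), and weighted Hardy norm \(\|\cdot\|_{H^2_w}\)):

```latex
% Kähler potential as the squared weighted Hardy norm of the
% composite transfer function (assumed notation):
\mathcal{K}(\xi,\bar{\xi}) \;=\; \bigl\| f\!\left(h(z;\xi)\right) \bigr\|_{H^{2}_{w}}^{2}
% The Kähler metric is then the complex Hessian of the potential:
g_{i\bar{\jmath}} \;=\; \partial_{i}\,\partial_{\bar{\jmath}}\,\mathcal{K}(\xi,\bar{\xi})
```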



Kählerian Information Geometry for Signal Processing

The correspondence between the information geometry of a signal filter and a Kähler manifold is proved, and Bayesian predictive priors such as superharmonic priors are found, because Laplace–Beltrami operators on Kähler manifolds take much simpler forms than those on non-Kähler manifolds.

Information Geometry of Covariance Matrix: Cartan-Siegel Homogeneous Bounded Domains, Mostow/Berger Fibration and Fréchet Median

Information geometry was introduced by Rao, and axiomatized by Chentsov, to define a distance between statistical distributions that is invariant under non-singular reparameterization.

Koszul Information Geometry and Souriau Geometric Temperature/Capacity of Lie Group Thermodynamics

The Koszul–Vinberg characteristic function (KVCF) on convex cones is presented as a cornerstone of information geometry, defining the Koszul entropy as the Legendre transform of minus the logarithm of the KVCF, and the Fisher information metric as the Hessian of these dual functions, invariant under their automorphisms.
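The construction described above can be sketched as follows, in notation assumed here (\(\Omega\) a convex cone, \(\Omega^{*}\) its dual cone):

```latex
% Koszul-Vinberg characteristic function on a convex cone \Omega:
\chi_{\Omega}(x) \;=\; \int_{\Omega^{*}} e^{-\langle x,\,\xi\rangle}\, d\xi
% Koszul entropy as the Legendre transform of
% \Phi(x) = -\log \chi_{\Omega}(x):
S(x^{*}) \;=\; \langle x, x^{*}\rangle - \Phi(x),
\qquad x^{*} = d\Phi(x)
% Fisher information metric as the Hessian of \Phi:
g \;=\; \mathrm{Hess}\,\Phi \;=\; \nabla^{2}\bigl(-\log \chi_{\Omega}(x)\bigr)
```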

Differential geometry of a parametric family of invertible linear systems—Riemannian metric, dual affine connections, and divergence

  • S. Amari
  • Mathematics, Computer Science
    Mathematical systems theory
  • 2005
A new geometric method and framework for analyzing properties of manifolds of systems, using a Riemannian metric and a pair of dual affine connections to solve the problem of approximating a given system by one included in a model.

Application of Kähler manifold to signal processing and Bayesian inference

One of the goals of information geometry is the construction of Bayesian priors outperforming the Jeffreys prior, which is used here to demonstrate the utility of the Kähler structure.

Symplectic and Kähler Structures on Statistical Manifolds Induced from Divergence Functions

Divergence functions play a central role in information geometry. Given a manifold \({\mathcal M}\), a divergence function \({\mathcal D}\) is a smooth, non-negative function on the product manifold
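The standard way a divergence induces a Riemannian metric, sketched here in assumed notation consistent with the snippet above (\(\mathcal{D}(\theta,\theta')\) smooth and non-negative on \({\mathcal M}\times{\mathcal M}\), vanishing iff \(\theta=\theta'\)):

```latex
% Riemannian metric induced by a divergence function:
g_{ij}(\theta) \;=\; -\left.
  \frac{\partial^{2}}{\partial\theta^{i}\,\partial\theta'^{j}}\,
  \mathcal{D}(\theta,\theta')
\right|_{\theta'=\theta}
```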

Geometric shrinkage priors for Kählerian signal filters

An efficient and robust algorithm for finding superharmonic priors which outperform the Jeffreys prior is introduced and several ansatze for the Bayesian predictive priors are suggested.

An Introduction to the Theory of Reproducing Kernel Hilbert Spaces

Reproducing kernel Hilbert spaces have developed into an important tool in many areas, especially statistics and machine learning, and they play a valuable role in complex analysis, probability,

Shrinkage Priors on Complex-Valued Circular-Symmetric Autoregressive Processes

This work investigates shrinkage priors on power spectral densities for complex-valued circular-symmetric autoregressive processes and proposes general constructions of objective priors for Kähler parameter spaces by utilizing a positive continuous eigenfunction of the Laplace–Beltrami operator with a negative eigenvalue.

Superharmonic priors for autoregressive models

Tanaka and Komaki (Sankhya Ser A Indian Stat Inst 73-A:162–184, 2011) proposed superharmonic priors in Bayesian time series analysis as an alternative to the famous Jeffreys prior. By definition the