Information Properties of a Random Variable Decomposition through Lattices

Fábio C. C. Meneghetti, Henrique K. Miyamoto, Sueli Irene Rodrigues Costa
A full-rank lattice in Euclidean space is a discrete set formed by all integer linear combinations of a basis. Given a probability distribution on R^n, two operations can be induced by considering the quotient of the space by such a lattice: wrapping and quantization. For a lattice Λ and a fundamental domain D that tiles R^n through Λ, the wrapped distribution over the quotient is obtained by summing the density over each coset, while the quantized distribution over the lattice is defined…
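The two operations described above can be sketched in one dimension. The following is a minimal illustration (not the paper's construction), assuming the lattice Λ = Z, the fundamental domain D = [0, 1), and a standard Gaussian source density; both induced distributions are normalized.

```python
import math

# 1-D sketch: lattice Λ = Z, fundamental domain D = [0, 1),
# source density p = standard Gaussian N(0, 1).
def gaussian_pdf(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def gaussian_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def wrapped_density(t, terms=50):
    """Wrapped density on D: sum the source density over the coset t + Z."""
    return sum(gaussian_pdf(t + k) for k in range(-terms, terms + 1))

def quantized_pmf(k):
    """Quantized distribution on Λ = Z: mass of the cell k + D under p."""
    return gaussian_cdf(k + 1) - gaussian_cdf(k)

# Both induced distributions carry the full probability mass:
total_wrapped = sum(wrapped_density(i / 1000) for i in range(1000)) / 1000
total_quantized = sum(quantized_pmf(k) for k in range(-50, 50))
print(total_wrapped, total_quantized)  # both ≈ 1
```

The Riemann sum over D approximates the integral of the wrapped density; exchanging the sum over cosets with the integral over D recovers the total mass of the original density, which is why both totals come out to 1.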


On lattice quantization noise

  • R. Zamir, M. Feder
  • Computer Science
    Proceedings of IEEE Data Compression Conference (DCC'94)
  • 1994
The authors find that the noise associated with the optimal lattice quantizers is wide-sense stationary and white, and any desirable noise spectra may be realized by an appropriate linear transformation ("shaping") of a lattice quantizer.

Multivariate normal distributions, Fisher information and matrix inequalities

Using appropriately parameterized families of multivariate normal distributions and basic properties of the Fisher information matrix for normal random vectors, we provide statistical proofs of the…

Lattice Coding for Signals and Networks: A Structured Coding Approach to Quantization, Modulation and Multiuser Information Theory

It is shown how high dimensional lattice codes can close the gap to the optimal information theoretic solution, including the characterisation of error exponents, when generalising the framework to Gaussian networks.

On the Lattice Smoothing Parameter Problem

A tighter worst-case to average-case reduction for basing cryptography on the worst-case hardness of the GapSPP problem is demonstrated, with an Õ(√n) smaller approximation factor than the GapSVP problem.

Information-Theoretic Inequalities on Unimodular Lie Groups

In this paper, several definitions are extended from the Euclidean setting to that of Lie groups (including entropy and the Fisher information matrix), and inequalities analogous to those in classical information theory are derived and stated in the form of fifteen small theorems.

Achieving AWGN Channel Capacity With Lattice Gaussian Coding

The notion of good constellations, which carry almost the same mutual information as that of continuous Gaussian inputs, is introduced and addressed for the proposed lattice Gaussian coding scheme.

The Kullback–Leibler Divergence Between Lattice Gaussian Distributions

  • F. Nielsen
  • Computer Science
    Journal of the Indian Institute of Science
  • 2022
This paper illustrates how to use the Kullback-Leibler divergence to calculate the Chernoff information on the dually flat structure of the manifold of lattice Gaussian distributions.

Variations on a Theme by Massey

  • O. Rioul
  • Computer Science
    IEEE Transactions on Information Theory
  • 2022
A link is established between the two works by Jim Massey in the more general framework of the relationship between discrete (absolute) entropy and continuous (differential) entropy.
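The discrete-versus-differential entropy relationship this entry refers to can be checked numerically. Below is my sketch (not taken from the paper), assuming a standard Gaussian quantized with a fine step delta; the discrete entropy then satisfies H(Q_delta(X)) ≈ h(X) - log(delta), linking absolute and differential entropy.

```python
import math

def gaussian_cdf(x):
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

delta = 0.01
# Cell probabilities of N(0, 1) quantized to bins [k*delta, (k+1)*delta),
# covering roughly [-8, 8] (the remaining tail mass is negligible).
K = int(8 / delta)
probs = [gaussian_cdf((k + 1) * delta) - gaussian_cdf(k * delta)
         for k in range(-K, K)]
H = -sum(p * math.log(p) for p in probs if p > 0)  # discrete entropy (nats)

h = 0.5 * math.log(2 * math.pi * math.e)  # differential entropy of N(0, 1)
print(H, h - math.log(delta))  # the two values nearly coincide
```

As delta shrinks, the discrete entropy diverges like -log(delta), which is precisely why a continuous variable has infinite absolute entropy while its differential entropy stays finite.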

Entropy and Convergence on Compact Groups

We investigate the behaviour of the entropy of convolutions of independent random variables on compact groups. We provide an explicit exponential bound on the rate of convergence of entropy to its…

Almost Universal Codes for MIMO Wiretap Channels

A fading wiretap channel model where the transmitter has only partial statistical channel state information is considered, and concrete lattice codes satisfying this design criterion are proposed, which are almost universal in the sense that a fixed code is good for secrecy for a wide range of fading models.