Probabilistic learning constrained by realizations using a weak formulation of Fourier transform of probability measures

@article{Soize2022ProbabilisticLC,
  title={Probabilistic learning constrained by realizations using a weak formulation of Fourier transform of probability measures},
  author={Christian Soize},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.03078}
}
This paper deals with taking into account a given set of realizations as constraints in the Kullback-Leibler minimum principle, which is used as a probabilistic learning algorithm. This allows data to be integrated effectively into predictive models. We consider the probabilistic learning of a random vector that is made up of either a quantity of interest (unsupervised case) or the pair formed by the quantity of interest and a control parameter (supervised case). A training set of independent…
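
As a rough illustration of the kind of formulation the abstract describes, the sketch below applies the Kullback-Leibler minimum cross-entropy principle to reweight a prior sample so that its characteristic function (the Fourier transform of the probability measure) matches, at a few sampled frequency points, the empirical characteristic function of a given set of realizations. This is a minimal sketch under simplifying assumptions, not the algorithm developed in the paper; the data, the frequency sampling, and all names (`x_prior`, `x_constraints`, `v_points`) are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Prior training set: independent realizations of a prior probabilistic model
# (placeholder standard Gaussian here).
x_prior = rng.normal(loc=0.0, scale=1.0, size=(2000, 2))

# Given set of realizations to be taken into account as constraints.
x_constraints = rng.normal(loc=0.7, scale=0.8, size=(50, 2))

# Frequency points at which the Fourier transform of the probability measure
# (the characteristic function) is matched in a weak, sampled sense.
v_points = rng.normal(loc=0.0, scale=1.0, size=(8, 2))

def char_features(x, v):
    """Real and imaginary parts of exp(i <v, x>) for each sample x and frequency v."""
    phase = x @ v.T                                      # shape (n_samples, n_freq)
    return np.concatenate([np.cos(phase), np.sin(phase)], axis=1)

G = char_features(x_prior, v_points)                     # constraint functions on prior samples
b = char_features(x_constraints, v_points).mean(axis=0)  # empirical characteristic-function targets

def dual(lam):
    # Convex dual of: minimize KL(w || uniform) subject to sum_j w_j g(x_j) = b.
    # The minimizer has weights w_j proportional to exp(-lam . g(x_j)).
    z = -G @ lam
    m = z.max()
    return m + np.log(np.exp(z - m).sum()) + lam @ b

lam = minimize(dual, x0=np.zeros(G.shape[1]), method="BFGS").x

w = np.exp(-G @ lam)
w /= w.sum()                                             # posterior weights on the prior samples

print("max constraint residual:", np.abs(G.T @ w - b).max())
```

The dual function is convex in the Lagrange multipliers, so a quasi-Newton solve suffices for this toy problem. Matching the real and imaginary parts of exp(i v·x) at a few sampled frequencies is one simple way to turn a set of realizations into moment-type constraints for a minimum cross-entropy problem; the paper's weak formulation of the Fourier transform of probability measures is considerably more general.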
