Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents
@article{Barp2022GeometricMF, title={Geometric Methods for Sampling, Optimisation, Inference and Adaptive Agents}, author={Alessandro Barp and Lancelot Da Costa and Guilherme França and Karl John Friston and Mark A. Girolami and M.I. Jordan and Grigorios A. Pavliotis}, journal={ArXiv}, year={2022}, volume={abs/2203.10592} }
13 Citations
Reward Maximisation through Discrete Active Inference
- Computer Science
- 2020
This paper shows the conditions under which active inference produces the optimal solution to the Bellman equation—a formulation that underlies several approaches to model-based reinforcement learning and control.
Targeted Separation and Convergence with Kernel Discrepancies
- Mathematics, ArXiv
- 2022
Maximum mean discrepancies (MMDs) like the kernel Stein discrepancy (KSD) have grown central to a wide range of applications, including hypothesis testing, sampler selection, distribution…
Nesterov smoothing for sampling without smoothness
- Mathematics, Computer Science
- 2022
A novel sampling algorithm is proposed for a class of non-smooth potentials: the potentials are approximated by smooth ones using a technique akin to Nesterov smoothing, and the accuracy of the resulting algorithm is guaranteed.
Modelling non-reinforced preferences using selective attention
- Computer Science, ArXiv
- 2022
Nore is validated in a modified OpenAI Gym FrozenLake environment with and without volatility under a model of the environment—and is compared to Pepper, a Hebbian preference learning mechanism.
A Worked Example of the Bayesian Mechanics of Classical Objects
- Mathematics
- 2022
Bayesian mechanics is a new approach to studying the mathematics and physics of interacting stochastic processes. In this note, we provide a worked example of a physical mechanics for classical…
On Bayesian Mechanics: A Physics of and by Beliefs
- Computer Science
- 2022
A duality between the free energy principle and the constrained maximum entropy principle is examined, both of which lie at the heart of Bayesian mechanics.
Particular flows and attracting sets: A comment on "How particular is the physics of the free energy principle?" by Aguilera, Millidge, Tschantz and Buckley.
- Physics, Physics of Life Reviews
- 2022
Regarding flows under the free energy principle: A comment on "How particular is the physics of the free energy principle?" by Aguilera, Millidge, Tschantz, and Buckley.
- Physics, Physics of Life Reviews
- 2022
Towards a Geometry and Analysis for Bayesian Mechanics
- Computer Science
- 2022
A simple case of Bayesian mechanics under the free energy principle is formulated in axiomatic terms, providing a related, but alternative, formalism to those driven purely by descriptions of random dynamical systems, and taking a further step towards a comprehensive statement of the physics of self-organisation in formal mathematical language.
Entropy-Maximising Diffusions Satisfy a Parallel Transport Law
- Physics
- 2022
We show that the principle of maximum entropy, a variational method appearing in statistical inference, statistical physics, and the analysis of stochastic dynamical systems, admits a geometric…
References
Showing 1-10 of 309 references
Exponential convergence of Langevin distributions and their discrete approximations
- Mathematics
- 1996
In this paper we consider a continuous-time method of approximating a given distribution $\pi$ using the Langevin diffusion $dL_t = dW_t + \tfrac{1}{2}\nabla \log \pi(L_t)\,dt$. We find conditions under which this diffusion converges…
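As a concrete illustration of the discrete approximations studied in this reference, the following is a minimal sketch of the unadjusted Langevin algorithm, i.e. the Euler-Maruyama discretisation of the diffusion above. The standard Gaussian target, step size and chain length are illustrative choices, not taken from the paper.

```python
import numpy as np

def grad_log_pi(x):
    # Score of the (illustrative) standard Gaussian target: grad log pi(x) = -x.
    return -x

def ula(x0, h=0.01, n_steps=10_000, rng=None):
    """Euler-Maruyama discretisation of dL_t = dW_t + 0.5 * grad log pi(L_t) dt."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        x = x + 0.5 * h * grad_log_pi(x) + np.sqrt(h) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

chain = ula(x0=np.zeros(2))
print(chain.mean(axis=0), chain.var(axis=0))  # roughly 0 and 1 for long chains
```

Without a Metropolis correction such a chain carries a step-size-dependent bias; the conditions under which the diffusion and its discretisations converge exponentially quickly are the subject of this reference.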
Integral Probability Metrics and Their Generating Classes of Functions
- Mathematics, Computer Science, Advances in Applied Probability
- 1997
A unified study of integral probability metrics is given, and it is shown how several interesting properties of these metrics arise directly from conditions on the generating class of functions.
Optimization on manifolds: A symplectic approach
- Mathematics
- 2021
There has been great interest in using tools from dynamical systems and numerical analysis of differential equations to understand and construct new optimization methods. In particular, recently a…
On dissipative symplectic integration with applications to gradient-based optimization
- Computer Science, Mathematics
- 2020
A generalization of symplectic integrators to non-conservative and, in particular, dissipative Hamiltonian systems is shown to preserve rates of convergence up to a controlled error, enabling the derivation of 'rate-matching' algorithms without the need for a discrete convergence analysis.
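As a rough sketch of the idea (not necessarily the exact scheme of this reference), one standard way to obtain a dissipative (conformal) symplectic integrator is to compose the exact flow of the damping term with a symplectic Euler step for the conservative part; for $H(q,p) = \tfrac{1}{2}|p|^2 + f(q)$ with friction gamma this yields a momentum-style optimiser. The quadratic objective and step sizes below are hypothetical.

```python
import numpy as np

def conformal_symplectic_euler(grad_f, q0, h=0.1, gamma=1.0, n_steps=500):
    """Splitting integrator for the dissipative Hamiltonian system
    dq/dt = p, dp/dt = -grad_f(q) - gamma * p: damp the momentum exactly,
    then take one symplectic Euler step of the conservative dynamics."""
    q = np.asarray(q0, dtype=float)
    p = np.zeros_like(q)
    for _ in range(n_steps):
        p = np.exp(-gamma * h) * p      # exact flow of the dissipative part
        p = p - h * grad_f(q)           # momentum update (force from f)
        q = q + h * p                   # position update
    return q

# Illustrative use: minimise f(q) = 0.5 * ||q||^2, whose gradient is q.
print(conformal_symplectic_euler(lambda q: q, q0=np.array([3.0, -2.0])))
```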
Deep active inference agents using Monte-Carlo methods
- Computer Science, NeurIPS
- 2020
A neural architecture is presented for building deep active inference agents that operate in complex, continuous state-spaces using multiple forms of Monte-Carlo (MC) sampling, enabling agents to learn environmental dynamics efficiently while maintaining task performance relative to reward-based counterparts.
Active Inference: Demystified and Compared
- Computer Science, Neural Computation
- 2021
This letter aims to demystify the behavior of active inference agents by presenting an accessible discrete state-space and time formulation, and demonstrates these behaviors in an OpenAI Gym environment alongside reinforcement learning agents.
The Variational Formulation of the Fokker-Planck Equation
- Mathematics, Physics
- 1996
The Fokker-Planck equation, or forward Kolmogorov equation, describes the evolution of the probability density for a stochastic process associated with an Itô stochastic differential equation. It…
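For reference (standard material, not specific to this entry), for an Itô SDE $dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t$ the density $\rho(x,t)$ of $X_t$ evolves according to the Fokker-Planck equation

```latex
\partial_t \rho(x,t)
  = -\nabla \cdot \bigl( b(x)\, \rho(x,t) \bigr)
  + \tfrac{1}{2} \sum_{i,j} \partial_{x_i} \partial_{x_j}
      \bigl[ (\sigma \sigma^{\top})_{ij}(x)\, \rho(x,t) \bigr].
```

In particular, for the Langevin diffusion $dL_t = dW_t + \tfrac{1}{2}\nabla \log \pi(L_t)\,dt$ cited above this reduces to $\partial_t \rho = \tfrac{1}{2}\nabla \cdot \bigl( \rho\, \nabla \log \tfrac{\rho}{\pi} \bigr)$, whose stationary density is $\pi$; the variational formulation of this reference identifies the evolution as a gradient flow of a free-energy functional with respect to the Wasserstein metric.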
Stochastic Processes and Applications: Diffusion Processes, the Fokker-Planck and Langevin Equations. Volume 60, Texts in Applied Mathematics
- 2014
Active inference on discrete state-spaces: A synthesis
- Computer Science, Journal of Mathematical Psychology
- 2020
Riemann manifold Langevin and Hamiltonian Monte Carlo methods
- Computer Science, Journal of the Royal Statistical Society: Series B (Statistical Methodology)
- 2011
The proposed methodology automatically adapts to the local structure of the manifold when simulating paths across it, providing highly efficient convergence and exploration of the target density; substantial improvements in the time-normalized effective sample size are reported when compared with alternative sampling approaches.