Corpus ID: 251442371

Adaptive Kernel Density Estimation proposal in gravitational wave data analysis

@inproceedings{Falxa2022AdaptiveKD,
  title={Adaptive Kernel Density Estimation proposal in gravitational wave data analysis},
  author={Mikel Falxa and Stanislav Babak and Maude Le Jeune},
  year={2022}
}
The Markov Chain Monte Carlo (MCMC) approach is frequently used within the Bayesian framework to sample the target posterior distribution. Its efficiency strongly depends on the proposal used to build the chain. The best jump proposal is the one that closely resembles the unknown target distribution; we therefore suggest an adaptive proposal based on Kernel Density Estimation (KDE). We group the parameters of the model according to their correlations and build a KDE from the already accepted points for each…
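As an illustration of the kind of proposal the abstract describes, the sketch below builds a KDE on the already accepted points of one correlated parameter group and draws an independent proposal from it, using scipy.stats.gaussian_kde as a stand-in for the paper's adaptive KDE. The grouping, the update schedule, and the kde_proposal helper are assumptions made for this example, not the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_proposal(accepted, group, rng):
    """Draw a jump proposal for one correlated parameter group.

    accepted : (n_samples, n_dim) array of already accepted chain points
    group    : indices of the parameters forming one correlated block
    Returns a proposed sub-vector and its proposal log-density, needed for
    the Metropolis-Hastings ratio since the proposal is independent of the
    current point.
    """
    samples = accepted[:, group].T          # gaussian_kde expects (dim, n)
    kde = gaussian_kde(samples)             # Gaussian KDE, Scott's rule bandwidth
    draw = kde.resample(1, seed=rng)[:, 0]  # one proposed point for this group
    logq = kde.logpdf(draw)[0]              # log q(draw), for detailed balance
    return draw, logq

# Toy usage: two correlated parameters, 500 accepted points
rng = np.random.default_rng(0)
chain = rng.multivariate_normal([0.0, 1.0], [[1.0, 0.8], [0.8, 1.0]], size=500)
proposal, logq = kde_proposal(chain, group=[0, 1], rng=rng)
```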

References

Showing 1-10 of 33 references

Data Analysis Recipes: Using Markov Chain Monte Carlo

It is argued that autocorrelation time is the most important test for convergence, as it directly connects to the uncertainty on the sampling estimate of any quantity of interest.
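A minimal numpy sketch of the integrated autocorrelation time estimate that this summary refers to; the FFT-based autocorrelation and the Sokal-style self-consistent window are one common convention, and the function name is ours.

```python
import numpy as np

def integrated_autocorr_time(x, c=5.0):
    """Estimate the integrated autocorrelation time of a 1-D chain.

    Uses the FFT-based autocorrelation function and a self-consistent
    window: stop at the first lag M with M >= c * tau(M).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    # Autocorrelation via FFT, zero-padded to avoid circular wrap-around
    f = np.fft.rfft(x, n=2 * n)
    acf = np.fft.irfft(f * np.conjugate(f))[:n]
    acf /= acf[0]
    tau = 1.0 + 2.0 * np.cumsum(acf[1:])
    window = np.arange(1, n) >= c * tau     # self-consistent window condition
    m = np.argmax(window) if window.any() else n - 2
    return tau[m]

# The effective sample size of a chain of length N is roughly N / tau.
```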

Learn From Thy Neighbor: Parallel-Chain and Regional Adaptive MCMC

This paper draws attention to the deficient performance of standard adaptation when the target distribution is multimodal and proposes a parallel chain adaptation strategy that incorporates multiple Markov chains which are run in parallel.
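A rough sketch of the parallel-chain adaptation idea, assuming the simplest variant: pooling samples from several chains to learn one shared Gaussian proposal covariance. The regional part of the paper's scheme is not reproduced, and the helper below is illustrative only.

```python
import numpy as np

def pooled_proposal_cov(chains, eps=1e-6):
    """Shared Gaussian proposal covariance learned from parallel chains.

    chains : list of (n_i, d) arrays, one per parallel chain.
    Pooling the chains lets each chain "learn from its neighbours", which
    helps when a single chain only explores one mode of the target.
    """
    pooled = np.vstack(chains)
    d = pooled.shape[1]
    scale = 2.38**2 / d                      # standard adaptive-Metropolis scaling
    return scale * np.cov(pooled, rowvar=False) + eps * np.eye(d)
```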

An Improved Variable Kernel Density Estimator Based on L2 Regularization

An improved variable KDE is proposed which determines the optimal bandwidth for each data point in the given dataset based on the integrated squared error (ISE) criterion with an L2 regularization term; an effective optimization algorithm is developed to solve the resulting objective function.
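For illustration, the sketch below evaluates a classical variable-bandwidth KDE in one dimension using Abramson's square-root law; it shows what a per-point bandwidth looks like, but it is not the L2-regularized ISE criterion proposed in that paper.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

def abramson_variable_kde(data, query):
    """Evaluate a 1-D variable-bandwidth KDE (Abramson's square-root law).

    Each data point x_i gets its own bandwidth h_i = h0 * sqrt(g / f_pilot(x_i)),
    where f_pilot is a fixed-bandwidth pilot estimate and g its geometric mean.
    This is a generic adaptive KDE, shown only to illustrate the idea.
    """
    data = np.asarray(data, dtype=float)
    query = np.asarray(query, dtype=float)
    pilot = gaussian_kde(data)               # pilot density, Scott's rule bandwidth
    f_pilot = pilot(data)
    g = np.exp(np.mean(np.log(f_pilot)))     # geometric mean of pilot densities
    h0 = pilot.factor * data.std(ddof=1)     # global bandwidth from the pilot
    h_i = h0 * np.sqrt(g / f_pilot)          # per-point bandwidths
    # Mixture of Gaussians, one component per data point with its own h_i
    return np.mean(norm.pdf(query[:, None], loc=data[None, :], scale=h_i[None, :]), axis=1)
```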

An adaptive Metropolis algorithm

An adaptive Metropolis (AM) algorithm is presented, in which the Gaussian proposal distribution is updated along the process using the full information accumulated so far; it is established that the algorithm has the correct ergodic properties.
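A minimal sketch of the Haario-style covariance update behind such an adaptive Metropolis scheme, assuming the standard 2.38²/d scaling and a small diagonal jitter; a simplified illustration, not the paper's exact recipe.

```python
import numpy as np

def am_proposal_cov(history, eps=1e-8):
    """Adaptive Metropolis proposal covariance built from the full chain history.

    history : (t, d) array of all chain points accumulated so far (t >= 2).
    The proposal is N(x_t, C_t), where C_t is the scaled empirical covariance
    plus a small jitter eps*I to keep it non-singular.
    """
    t, d = history.shape
    s_d = 2.38**2 / d                        # dimension-dependent scaling
    return s_d * (np.cov(history, rowvar=False) + eps * np.eye(d))
```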

Dynamic temperature selection for parallel tempering in Markov chain Monte Carlo simulations

This paper presents a simple, easily implemented algorithm for dynamically adapting the temperature configuration of a sampler while sampling; it adjusts the temperature spacing to achieve a uniform rate of exchanges between chains at neighbouring temperatures.
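A hedged sketch of the two ingredients the summary mentions: the Metropolis acceptance rule for swapping states between neighbouring temperatures, and a crude spacing adjustment that pushes swap rates towards uniformity. The adapt_ladder update is a simplified stand-in for the paper's dynamic temperature-selection rule.

```python
import numpy as np

def swap_accept(loglike_i, loglike_j, beta_i, beta_j, rng):
    """Metropolis acceptance for swapping states between chains at
    inverse temperatures beta_i and beta_j (neighbouring rungs)."""
    log_alpha = (beta_i - beta_j) * (loglike_j - loglike_i)
    return np.log(rng.uniform()) < log_alpha

def adapt_ladder(betas, swap_rates, kappa=0.1):
    """Nudge the spacing of a decreasing inverse-temperature ladder so that
    swap rates between neighbouring chains become more uniform.

    betas      : array of inverse temperatures, decreasing from 1 towards 0
    swap_rates : array of observed swap rates between neighbours (len(betas)-1)
    A crude version of the dynamic-spacing idea; betas[0] = 1 is kept fixed.
    """
    log_dbeta = np.log(-np.diff(betas))                       # positive spacings
    log_dbeta += kappa * (swap_rates - swap_rates.mean())     # widen busy rungs
    new = 1.0 - np.concatenate(([0.0], np.cumsum(np.exp(log_dbeta))))
    return np.clip(new, 1e-6, 1.0)
```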

Inference from Iterative Simulation Using Multiple Sequences

The focus is on applied inference for Bayesian posterior distributions in real problems, which often tend toward normality after transformations and marginalization, and the results are derived as normal-theory approximations to exact Bayesian inference, conditional on the observed simulations.
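The multiple-sequence diagnostic associated with this work is the potential scale reduction factor; a minimal sketch of the standard R-hat formula for a scalar parameter follows (the split-chain and rank-normalized refinements are omitted).

```python
import numpy as np

def gelman_rubin_rhat(chains):
    """Potential scale reduction factor R-hat from multiple sequences.

    chains : (m, n) array, m independent chains of length n for one scalar
    parameter. Values near 1 indicate the chains have mixed.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    W = chains.var(axis=1, ddof=1).mean()            # within-chain variance
    B = n * chain_means.var(ddof=1)                  # between-chain variance
    var_hat = (n - 1) / n * W + B / n                # pooled variance estimate
    return np.sqrt(var_hat / W)
```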

A Markov Chain Monte Carlo version of the genetic algorithm Differential Evolution: easy Bayesian computing for real parameter spaces

C. ter Braak, Stat. Comput., 2006
The essential ideas of DE and MCMC are integrated, resulting in Differential Evolution Markov Chain (DE-MC), a population MCMC algorithm in which multiple chains are run in parallel; the method shows simplicity, speed of calculation, and convergence even for nearly collinear parameters and multimodal densities.
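A short sketch of the DE-MC jump described above, using the γ = 2.38/√(2d) default and a small uniform jitter recommended in the paper; the helper signature is ours.

```python
import numpy as np

def de_mc_proposal(population, i, rng, gamma=None, b=1e-4):
    """Differential Evolution MCMC jump for chain i of a population.

    Proposes x_p = x_i + gamma * (x_r1 - x_r2) + e, with r1 != r2 != i and a
    small uniform jitter e; gamma defaults to 2.38 / sqrt(2 d).
    """
    n_chains, d = population.shape
    if gamma is None:
        gamma = 2.38 / np.sqrt(2 * d)
    r1, r2 = rng.choice([k for k in range(n_chains) if k != i], size=2, replace=False)
    e = rng.uniform(-b, b, size=d)
    return population[i] + gamma * (population[r1] - population[r2]) + e
```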

Weak convergence and optimal scaling of random walk Metropolis algorithms

This paper considers the problem of scaling the proposal distribution of a multidimensional random walk Metropolis algorithm in order to maximize the efficiency of the algorithm. The main result is a weak convergence result as the dimension of the target distribution tends to infinity; the limiting diffusion yields the well-known asymptotically optimal acceptance rate of approximately 0.234.
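The practical consequence of this scaling result is the rule of thumb sketched below: scale the random-walk proposal covariance by 2.38²/d and tune towards an acceptance rate near 0.234.

```python
import numpy as np

def rwm_proposal_cov(target_cov):
    """Rule-of-thumb random-walk Metropolis proposal covariance.

    Scaling an estimate of the target covariance by 2.38**2 / d is
    asymptotically optimal for product-form targets and corresponds to an
    acceptance rate of about 0.234 in high dimension.
    """
    d = target_cov.shape[0]
    return (2.38**2 / d) * target_cov
```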

Monte Carlo Sampling Methods Using Markov Chains and Their Applications

A generalization of the sampling method introduced by Metropolis et al. (1953) is presented, along with an exposition of the relevant theory, techniques of application, and methods of assessing the error in Monte Carlo estimates.
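A minimal Metropolis-Hastings step with the asymmetric-proposal (Hastings) correction that this generalization introduces; the propose callback signature is an assumption made for the example.

```python
import numpy as np

def mh_step(x, logpost, propose, rng):
    """One Metropolis-Hastings step with a possibly asymmetric proposal.

    propose(x, rng) must return (x_new, logq_forward, logq_backward); the
    Hastings correction logq_backward - logq_forward vanishes for symmetric
    proposals, recovering the Metropolis et al. (1953) rule.
    """
    x_new, logq_fwd, logq_bwd = propose(x, rng)
    log_alpha = logpost(x_new) - logpost(x) + logq_bwd - logq_fwd
    if np.log(rng.uniform()) < log_alpha:
        return x_new, True
    return x, False
```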

A Bayesian analysis pipeline for continuous GW sources in the PTA band

A fully Bayesian data analysis pipeline is introduced that allows one to rapidly locate the global maxima in parameter space, map out the posterior, and finally weigh the evidence for a GW detection through a Bayes factor.
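One common way to obtain the evidences entering such a Bayes factor is thermodynamic integration over a parallel-tempering ladder; the sketch below shows that estimate, without claiming it is the method used in this pipeline.

```python
import numpy as np

def log_evidence_ti(betas, mean_loglikes):
    """Log-evidence via thermodynamic integration.

    ln Z = integral over beta in [0, 1] of <ln L>_beta, approximated with the
    trapezoid rule over a ladder of inverse temperatures, as produced by a
    parallel-tempering run.
    """
    betas = np.asarray(betas, dtype=float)
    mean_loglikes = np.asarray(mean_loglikes, dtype=float)
    order = np.argsort(betas)
    return np.trapz(mean_loglikes[order], betas[order])

# A Bayes factor then compares two models: ln B_10 = ln Z_1 - ln Z_0
```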