Laplace's 1774 Memoir on Inverse Probability

@article{Stigler1986Laplaces1M,
  title={Laplace's 1774 Memoir on Inverse Probability},
  author={Stephen M. Stigler},
  journal={Statistical Science},
  year={1986},
  volume={1},
  pages={359-363}
}
Laplace's first major article on mathematical statistics was published in 1774. It is arguably the most influential article in this field to appear before 1800, being the first widely read presentation of inverse probability and its application to both binomial and location parameter estimation. After a brief introduction, an English translation of this epochal memoir is given. 
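As a note on terminology, the "inverse probability" of the memoir is Laplace's principle for reasoning from an observed event back to its causes. In modern notation (a sketch using my own symbols, under the memoir's implicit assumption that the possible causes are equally likely a priori), the principle and its binomial application read roughly:

P(\theta \mid E) \;=\; \frac{P(E \mid \theta)}{\int P(E \mid \theta')\,d\theta'},
\qquad
P(x \mid p\ \text{successes},\ q\ \text{failures}) \;\propto\; x^{p}(1-x)^{q}, \quad 0 \le x \le 1.

Integrating the binomial posterior gives the estimates Laplace drew from it, such as the posterior mean (p+1)/(p+q+2) that underlies the rule of succession.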
Poincaré’s Odds
This paper is devoted to Poincaré's work in probability. Though the subject does not represent a large part of the mathematician's achievements, it provides significant insight into the evolution of …
Studies in the history of probability and statistics, L: Karl Pearson and the Rule of Three
Karl Pearson's role in the transformation that took the 19th-century statistics of Laplace and Gauss into the modern era of 20th-century multivariate analysis is examined from a new point of view. …
The Bernoullis of Basel
The variational approximation for Bayesian inference
It was from here that "Bayesian" ideas first spread through the mathematical world, as Bayes's own article was ignored until 1780 and played no important role in scientific debate until the 20th century.
The Bayesian Approach and Its Evolution Until the Beginning of the Twentieth Century
Rev. Bayes and his friend Richard Price created a new way to deal with the philosophical and theological problems of induction raised by Hume. This mathematical formula included the notion …
Laplace-a pioneer of statistical inference
It is generally held that R. A. Fisher, Jerzy Neyman and Egon Pearson laid the foundations of statistical inference in the 1920s and 1930s. Recent research concerning the history of statistical ideas …
Life After the EM Algorithm: The Variational Approximation for Bayesian Inference
This paper presents a tutorial introduction to Bayesian variational inference aimed at the signal processing community, using linear regression and Gaussian mixture modeling as examples to demonstrate the additional capabilities that Bayesian variational inference offers compared to the EM algorithm.
What Did Fisher Mean by "Inverse Probability" in 1912-1922?
The method of maximum likelihood was introduced by R. A. Fisher in 1912, but not under that name until 1922. This paper seeks to elucidate what Fisher understood by the phrase "inverse probability" …
Laplace's Method
The idea behind the Laplace approximation is simple. We assume that an unnormalized probability density P*(x), whose normalizing constant Z_P ≡ ∫ P*(x) dx is of interest, has a peak at a …
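To make the quoted idea concrete, the following is a minimal numerical sketch of the Laplace approximation to a normalizing constant: locate the mode x0 of log P*, measure the curvature there, and integrate the matching Gaussian. The example density, function names, and step size are illustrative choices of mine, not anything prescribed by the quoted text, and the sketch assumes NumPy and SciPy are available.

import numpy as np
from scipy.optimize import minimize_scalar

def laplace_normalizer(log_p, x_guess, h=1e-4):
    """Approximate Z = integral of exp(log_p(x)) dx by a Gaussian fit at the mode."""
    # Find the mode x0 of log_p numerically (Brent's method on -log_p).
    x0 = minimize_scalar(lambda x: -log_p(x), bracket=(x_guess, x_guess + 1.0)).x
    # Curvature c = -(d^2/dx^2) log_p at x0, via a central finite difference.
    c = -(log_p(x0 + h) - 2.0 * log_p(x0) + log_p(x0 - h)) / h**2
    # Laplace's formula: Z is approximately P*(x0) * sqrt(2*pi / c).
    return np.exp(log_p(x0)) * np.sqrt(2.0 * np.pi / c)

# Illustration: P*(x) = x^2 * exp(-x) on x > 0, whose exact normalizer is Gamma(3) = 2.
log_p = lambda x: 2.0 * np.log(x) - x
print("Laplace estimate:", laplace_normalizer(log_p, x_guess=1.5))  # about 1.92
print("Exact value:", 2.0)

On this deliberately skewed density the Gaussian fit at the mode x0 = 2 gives roughly 1.92 against the exact value of 2, which is the kind of accuracy the approximation trades for its simplicity.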
