Natural Exponential Families with Quadratic Variance Functions

@article{Morris1982NaturalEF,
  title={Natural Exponential Families with Quadratic Variance Functions},
  author={C. Morris},
  journal={Annals of Statistics},
  year={1982},
  volume={10},
  pages={65-80}
}
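
For orientation, the paper's central object can be summarized as follows (a standard statement of the definition, not text taken from this listing): a natural exponential family has a quadratic variance function when its variance is at most a quadratic function of its mean μ,

    V(μ) = v₀ + v₁μ + v₂μ².

Morris shows that there are exactly six basic families of this type: the normal (constant variance), Poisson (V(μ) = μ), gamma, binomial, negative binomial, and the natural exponential family generated by the hyperbolic secant distribution (NEF-GHS).
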
Citations

Two-stage weighted least squares estimator of the conditional mean of observation-driven time series models
General parametric forms are assumed for the conditional mean λ_{t}(θ₀) and variance υ_{t}(ξ₀) of a time series. These conditional moments can, for instance, be derived from count time series, …
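
As a rough illustration of the two-stage idea only (a minimal sketch under an assumed linear mean λ_t(θ) = θ₁ + θ₂·y_{t-1} and an assumed Poisson-type variance υ_t = λ_t; the helpers `cond_mean` and `two_stage_wls` and the simulated data are hypothetical, not from the cited paper): stage one fits the mean parameters by unweighted least squares, stage two re-fits with the stage-one variances as weights.

```python
import numpy as np
from scipy.optimize import minimize

def cond_mean(theta, y):
    # Hypothetical linear mean form: lambda_t(theta) = theta[0] + theta[1] * y_{t-1}
    return theta[0] + theta[1] * y[:-1]

def two_stage_wls(y):
    resp = y[1:]  # y_t aligned with lambda_t
    # Stage 1: unweighted least squares for the mean parameters
    sse = lambda th: np.sum((resp - cond_mean(th, y)) ** 2)
    stage1 = minimize(sse, x0=np.array([y.mean(), 0.1]), method="Nelder-Mead")
    # Assumed Poisson-type variance: v_t = lambda_t, evaluated at the stage-1 estimate
    v = np.maximum(cond_mean(stage1.x, y), 1e-8)
    # Stage 2: weighted least squares with weights 1 / v_t
    wsse = lambda th: np.sum((resp - cond_mean(th, y)) ** 2 / v)
    return minimize(wsse, x0=stage1.x, method="Nelder-Mead").x

rng = np.random.default_rng(0)
y = rng.poisson(5.0, size=500).astype(float)  # placeholder counts, illustration only
print(two_stage_wls(y))
```
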
Integrating Exponential Dispersion Models to Latent Structures
This work argues that the heteroscedastic impact of the missing-data pattern on the dispersion of the observation variable can be captured with the proposed model, and generalizes hierarchical Poisson factorization, a Bayesian nonnegative matrix factorization model, by compounding the original Poisson output with EDMs.
Caractérisations des modèles multivariés de stables-Tweedie multiples
This thesis concerns various characterizations of multivariate multiple stable-Tweedie models within the framework of natural exponential families under the "steepness" property. …
Topics in empirical Bayesian analysis
While empirical Bayesian estimation techniques are very useful in the realm of decision theory, it is widely understood that, when applied to interval estimation, they produce intervals with an incorrect width …
AdaCluster: Adaptive Clustering for Heterogeneous Data
An adaptive approach to clustering using classes of parametrized Bregman divergences is proposed, and it is empirically verified that adaptively learning the underlying topology yields better clustering of heterogeneous data.
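
For a flavor of clustering with a parametrized Bregman divergence (a minimal k-means-style sketch using the generalized I-divergence of the count family; the functions `i_divergence` and `bregman_kmeans` and the simulated data are illustrative assumptions, not the adaptive method of the cited work):

```python
import numpy as np

def i_divergence(x, mu):
    # Generalized I-divergence, the Bregman divergence associated with counts
    x = np.maximum(x, 1e-12)
    mu = np.maximum(mu, 1e-12)
    return np.sum(x * np.log(x / mu) - x + mu, axis=-1)

def bregman_kmeans(X, k, iters=25, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each point to the center with the smallest divergence
        d = np.stack([i_divergence(X, c) for c in centers], axis=1)
        labels = d.argmin(axis=1)
        # For any Bregman divergence the optimal representative is the cluster mean
        centers = np.stack([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

rng = np.random.default_rng(1)
X = np.vstack([rng.poisson(3.0, size=(100, 2)),
               rng.poisson(10.0, size=(100, 2))]).astype(float)
labels, centers = bregman_kmeans(X, k=2)
print(centers)
```

The only Bregman-specific facts used are that assignment minimizes the divergence to each center and that, for any Bregman divergence, the optimal center of a cluster is its arithmetic mean.
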
A simple derivation and classification of common probability distributions based on information symmetry and measurement scale
This framework relates the different continuous probability distributions through variations in measurement scale that change each family of maximum-entropy distributions into a distinct family, based on two meaningful and justifiable propositions.
Gibbs sampling, conjugate priors and coupling
We give a large family of simple examples where a sharp analysis of the Gibbs sampler can be proved by coupling. These examples involve standard statistical models: exponential families with …
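
A minimal sketch of the kind of two-component conjugate Gibbs sampler such coupling analyses cover, here for a Beta-Binomial pair (an illustrative assumption, not an example taken from the cited paper): the chain alternates θ | x ~ Beta(a + x, b + n - x) and x | θ ~ Binomial(n, θ).

```python
import numpy as np

def beta_binomial_gibbs(n=10, a=2.0, b=3.0, iters=5000, seed=0):
    # Two-component Gibbs sampler for the conjugate model
    #   x | theta ~ Binomial(n, theta),   theta ~ Beta(a, b)
    rng = np.random.default_rng(seed)
    theta, x = 0.5, n // 2
    draws = np.empty((iters, 2))
    for t in range(iters):
        theta = rng.beta(a + x, b + n - x)   # draw theta | x
        x = rng.binomial(n, theta)           # draw x | theta
        draws[t] = (theta, x)
    return draws

draws = beta_binomial_gibbs()
# Stationary marginals: theta ~ Beta(a, b), x ~ Beta-Binomial(n, a, b)
print(draws[:, 0].mean(), draws[:, 1].mean())
```
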
Bayesian Methods and Extensions for the Two State Markov Modulated Poisson Process
The model accepts output from such procedures as covariates, resulting in a principled accumulation of evidence over time, an important departure from earlier fraud detection procedures focusing on characteristics of a single event.
Approximating the Operating Characteristics of Bayesian Uncertainty Directed Trial Designs
Bayesian response adaptive clinical trials are currently evaluating experimental therapies for several diseases. Adaptive decisions, such as pre-planned variations of the randomization probabilities, …
General dependence structures for some models based on exponential families with quadratic variance functions
We describe a procedure to introduce general dependence structures on a set of random variables. These include order-q moving average-type structures, as well as seasonal, periodic and spatial …

References

Showing 1-10 of 14 references
Generalized Linear Models
The technique of iterative weighted linear regression can be used to obtain maximum likelihood estimates of the parameters with observations distributed according to some exponential family and …
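
A minimal sketch of the iterative weighted linear regression (IRLS) step for one concrete GLM, Poisson regression with the canonical log link (the function `irls_poisson` and the simulated data are assumptions for illustration): each iteration solves a weighted least-squares problem in a working response z = η + (y - μ)/μ with weights μ.

```python
import numpy as np

def irls_poisson(X, y, iters=25):
    # IRLS for a Poisson GLM with log link: E[y] = exp(X @ beta)
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        eta = X @ beta
        mu = np.exp(eta)
        W = mu                      # working weights: (dmu/deta)^2 / Var(y) = mu
        z = eta + (y - mu) / mu     # working response
        WX = X * W[:, None]
        beta = np.linalg.solve(X.T @ WX, X.T @ (W * z))
    return beta

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(300), rng.normal(size=300)])
y = rng.poisson(np.exp(0.5 + 0.8 * X[:, 1]))
print(irls_poisson(X, y))           # should be roughly [0.5, 0.8]
```
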
Parametric Empirical Bayes Inference: Theory and Applications
This article reviews the state of multiparameter shrinkage estimators with emphasis on the empirical Bayes viewpoint, particularly in the case of parametric prior distributions. Some …
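
The canonical setting reviewed there can be summarized as follows (standard notation, stated from common usage rather than quoted from the article): with X_i | θ_i ~ N(θ_i, V) and θ_i ~ N(μ, A), the Bayes estimate shrinks each observation toward the prior mean,

    δ_i = (1 - B)·X_i + B·μ,    B = V / (V + A),

and parametric empirical Bayes replaces μ and B by estimates formed from the whole ensemble X₁, …, X_k.
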
Quasi-likelihood functions, generalized linear models, and the Gauss-Newton method
To define a likelihood we have to specify the form of distribution of the observations, but to define a quasi-likelihood function we need only specify a relation between the mean and variance …
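
The defining relation (Wedderburn's standard form, given here for orientation): if only the mean-variance relation Var(Y) = φ·V(μ) is specified, the quasi-likelihood Q(y; μ) is defined through its derivative

    ∂Q(y; μ)/∂μ = (y - μ) / (φ·V(μ)),

which behaves enough like a score function to drive Gauss-Newton / IRLS fitting exactly as a full likelihood would.
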
Stein's Estimation Rule and Its Competitors: An Empirical Bayes Approach
Stein's estimator for k normal means is known to dominate the MLE if k ≥ 3. In this article we ask if Stein's estimator is any good in its own right. Our answer is yes: the positive part …
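
The positive-part rule referred to is the standard variant of the James-Stein estimator (stated from common usage, not quoted from the article): for X ~ N_k(θ, σ²I),

    δ⁺(X) = max(0, 1 - (k - 2)σ² / ‖X‖²) · X,

which dominates the ordinary James-Stein estimator and hence the MLE when k ≥ 3.
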
The Construction of Uniformly Minimum Variance Unbiased Estimators for Exponential Distributions
1. Summary and Introduction. Consider a sample (X₁, X₂, …, X_N) from a population with a distribution function F_θ(x), θ ∈ Ω, for which a complete sufficient statistic, s(x), exists. Then any …
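
The construction alluded to is the standard Rao-Blackwell / Lehmann-Scheffé device (a textbook statement, added here for context): if t(X) is any unbiased estimator of g(θ) and s(X) is complete and sufficient, then

    t*(X) = E[ t(X) | s(X) ]

is the uniformly minimum variance unbiased estimator of g(θ).
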
A Note on the Posterior Mean of a Population Mean
It is a well-known result, see for example Lindley (1965) and Raiffa and Schlaifer (1961), that if x̄ is the mean of a sample of independent observations distributed N(μ, σ²), where σ² is …
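
The well-known result in question has the standard conjugate-normal form (summarized here, not quoted): if x̄ | θ ~ N(θ, σ²/n) and θ ~ N(μ₀, τ²), then the posterior mean is the precision-weighted average

    E[θ | x̄] = w·x̄ + (1 - w)·μ₀,    w = τ² / (τ² + σ²/n).
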