Natural Exponential Families with Quadratic Variance Functions

  @article{morris1982natural,
    title={Natural Exponential Families with Quadratic Variance Functions},
    author={Carl N. Morris},
    journal={Annals of Statistics},
    year={1982}
  }
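The quadratic variance function (QVF) property that gives the paper its title says that within a natural exponential family the variance is at most a quadratic function of the mean, V(μ) = v₀ + v₁μ + v₂μ². A minimal numerical sketch, using simulated Binomial data (the sample size, success probability, and tolerance below are illustrative choices, not from the paper):

```python
import random
import statistics

random.seed(0)

# Binomial(n, p) is one of the six NEF-QVF families: with mu = n*p,
# its variance function V(mu) = mu - mu**2 / n is quadratic in the mean.
n, p = 20, 0.3
samples = [sum(random.random() < p for _ in range(n)) for _ in range(100_000)]

mu_hat = statistics.fmean(samples)          # sample mean, approx n*p = 6
var_hat = statistics.pvariance(samples)     # sample variance, approx 4.2
predicted = mu_hat - mu_hat ** 2 / n        # quadratic variance function

# The empirical variance tracks the QVF prediction up to sampling noise.
assert abs(var_hat - predicted) < 0.1
```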
Two-stage weighted least squares estimator of the conditional mean of observation-driven time series models
General parametric forms are assumed for the conditional mean λ_{t}(θ₀) and variance υ_{t}(ξ₀) of a time series. These conditional moments can, for instance, be derived from count time series, …
Integrating Exponential Dispersion Models to Latent Structures
This work argues that the heteroscedastic impact of the missing-data pattern on the dispersion of the observation variable can be captured with the proposed model, which generalizes hierarchical Poisson factorization, a Bayesian nonnegative matrix factorization model, by compounding the original Poisson output with EDMs.
Caractérisations des modèles multivariés de stables-Tweedie multiples (Characterizations of multivariate multiple stable-Tweedie models)
This thesis develops various characterizations of multivariate multiple stable-Tweedie models within the framework of natural exponential families under the "steepness" property.
Topics in empirical Bayesian analysis
While very useful in the realm of decision theory, it is widely understood that, when applied to interval estimation, empirical Bayesian estimation techniques produce intervals with an incorrect width …
AdaCluster : Adaptive Clustering for Heterogeneous Data
An adaptive approach to clustering using classes of parametrized Bregman divergences is proposed, and it is empirically verified that adaptively learning the underlying topology yields better clustering of heterogeneous data.
A simple derivation and classification of common probability distributions based on information symmetry and measurement scale
This framework relates the different continuous probability distributions through the variations in measurement scale that change each family of maximum entropy distributions into a distinct family, based on two meaningful and justifiable propositions.
Gibbs sampling, conjugate priors and coupling
We give a large family of simple examples where a sharp analysis of the Gibbs sampler can be proved by coupling. These examples involve standard statistical models — exponential families with …
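The conjugate-pair structure this abstract refers to can be sketched with the standard two-component Beta-Binomial Gibbs sampler (the hyperparameters, chain length, and burn-in below are illustrative choices, not taken from the paper):

```python
import random

random.seed(1)

# Two-component Gibbs sampler for the conjugate Beta-Binomial pair:
# alternate  x | theta ~ Binomial(n, theta)
#            theta | x ~ Beta(a + x, b + n - x)
# The stationary marginal of x is Beta-Binomial(n, a, b).
n, a, b = 10, 2.0, 3.0

def gibbs(iters, burn=500):
    theta, draws = 0.5, []
    for t in range(iters + burn):
        x = sum(random.random() < theta for _ in range(n))   # x | theta
        theta = random.betavariate(a + x, b + n - x)         # theta | x
        if t >= burn:
            draws.append(x)
    return draws

draws = gibbs(50_000)
mean_x = sum(draws) / len(draws)
# Under the stationary distribution, E[x] = n * a / (a + b) = 4.
assert abs(mean_x - 4.0) < 0.15
```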
Bayesian Methods and Extensions for the Two State Markov Modulated Poisson Process
The model accepts output from such procedures as covariates, resulting in a principled accumulation of evidence over time, an important departure from earlier fraud detection procedures focusing on characteristics of a single event.
Approximating the Operating Characteristics of Bayesian Uncertainty Directed Trial Designs
Bayesian response adaptive clinical trials are currently evaluating experimental therapies for several diseases. Adaptive decisions, such as pre-planned variations of the randomization probabilities, …
Hypothesis testing with low-degree polynomials in the Morris class of exponential families
Low-degree polynomials appear to offer a tradeoff between robustness and strong performance fine-tuned to specific models, and may struggle with problems requiring an algorithm to first examine the input and then use some intermediate computation to choose from one of several inference subroutines.


Generalized Linear Models
The technique of iterative weighted linear regression can be used to obtain maximum likelihood estimates of the parameters with observations distributed according to some exponential family and …
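The iterative weighted linear regression scheme described in this abstract can be sketched for a one-parameter Poisson GLM with canonical log link, where Fisher scoring coincides with Newton's method (the data and starting value below are hypothetical, for illustration only):

```python
import math

# One-parameter Poisson GLM with canonical log link: lambda_i = exp(beta * x_i).
# Hypothetical data, roughly consistent with beta near 1.
x = [1.0, 2.0, 3.0, 4.0]
y = [2, 6, 19, 54]

# Crude starting value from the log-linearized model log(y_i) ~ beta * x_i.
beta = sum(xi * math.log(yi) for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

for _ in range(25):
    lam = [math.exp(beta * xi) for xi in x]
    score = sum(xi * (yi - li) for xi, yi, li in zip(x, y, lam))
    info = sum(xi * xi * li for xi, li in zip(x, lam))  # Fisher information
    beta += score / info                                # Fisher-scoring step

# At convergence the score equation sum_i x_i (y_i - exp(beta x_i)) = 0 holds.
lam = [math.exp(beta * xi) for xi in x]
assert abs(sum(xi * (yi - li) for xi, yi, li in zip(x, y, lam))) < 1e-6
```

For the canonical link the working-weight matrix of IRLS equals the Fisher information, which is why the scalar update above is both an IRLS step and a Newton step.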
Parametric Empirical Bayes Inference: Theory and Applications
Abstract This article reviews the state of multiparameter shrinkage estimators with emphasis on the empirical Bayes viewpoint, particularly in the case of parametric prior distributions. Some …
Quasi-likelihood functions, generalized linear models, and the Gauss-Newton method
SUMMARY To define a likelihood we have to specify the form of distribution of the observations, but to define a quasi-likelihood function we need only specify a relation between the mean and variance …
Stein's Estimation Rule and Its Competitors- An Empirical Bayes Approach
Abstract Stein's estimator for k normal means is known to dominate the MLE if k ≥ 3. In this article we ask if Stein's estimator is any good in its own right. Our answer is yes: the positive part …
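The positive-part estimator mentioned in this abstract can be illustrated by simulation (the dimension, true means, and replication count below are arbitrary illustrative choices, not the paper's setup):

```python
import random

random.seed(2)

# Positive-part James-Stein estimator for k normal means with unit variance:
#   delta(z) = max(0, 1 - (k - 2) / ||z||^2) * z
# Compare its total squared-error risk with that of the MLE delta(z) = z.
k, reps = 10, 5_000
theta = [0.5] * k  # true means, chosen arbitrarily for the illustration

mle_risk = js_risk = 0.0
for _ in range(reps):
    z = [random.gauss(t, 1.0) for t in theta]
    s = sum(zi * zi for zi in z)
    shrink = max(0.0, 1.0 - (k - 2) / s)   # positive-part shrinkage factor
    mle_risk += sum((zi - ti) ** 2 for zi, ti in zip(z, theta))
    js_risk += sum((shrink * zi - ti) ** 2 for zi, ti in zip(z, theta))

# Stein's estimator dominates the MLE for k >= 3; the gap is large when
# the true means are close to the shrinkage target (the origin here).
assert js_risk < mle_risk
```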
The Construction of Uniformly Minimum Variance Unbiased Estimators for Exponential Distributions
1. Summary and Introduction. Consider a sample (X₁, X₂, …, X_N) from a population with a distribution function F_θ(x), θ ∈ Ω, for which a complete sufficient statistic, s(x), exists. Then any …
A Note on the Posterior Mean of a Population Mean
SUMMARY It is a well-known result, see for example Lindley (1965) and Raiffa and Schlaifer (1961), that if x̄ is the mean of a sample of independent observations distributed N(μ, σ²), where σ² is …