Multiscale variance stabilization via maximum likelihood

@article{Nason2014MultiscaleVS,
  title={Multiscale variance stabilization via maximum likelihood},
  author={Guy P. Nason},
  journal={Biometrika},
  year={2014},
  volume={101},
  pages={499--504}
}
This article proposes maximum likelihood approaches for multiscale variance stabilization transformations for independently and identically distributed data. For two multiscale variance stabilization transformations we present new unified theoretical results on their Jacobians, a key component of the likelihood. The results provide a deeper understanding of the transformations and the ability to compute the likelihood in linear time. The transformations are shown empirically to compare… 
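
The multiscale variance stabilization referred to in the abstract operates on transforms of Haar-Fisz type. As a point of reference only (not the paper's maximum likelihood machinery), the following is a minimal Python sketch of the classical Haar-Fisz transform for Poisson-like counts (Fryzlewicz and Nason, 2004); the function name and the power-of-two length requirement are choices of this sketch.

    import numpy as np

    def haar_fisz(x):
        # Haar-Fisz: Haar-transform the data, divide each detail coefficient by
        # the square root of its smooth coefficient (local mean), then invert.
        x = np.asarray(x, dtype=float)
        n = x.size
        J = int(round(np.log2(n)))
        if 2 ** J != n:
            raise ValueError("length must be a power of two")
        s, details = x.copy(), []
        for _ in range(J):                        # forward Haar steps, finest level first
            even, odd = s[0::2], s[1::2]
            sm = (even + odd) / 2.0               # smooth coefficients (local means)
            d = (even - odd) / 2.0                # detail coefficients
            f = np.zeros_like(d)
            pos = sm > 0
            f[pos] = d[pos] / np.sqrt(sm[pos])    # Fisz variance-stabilising division
            details.append(f)
            s = sm
        u = s                                     # rebuild from the coarsest level,
        for f in reversed(details):               # re-using the stabilised details
            out = np.empty(2 * u.size)
            out[0::2], out[1::2] = u + f, u - f
            u = out
        return u

After such a transform the data behave approximately like Gaussian noise with near-constant variance, so standard Gaussian wavelet shrinkage can be applied; the paper's contribution is to choose the stabilization within such a family by maximum likelihood, with the Jacobian results making the likelihood computable in linear time.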

Citations

Likelihood ratio Haar variance stabilization and normalization for Poisson and other non-Gaussian noise removal
We propose a new methodology for denoising, variance-stabilizing and normalizing signals whose mean and variance are both parameterized by a single unknown varying parameter, such as Poisson or …
Case study: shipping trend estimation and prediction via multiscale variance stabilisation
TLDR
An analysis of shipping credit flow data, an important and archetypal series whose analysis is hampered by rapid changes of variance; the analysis uses the recently developed Haar–Fisz transformation, which enables accurate trend estimation and successful prediction in these situations.

References

SHOWING 1-10 OF 33 REFERENCES
Multiscale Poisson data smoothing
TLDR
A framework for non‐linear multiscale decompositions of Poisson data that have piecewise smooth intensity curves is introduced, which combines the advantages of the Haar–Fisz transform with wavelet smoothing and (Bayesian) multiscale likelihood models, with additional benefits, such as extendability towards arbitrary wavelet families.
Data-driven wavelet-Fisz methodology for nonparametric function estimation
TLDR
This work proposes a wavelet-based technique for the nonparametric estimation of functions contaminated with noise whose mean and variance are linked via a possibly unknown variance function, and establishes an exponential inequality for the Nadaraya-Watson variance function estimator.
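
The Nadaraya-Watson estimator mentioned above is standard kernel regression. Purely as an illustration, here is a minimal sketch; the Gaussian kernel, the bandwidth argument and the pairing of local means with halved squared differences are assumptions of this sketch, not the cited paper's exact construction.

    import numpy as np

    def nadaraya_watson(x0, x, y, bandwidth):
        # Kernel-weighted local average of y at the query points x0 (Gaussian kernel).
        x0 = np.atleast_1d(np.asarray(x0, dtype=float))
        x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
        w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / bandwidth) ** 2)
        return (w @ y) / w.sum(axis=1)

    def variance_function(mu_grid, data, bandwidth):
        # Illustrative variance-function estimate: smooth halved squared differences
        # of neighbouring observations against their local means.
        data = np.asarray(data, dtype=float)
        means = (data[:-1] + data[1:]) / 2.0
        sqdiff = (data[:-1] - data[1:]) ** 2 / 2.0
        return nadaraya_watson(mu_grid, means, sqdiff, bandwidth)
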
A wavelet‐Fisz approach to spectrum estimation
We propose a new approach to wavelet threshold estimation of spectral densities of stationary time series. Our proposal addresses the problem of heteroscedasticity and non-normality of the …
Variance stabilization and normalization for one-color microarray data using a data-driven multiscale approach
TLDR
The Data-Driven Haar-Fisz for microarrays (DDHFm) has the advantage of being 'distribution-free' in the sense that no parametric model for the underlying microarray data is required to be specified or estimated; hence, DDHFm can be applied very generally, not just to microarray data.
Wavelet Methods in Statistics with R
TLDR
This book has three main objectives: providing an introduction to wavelets and their uses in statistics, acting as a quick and broad reference to many developments in the area, and interspersing R code that enables the reader to learn the methods, to carry out their own analyses, and further develop their own ideas.
Statistical Modeling by Wavelets
TLDR
Topics include Wavelets and Random Processes, Wavelet-Based Random Variables and Densities, and Miscellaneous Statistical Applications.
Nonlinear Wavelet Methods for Recovery of Signals, Densities, and Spectra from Indirect and Noisy Data
TLDR
This work focuses on noise reduction by constrained reconstructions in the wavelet-transform domain and proposes a best-basis segmentation SURE(y; …), a telescoping nonlinear multiresolution decomposition based on decimating by factors of 3.
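
The noise reduction described above is wavelet shrinkage. A minimal sketch of soft thresholding at the universal threshold is given below; it illustrates generic wavelet shrinkage rather than the best-basis or SURE constructions of the cited work, and the use of the PyWavelets package is an assumption of this sketch.

    import numpy as np
    import pywt  # PyWavelets; using this package is an assumption of the sketch

    def soft_threshold_denoise(y, wavelet="haar"):
        # Generic wavelet shrinkage: soft-threshold detail coefficients at the
        # universal threshold, with the noise level estimated from the finest
        # details by the median absolute deviation.
        coeffs = pywt.wavedec(y, wavelet)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        lam = sigma * np.sqrt(2.0 * np.log(len(y)))
        new = [coeffs[0]] + [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(new, wavelet)[: len(y)]
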
An Analysis of Transformations
[Read at a RESEARCH METHODS MEETING of the SOCIETY, April 8th, 1964, Professor D. V. LINDLEY in the Chair] SUMMARY: In the analysis of data it is often assumed that observations y1, y2, ..., yn are …
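
The Box-Cox family is the classical single-scale analogue of the Jacobian-adjusted likelihoods discussed in this paper. A minimal sketch of its profile log-likelihood for i.i.d. positive data follows; the grid search in the usage comment is illustrative only.

    import numpy as np

    def boxcox_profile_loglik(y, lam):
        # Profile log-likelihood of the Box-Cox power for i.i.d. positive data.
        # The (lam - 1) * sum(log y) term is the log-Jacobian of the transformation,
        # the same ingredient the paper supplies for multiscale transforms.
        y = np.asarray(y, dtype=float)
        n = y.size
        z = np.log(y) if lam == 0 else (y ** lam - 1.0) / lam
        sigma2 = np.mean((z - z.mean()) ** 2)          # MLE of the error variance
        return -0.5 * n * np.log(sigma2) + (lam - 1.0) * np.sum(np.log(y))

    # Illustrative grid search over candidate powers:
    # lams = np.linspace(-2, 2, 81)
    # best = lams[np.argmax([boxcox_profile_loglik(y, l) for l in lams])]
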
Wavelet processes and adaptive estimation of the evolutionary wavelet spectrum
This paper defines and studies a new class of non‐stationary random processes constructed from discrete non‐decimated wavelets which generalizes the Cramér (Fourier) representation of stationary time series …
Smoothing reference centile curves: the LMS method and penalized likelihood.
TLDR
The LMS method summarizes how the distribution of a measurement changes with a covariate by three curves representing the median, coefficient of variation and skewness, the latter expressed as a Box-Cox power.
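
For reference, converting a measurement to an LMS z-score given fitted L (Box-Cox power), M (median) and S (coefficient of variation) values is a one-line computation. The sketch below uses the standard LMS formula; it is not code from the cited paper.

    import numpy as np

    def lms_zscore(y, L, M, S):
        # Standard LMS z-score given the L (power), M (median) and S (coefficient
        # of variation) values at the measurement's covariate value.
        if L == 0:
            return np.log(y / M) / S
        return ((y / M) ** L - 1.0) / (L * S)
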
...