Spatially-adaptive sensing in nonparametric regression

  • Adam D. Bull
  • Published 2 July 2012
  • Mathematics, Computer Science
  • arXiv: Statistics Theory
While adaptive sensing has provided improved rates of convergence in sparse regression and classification, results in nonparametric regression have so far been restricted to quite specific classes of functions. In this paper, we describe an adaptive-sensing algorithm which is applicable to general nonparametric-regression problems. The algorithm is spatially adaptive, and achieves improved rates of convergence over spatially inhomogeneous functions. Over standard function classes, it likewise… 
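The core idea of spatial adaptivity in sensing can be illustrated with a toy two-stage scheme (a hypothetical sketch, not Bull's actual algorithm): spend part of the budget on a uniform pilot sample, estimate local roughness from first differences, and concentrate the remaining samples where the function appears least smooth. The function `adaptive_sample`, the bin count, and the roughness proxy are all illustrative choices.

```python
import random


def adaptive_sample(f, budget, n_bins=8, sigma=0.1, seed=0):
    """Two-stage spatially adaptive sampling on [0, 1] (toy sketch).

    Stage 1: spend half the budget uniformly and estimate local
    roughness per bin from first differences of the noisy pilot data.
    Stage 2: allocate the rest proportionally to estimated roughness,
    so spatially inhomogeneous regions receive more samples.
    """
    rng = random.Random(seed)
    pilot = sorted(rng.random() for _ in range(budget // 2))
    ys = [f(x) + rng.gauss(0, sigma) for x in pilot]
    rough = [1e-9] * n_bins  # small floor so every bin keeps a chance
    for (x0, y0), (_, y1) in zip(zip(pilot, ys), zip(pilot[1:], ys[1:])):
        rough[min(int(x0 * n_bins), n_bins - 1)] += abs(y1 - y0)
    total = sum(rough)
    remaining = budget - len(pilot)
    alloc = [round(remaining * r / total) for r in rough]
    extra = [b / n_bins + rng.random() / n_bins
             for b, a in enumerate(alloc) for _ in range(a)]
    return pilot + extra
```

Run on a function with a jump at 0.5, the second stage piles samples into the bin containing the discontinuity, mimicking the improved treatment of spatially inhomogeneous functions described above.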

Optimal Sampling Density for Nonparametric Regression
The proposed active learning method outperforms the existing state-of-the-art model-agnostic approaches and factorizes the influence of local function complexity, noise level and test density in a transparent and interpretable way.
Active Learning for Non-Parametric Regression Using Purely Random Trees
This paper proposes an intuitive tree based active learning algorithm for non-parametric regression with provable improvement over random sampling and when implemented with Mondrian Trees the algorithm is tuning parameter free, consistent and minimax optimal for Lipschitz functions.
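A minimal caricature of tree-based active sampling (not the paper's Mondrian-tree algorithm, and without its theoretical guarantees): partition the domain into leaves, seed each with a few labelled points, then repeatedly query the leaf whose responses vary most. All names and parameters here are illustrative.

```python
import random
import statistics


def tree_active_learning(f, budget, n_leaves=4, init=3, seed=0):
    """Toy leaf-variance active learner on [0, 1]: equal-width leaves,
    a few seed samples per leaf, then greedy querying of the leaf with
    the highest empirical response variance."""
    rng = random.Random(seed)
    leaves = [[] for _ in range(n_leaves)]

    def query(leaf):
        # Draw a point uniformly inside the leaf and record a noisy label.
        x = (leaf + rng.random()) / n_leaves
        leaves[leaf].append(f(x) + rng.gauss(0, 0.05))

    for leaf in range(n_leaves):
        for _ in range(init):
            query(leaf)
    while sum(len(v) for v in leaves) < budget:
        worst = max(range(n_leaves),
                    key=lambda i: statistics.pvariance(leaves[i]))
        query(worst)
    return [len(v) for v in leaves]
```

With a target that is flat except in the last quarter of the domain, nearly the entire post-seeding budget flows to the complex leaf, which is the qualitative advantage over random sampling claimed above.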
Sequential Adaptive Design for Jump Regression Estimation
A new strategy for selecting the design points of a regression model is presented for the case where the underlying regression function is discontinuous; some statistical properties under a fixed design are established first, and these properties are then used to propose a new criterion for selecting the design points for the regression analysis.
Sequential Adaptive Design for Jump Regression Estimation in Materials Discovery
A simple and effective adaptive design strategy for a regression analysis with discontinuities is proposed: some statistical properties with a fixed design will be presented first, and then these properties will be used to propose a new criterion of selecting the design points for the regression analysis.


Distilled Sensing: Adaptive Sampling for Sparse Detection and Estimation
It is shown that using adaptive sampling, reliable detection is possible provided the amplitude exceeds a constant, and localization is possible when the amplitude exceeds any arbitrarily slowly growing function of the dimension.
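The distilled-sensing idea can be sketched in a few lines (a simplified caricature: one observation per surviving coordinate per round, rather than the paper's budgeted precision allocation). Each round discards coordinates whose noisy observation is non-positive, so roughly half the null coordinates are shed per round while large positive entries tend to survive.

```python
import random


def distilled_sensing(signal, rounds=3, sigma=1.0, seed=0):
    """Distilled-sensing-style sketch: in each round, observe every
    surviving coordinate once in noise and keep only coordinates whose
    observation is positive. Nulls survive a round with probability
    1/2; entries well above the noise level almost always survive."""
    rng = random.Random(seed)
    alive = list(range(len(signal)))
    for _ in range(rounds):
        alive = [i for i in alive if signal[i] + rng.gauss(0, sigma) > 0]
    return alive
```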
Adaptive sensing performance lower bounds for sparse estimation and testing
The results show that the adaptive sensing methodologies proposed previously in the literature are essentially optimal, and cannot be substantially improved, and provide further insights on the limits of adaptive compressive sensing.
On the Fundamental Limits of Adaptive Sensing
It is proved that the advantages offered by clever adaptive strategies and sophisticated estimation procedures, no matter how intractable, over classical compressed acquisition/recovery schemes are, in general, minimal.
Group testing strategies for recovery of sparse signals in noise
  • M. Iwen
  • Computer Science
    2009 Conference Record of the Forty-Third Asilomar Conference on Signals, Systems and Computers
  • 2009
It is demonstrated that group testing measurement matrix constructions may be combined with statistical binary detection and estimation methods to produce efficient adaptive sequential algorithms for sparse signal support recovery.
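A noiseless bisection caricature shows why adaptive group testing recovers sparse supports efficiently (the paper's setting is noisy and combines group-testing matrices with statistical binary detection; this sketch keeps only the adaptive-splitting skeleton). A "group test" on an interval reports whether it contains any nonzero entry, so recursing on positive halves finds a k-sparse support in O(k log n) tests instead of n.

```python
def find_support(signal, lo=0, hi=None):
    """Noiseless adaptive group test via bisection: test whether the
    block [lo, hi) contains any nonzero entry, and recurse into both
    halves only when the test is positive."""
    if hi is None:
        hi = len(signal)
    if not any(signal[lo:hi]):
        return []          # group test negative: whole block is zero
    if hi - lo == 1:
        return [lo]        # singleton block with a nonzero entry
    mid = (lo + hi) // 2
    return find_support(signal, lo, mid) + find_support(signal, mid, hi)
```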
Ideal spatial adaptation by wavelet shrinkage
SUMMARY With ideal spatial adaptation, an oracle furnishes information about how best to adapt a spatially variable estimator, whether piecewise constant, piecewise polynomial, variable knot spline, …
Rates of convergence in active learning
The general problem of model selection for active learning with a nested hierarchy of hypothesis classes is studied, and an algorithm is proposed whose error rate provably converges to the best achievable error among classifiers in the hierarchy, at a rate adaptive to both the complexity of the optimal classifier and the noise conditions.
Faster Rates in Regression via Active Learning
A practical algorithm capable of exploiting the extra flexibility of the active setting and provably improving upon the classical passive techniques is described.
Data‐Driven Bandwidth Selection in Local Polynomial Fitting: Variable Bandwidth and Spatial Adaptation
When estimating a mean regression function and its derivatives, locally weighted least squares regression has proven to be a very attractive technique. The present paper focuses on the important …
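The smoother whose bandwidth that paper selects from data is the local polynomial fit; a minimal local-linear version with a fixed, user-supplied bandwidth h is sketched below (the data-driven, spatially variable bandwidth selection itself is not shown). The Gaussian kernel and the 2x2 closed-form solve are illustrative choices.

```python
import math


def local_linear(xs, ys, x0, h):
    """Local linear fit at x0 with Gaussian kernel weights and
    bandwidth h: solve the 2x2 weighted least-squares normal equations
    for intercept and slope, and return the intercept, which is the
    fitted value at x0."""
    w = [math.exp(-((x - x0) / h) ** 2 / 2) for x in xs]
    s0 = sum(w)
    s1 = sum(wi * (x - x0) for wi, x in zip(w, xs))
    s2 = sum(wi * (x - x0) ** 2 for wi, x in zip(w, xs))
    t0 = sum(wi * y for wi, y in zip(w, ys))
    t1 = sum(wi * (x - x0) * y for wi, x, y in zip(w, xs, ys))
    det = s0 * s2 - s1 * s1
    return (s2 * t0 - s1 * t1) / det
```

A useful sanity check is that a local linear fit reproduces exactly linear data regardless of the bandwidth, one of the properties that makes the method attractive.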
Optimal spatial adaptation to inhomogeneous smoothness: an approach based on kernel estimates with variable bandwidth selectors
A new variable bandwidth selector for kernel estimation is proposed. The application of this bandwidth selector leads to kernel estimates that achieve optimal rates of convergence over Besov classes.
Wavelet Shrinkage: Asymptopia?
A method for curve estimation from n noisy data is proposed: translate the empirical wavelet coefficients towards the origin by an amount √(2 log n)/√n. Loose parallels are drawn with near-optimality in robustness and with the broad near-eigenfunction properties of wavelets themselves.
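The shrinkage rule above can be sketched concretely with a Haar transform (a toy implementation assuming equispaced data whose length is a power of two; real wavelet shrinkage would use smoother wavelets via a library such as PyWavelets). Thresholding the orthonormal coefficients of the raw data at σ√(2 log n) corresponds to the √(2 log n)/√n amount quoted above on the normalized function scale.

```python
import math


def haar(xs):
    """Full orthonormal Haar transform (input length must be 2**k).
    Returns [coarsest average, coarsest detail, ..., finest details]."""
    out, approx = [], list(xs)
    while len(approx) > 1:
        pairs = list(zip(approx[0::2], approx[1::2]))
        out = [(a - b) / math.sqrt(2) for a, b in pairs] + out
        approx = [(a + b) / math.sqrt(2) for a, b in pairs]
    return approx + out


def inv_haar(cs):
    """Invert haar(): rebuild level by level from coarse to fine."""
    approx, rest = cs[:1], cs[1:]
    while rest:
        detail, rest = rest[:len(approx)], rest[len(approx):]
        approx = [v for a, d in zip(approx, detail)
                  for v in ((a + d) / math.sqrt(2), (a - d) / math.sqrt(2))]
    return approx


def wavelet_denoise(ys, sigma):
    """Soft-threshold the empirical Haar detail coefficients at the
    universal level sigma * sqrt(2 log n), keeping the coarse average."""
    t = sigma * math.sqrt(2 * math.log(len(ys)))
    cs = haar(ys)
    shrunk = cs[:1] + [math.copysign(max(abs(c) - t, 0.0), c)
                       for c in cs[1:]]
    return inv_haar(shrunk)
```

Soft thresholding kills small coefficients outright (pure-noise input comes back as zero) while merely shrinking the large coefficients that carry jumps and peaks, which is the source of the spatial adaptivity discussed throughout this listing.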