• Corpus ID: 245537480

Improving Nonparametric Classification via Local Radial Regression with an Application to Stock Prediction

Authors: Ruixing Cao, Akifumi Okuno, Kei Nakagawa, Hidetoshi Shimodaira
For supervised classification problems, this paper considers estimating the query's label probability through local regression on observed covariates. The well-known nonparametric kernel smoother and the k-nearest neighbor (k-NN) estimator, which take the label average over a ball around the query, are consistent but asymptotically biased, particularly when the radius of the ball is large. To remove this bias, local polynomial regression (LPoR) and multiscale k-NN (MS-k-NN) learn the bias term by local…
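The ball-average estimator and the idea of a local-regression bias correction described above can be sketched as follows. This is an illustrative sketch, not the paper's local radial regression estimator: `knn_label_probability` is the plain k-NN label average, and `local_linear_label_probability` fits a weighted linear model at the query in the spirit of LPoR. Both function names are ours.

```python
import numpy as np

def knn_label_probability(X, y, query, k):
    """Plain k-NN estimate of P(y=1 | query): average the 0/1 labels
    of the k nearest training points (the 'ball average' in the text)."""
    dists = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return y[nearest].mean()

def local_linear_label_probability(X, y, query, k):
    """Bias-corrected variant in the spirit of local polynomial
    regression: fit a linear model on the k nearest neighbors, with
    covariates centered at the query, and read off the intercept,
    i.e. the fitted value at the query point."""
    dists = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    Z = np.hstack([np.ones((k, 1)), X[nearest] - query])  # intercept + centered covariates
    coef, *_ = np.linalg.lstsq(Z, y[nearest], rcond=None)
    return coef[0]  # fitted value at the query
```

For larger k the plain average pools labels from farther away, which is where the asymptotic bias the abstract mentions comes from; the linear fit absorbs the leading (linear) part of that bias.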

Conditional quantile estimation by local logistic regression
In this article, we propose a new nonparametric estimator of the conditional quantile function, based on locally fitting a logistic model, and compare the new proposal with some existing…
Optimal weighted nearest neighbour classifiers
An asymptotic expansion for the excess risk (regret) of a weighted nearest-neighbour classifier is derived, and it is argued that improvements in the rate of convergence are possible under stronger smoothness assumptions, provided negative weights are allowed.
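A minimal sketch of a weighted nearest-neighbour rule of the kind analysed above, assuming ±1 labels and Euclidean covariates. The cited result concerns how to choose the weights (possibly negative); this sketch takes an arbitrary weight vector as given, and the function name is ours.

```python
import numpy as np

def weighted_nn_classify(X, y, query, weights):
    """Weighted nearest-neighbour rule: order the training points by
    distance to the query and take a weighted vote over their +1/-1
    labels. len(weights) plays the role of k; negative entries are
    allowed, which is the point of the cited analysis."""
    order = np.argsort(np.linalg.norm(X - query, axis=1))
    k = len(weights)
    score = np.dot(weights, y[order[:k]])  # signed weighted vote
    return 1 if score >= 0 else -1
```

With uniform positive weights this reduces to the ordinary k-NN majority vote.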
Locally Weighted Regression: An Approach to Regression Analysis by Local Fitting
Locally weighted regression, or loess, is a way of estimating a regression surface through a multivariate smoothing procedure, fitting a function of the independent variables locally and in…
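The local-fitting idea behind loess can be sketched in one dimension: for each evaluation point, take the nearest fraction of the data, weight it with the tricube kernel, and fit a weighted line. A minimal sketch under those assumptions, not the full loess algorithm (no robustness iterations; the function name is ours).

```python
import numpy as np

def loess_point(x, y, x0, frac=0.5):
    """One loess-style fit at x0 for 1-D data: tricube weights on the
    nearest ceil(frac*n) points, weighted linear least squares,
    fitted value at x0 returned."""
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]
    h = d[idx].max() or 1.0            # local bandwidth = k-th distance
    w = (1 - (d[idx] / h) ** 3) ** 3   # tricube kernel weights
    W = np.sqrt(np.clip(w, 0.0, None))
    A = np.column_stack([np.ones(k), x[idx] - x0])
    coef, *_ = np.linalg.lstsq(A * W[:, None], y[idx] * W, rcond=None)
    return coef[0]                     # fitted value at x0
```

Sliding x0 across a grid and collecting the fitted values traces out the smoothed curve.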
Nearest neighbor pattern classification
The nearest neighbor decision rule assigns to an unclassified sample point the classification of the nearest of a set of previously classified points, so it may be said that half the classification information in an infinite sample set is contained in the nearest neighbor.
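The rule itself is a one-liner; the sketch below assumes numeric covariates and Euclidean distance.

```python
import numpy as np

def nearest_neighbor_rule(X, y, query):
    """Cover-Hart nearest neighbor rule: assign the query the label of
    its single closest training point. Asymptotically its risk is at
    most twice the Bayes risk, which is the sense in which 'half the
    classification information' lives in the nearest neighbor."""
    return y[np.argmin(np.linalg.norm(X - query, axis=1))]
```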
Stock Price Prediction with Fluctuation Patterns Using Indexing Dynamic Time Warping and k^* -Nearest Neighbors
The proposed algorithm applies the k^*-nearest neighbor algorithm with indexing dynamic time warping (IDTW) to predict stock prices for major stock indices and to assist users in making informed investment decisions. Experimental results show that the proposed method is more effective for predicting monthly stock price changes than methods proposed by previous studies.
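Dynamic time warping, the elastic distance the method above pairs with indexing and k^*-NN, is a standard dynamic program. A plain quadratic-time sketch, without the indexing or lower-bounding tricks the paper adds:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D series: the
    minimum total |a_i - b_j| cost over all monotone alignments,
    computed by the classic O(len(a)*len(b)) dynamic program."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

Because the alignment may stretch either series, two patterns that differ only in tempo (e.g. a repeated value) get distance zero, which is what makes DTW attractive for fluctuation-pattern matching.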
Rates of Convergence for Nearest Neighbor Classification
This work analyzes the behavior of nearest neighbor classification in metric spaces and provides finite-sample, distribution-dependent rates of convergence under minimal assumptions, and finds that under the Tsybakov margin condition the convergence rate of nearest neighbors matches recently established lower bounds for nonparametric classification.
Bandwidth choice for nonparametric classification
It is shown that, for kernel-based classification with univariate distributions and two populations, optimal bandwidth choice has a dichotomous character: if the two densities cross at just one…
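To see where the bandwidth enters such a classifier, here is a Nadaraya-Watson-style sketch with a Gaussian kernel; the kernel choice and function name are ours, and the cited result is about how the optimal h behaves, not about this particular implementation.

```python
import numpy as np

def kernel_classifier(X, y, query, h):
    """Kernel-based classifier: Gaussian-weighted average of 0/1 labels
    with bandwidth h, thresholded at 1/2. Small h trusts only very
    close points; large h averages broadly."""
    w = np.exp(-np.linalg.norm(X - query, axis=1) ** 2 / (2.0 * h * h))
    return int(np.dot(w, y) / w.sum() >= 0.5)
```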
Introduction to Nonparametric Estimation
  • A. Tsybakov
  • Springer Series in Statistics
  • 2009
The main idea is to introduce the fundamental concepts of the theory while keeping the exposition suitable for a first approach to the field; many important and useful results on optimal and adaptive estimation are provided.
Portfolio Choices with Orthogonal Bandit Learning
This paper presents a bandit algorithm for making online portfolio choices by effectively exploiting correlations among multiple arms, and derives the optimal portfolio strategy as a combination of passive and active investments according to a risk-adjusted reward function.
Local Regression and Likelihood
  • G. Pan
  • 2000
Covers the origins of local regression, fitting with LOCFIT, and optimizing local regression methods.