• Corpus ID: 247318645

Efficient Kirszbraun Extension with Applications to Regression

@inproceedings{Zaichyk2019EfficientKE,
  title={Efficient Kirszbraun Extension with Applications to Regression},
  author={Hanan Zaichyk and Armin Biess and Aryeh Kontorovich and Yury Makarychev},
  year={2019}
}
• Published 28 May 2019
• Computer Science, Mathematics
We introduce a framework for performing regression between two Hilbert spaces. It is based on Kirszbraun’s extension theorem and is, to the best of our knowledge, the first application of this technique to supervised learning. We analyze the statistical and computational aspects of this method. We decompose the task into two stages: training (which corresponds operationally to smoothing/regularization) and prediction (which is achieved via Kirszbraun extension). Both are solved algorithmically…
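The prediction stage described above admits a simple numerical sketch. For an L-Lipschitz map between Hilbert spaces, Kirszbraun's theorem guarantees that each new input x admits a value y satisfying ||y − y_i|| ≤ L·||x − x_i|| for every training pair (x_i, y_i); minimizing the worst-case slack over y recovers such a point. The code below is a minimal illustration of this idea under stated assumptions, not the paper's actual algorithm (whose solver and guarantees differ); the function name `kirszbraun_predict` and the use of a generic Nelder-Mead optimizer are choices of this sketch.

```python
import numpy as np
from scipy.optimize import minimize

def kirszbraun_predict(x, X, Y, L=1.0):
    """One-point Kirszbraun extension at x (illustrative sketch).

    Minimizes  g(y) = max_i ( ||y - Y[i]|| - L * ||x - X[i]|| ).
    If the training map X -> Y is L-Lipschitz, Kirszbraun's theorem
    guarantees min_y g(y) <= 0, so any minimizer y respects
    ||y - Y[i]|| <= L * ||x - X[i]|| for all i.
    """
    dx = np.linalg.norm(X - x, axis=1)          # input-space distances
    def objective(y):
        return np.max(np.linalg.norm(Y - y, axis=1) - L * dx)
    y0 = Y.mean(axis=0)                         # start from the output centroid
    res = minimize(objective, y0, method="Nelder-Mead",
                   options={"xatol": 1e-9, "fatol": 1e-9, "maxiter": 5000})
    return res.x
```

As a sanity check, feeding it data generated by an isometry (e.g., a planar rotation, which is 1-Lipschitz) should return a prediction whose distances to all training outputs stay within the Lipschitz budget.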

References

Showing 1-10 of 30 references
Efficient Regression in Metric Spaces via Approximate Lipschitz Extension
• Computer Science
IEEE Transactions on Information Theory
• 2017
Designs a regression algorithm whose speed and generalization performance depend on, and adapt to, the intrinsic dimension of the data; the main innovation is algorithmic.
Alternating Minimization for Regression Problems with Vector-valued Outputs
• Computer Science, Mathematics
NIPS
• 2015
This work provides finite-sample upper and lower bounds on the estimation error of OLS and MLE in two popular models, (a) the pooled model and (b) the seemingly unrelated regression (SUR) model, and shows that the output of a computationally efficient alternating minimization procedure enjoys the same performance guarantee as MLE.
Multitask Learning
• R. Caruana
• Computer Science
Encyclopedia of Machine Learning and Data Mining
• 1998
Reviews prior work on MTL, presents new evidence that MTL in backprop nets discovers task relatedness without the need for supervisory signals, and presents new results for MTL with k-nearest neighbor and kernel regression.
An Improved Analysis of Alternating Minimization for Structured Multi-Response Regression
• Computer Science
NeurIPS
• 2018
Provides a resampling-free analysis of the alternating minimization algorithm applied to multi-response regression with a structured coefficient parameter, and shows that the statistical error of the parameter can be expressed via a complexity measure, the Gaussian width, which is related to the assumed structure.
Vector-Valued Support Vector Regression
• M. Brudnak
• Mathematics, Computer Science
The 2006 IEEE International Joint Conference on Neural Network Proceedings
• 2006
It is shown that the vector-valued approach yields sparser representations, in terms of support vectors, than aggregated scalar-valued learning.
Foundations of Machine Learning
• Computer Science
• 2012
This graduate-level textbook introduces fundamental concepts and methods in machine learning, provides their theoretical underpinnings, and illustrates key aspects of their application.
A survey on multi-output regression
• Computer Science
WIREs Data Mining Knowl. Discov.
• 2015
This study surveys state-of-the-art multi-output regression methods, categorized as problem transformation and algorithm adaptation methods, and presents the most commonly used performance evaluation measures, publicly available data sets for real-world multi-output regression problems, and open-source software frameworks.
Nonlinear dimension reduction via outer Bi-Lipschitz extensions
• Mathematics, Computer Science
STOC
• 2018
It is shown that for every map f there exists an outer bi-Lipschitz extension f′ whose distortion exceeds that of f by at most a constant factor, and it is proved that given a set X of N points in ℝ^d, there exists a *terminal* dimension reduction embedding of ℝ^d into ℝ^{d′}, which preserves distances.
Estimation and inference in econometrics
• Economics
• 1993
A theme of the text is the use of artificial regressions for estimation, inference, and specification testing of nonlinear models, including diagnostic tests for parameter constancy, serial correlation, heteroscedasticity, and other types of misspecification.
A Refined Laser Method and Faster Matrix Multiplication
• Computer Science
SODA
• 2021
This paper refines the laser method, improving the resulting value bound for most sufficiently large tensors and obtaining the best bound on $\omega$ to date.