• Corpus ID: 8440777

Vanishing Component Analysis

@inproceedings{Livni2013VanishingCA,
  title={Vanishing Component Analysis},
  author={Roi Livni and David Lehavi and Sagi Schein and Hila Nachlieli and Shai Shalev-Shwartz and Amir Globerson},
  booktitle={ICML},
  year={2013}
}
The vanishing ideal of a set of points, S ⊂ R^n, is the set of all polynomials that attain the value of zero on all the points in S. Such ideals can be compactly represented using a small set of polynomials known as generators of the ideal. Here we describe and analyze an efficient procedure that constructs a set of generators of a vanishing ideal. Our procedure is numerically stable, and can be used to find approximately vanishing polynomials. The resulting polynomials capture nonlinear… 
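The abstract's central object, a polynomial that approximately vanishes on a point set, can be illustrated with a small numerical sketch. The code below is not the paper's VCA algorithm (which builds generators degree by degree with a stability analysis); it only demonstrates the underlying idea of finding a near-vanishing polynomial as the smallest singular vector of a monomial evaluation matrix. All function names are illustrative, not the paper's API.

```python
import numpy as np

def monomial_features(X, degree=2):
    """Evaluate all bivariate monomials up to `degree` on the rows of X."""
    x, y = X[:, 0], X[:, 1]
    cols, names = [], []
    for i in range(degree + 1):
        for j in range(degree + 1 - i):
            cols.append(x**i * y**j)
            names.append(f"x^{i} y^{j}")
    return np.column_stack(cols), names

def approx_vanishing_poly(X, degree=2):
    """Unit-norm coefficient vector minimizing the evaluation residual on X."""
    M, names = monomial_features(X, degree)
    # The smallest right singular vector of the evaluation matrix M
    # minimizes ||M c|| over unit vectors c; the smallest singular value
    # measures how far the polynomial is from vanishing exactly on X.
    _, s, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[-1], s[-1], names

# Points on the unit circle: x^2 + y^2 - 1 vanishes exactly on them.
t = np.linspace(0, 2 * np.pi, 50, endpoint=False)
X = np.column_stack([np.cos(t), np.sin(t)])
coeffs, residual, names = approx_vanishing_poly(X)
```

On 50 points of the unit circle the recovered coefficient vector is proportional to x² + y² − 1, the generator of the circle's vanishing ideal, and the reported residual is near machine precision.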

Figures and Tables from this paper

Approximate Vanishing Ideal via Data Knotting
TLDR
An approximate vanishing ideal that is tolerant to noisy data and pursues a better algebraic structure is proposed; it accelerates the runtime of classification tasks without degrading classification accuracy.
Spurious Vanishing Problem in Approximate Vanishing Ideal
TLDR
A first general method is proposed that enables various basis construction algorithms to overcome the spurious vanishing problem and takes advantage of the iterative nature of basis construction so that computationally costly operations for coefficient normalization can be circumvented.
Algebraic Clustering of Affine Subspaces
  • M. Tsakiris, R. Vidal
  • Computer Science, Mathematics
    IEEE Transactions on Pattern Analysis and Machine Intelligence
  • 2018
TLDR
It is proved that the homogenization trick, which embeds points in a union of affine subspaces into points in a union of linear subspaces, preserves the general position of the points and the transversality of the union of subspaces in the embedded space, thus establishing the correctness of ASC for affine subspace clustering.
Quadric hypersurface intersection for manifold learning in feature space
TLDR
A manifold learning technique is proposed that is suitable for moderately high dimensions and large datasets; it can be used to introduce an outlier score for arbitrary new points and to improve a given similarity metric by incorporating learned geometric structure into it.
Gradient Boosts the Approximate Vanishing Ideal
TLDR
This paper exploits the gradient to sidestep the spurious vanishing problem in polynomial time, remove symbolically trivial redundant bases, achieve consistent output with respect to translation and scaling of the input, and remove nontrivially redundant bases.
Learning Sparse Polynomial Functions
TLDR
For an unknown polynomial f(x) of degree d with k monomials, it is shown how to reconstruct f, within error ε, given only a set of examples x_i drawn uniformly from the n-dimensional cube, together with evaluations f(x_i) on them.
Nonlinear discriminant analysis based on vanishing component analysis
Dual-to-kernel learning with ideals
TLDR
This paper proposes a theory which unifies kernel learning and symbolic algebraic methods, and illustrates this by proposing two algorithms, IPCA and AVICA, for simultaneous manifold and feature learning, and test their accuracy on synthetic and real world data.
Improvement on the vanishing component analysis by grouping strategy
TLDR
The GVCA method proposed in the paper achieves excellent classification performance with a rapid rate of convergence compared to other statistical learning methods, and uses bagging from ensemble learning to expound and prove the correctness of the strategy of grouping training sets.
Principal Variety Analysis
TLDR
A novel computational framework, Principal Variety Analysis (PVA), is proposed for primarily nonlinear data modeling; it provides more flexible and generalizable models, namely analytical algebraic kinematic models of objects, even in unstructured, uncertain environments.

References

SHOWING 1-10 OF 16 REFERENCES
The Construction of Multivariate Polynomials with Preassigned Zeros
TLDR
An algorithm is presented for constructing a basis of the ideal of all polynomials that vanish at a preassigned set of points; it also yields Newton-type polynomials for pointwise interpolation.
Regression for sets of polynomial equations
TLDR
This work demonstrates how to formulate Stationary Subspace Analysis (SSA), a source separation problem, in terms of ideal regression, which also yields a consistent estimator for SSA, and compares this estimator in simulations with previous optimization-based approaches for SSA.
Approximate computation of zero-dimensional polynomial ideals
Nonlinear Component Analysis as a Kernel Eigenvalue Problem
TLDR
A new method for performing a nonlinear form of principal component analysis by the use of integral operator kernel functions is proposed and experimental results on polynomial feature extraction for pattern recognition are presented.
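Since VCA is often contrasted with kernel methods, a minimal sketch of the kernel PCA procedure summarized above may be useful: nonlinear PCA computed as the eigendecomposition of a centered kernel (Gram) matrix. This is a toy reimplementation assuming an RBF kernel, not the reference's own code; names like `kernel_pca` are illustrative.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    # Double-center the kernel matrix, i.e. center the (implicit)
    # feature-space representation before the eigendecomposition.
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)          # ascending eigenvalues
    vals, vecs = vals[::-1], vecs[:, ::-1]   # reorder to descending
    # Projections of the training points onto the top components.
    return vecs[:, :n_components] * np.sqrt(np.maximum(vals[:n_components], 0))

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
Z = kernel_pca(X, n_components=2)
```

The returned columns are ordered by explained variance in feature space, mirroring how ordinary PCA orders principal components.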
Algebraic and geometric methods in statistics
This up-to-date account of algebraic statistics and information geometry explores the emerging connections between the two disciplines, demonstrating how they can be used in design of experiments and…
Ideals, varieties, and algorithms - an introduction to computational algebraic geometry and commutative algebra (2. ed.)
TLDR
An introduction to computational algebraic geometry and commutative algebra, developing the close relationship between ideals, varieties, and algorithms for solving systems of polynomial equations, written with teaching purposes in mind.
Ideals, Varieties, and Algorithms: An Introduction to Computational Algebraic Geometry and Commutative Algebra, 3/e (Undergraduate Texts in Mathematics)
Algebraic geometry is the study of systems of polynomial equations in one or more variables, asking such questions as whether a given system has solutions and how those solutions can be described; this book is an undergraduate introduction to the subject.
Convex Optimization
TLDR
A comprehensive introduction to the subject of convex optimization shows in detail how such problems can be solved numerically with great efficiency.
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
TLDR
This book is an excellent choice for readers who wish to familiarize themselves with computational intelligence techniques or for an overview/introductory course in the field of computational intelligence.