An L1 Representer Theorem for Multiple-Kernel Regression

@article{Aziznejad2018AnLR,
  title={An L1 Representer Theorem for Multiple-Kernel Regression},
  author={Shayan Aziznejad and Michael Unser},
  journal={ArXiv},
  year={2018},
  volume={abs/1811.00836}
}

The theory of RKHS provides an elegant framework for supervised learning. It is the foundation of all kernel methods in machine learning. Implicit in its formulation is the use of a quadratic regularizer, associated with the underlying inner product, which imposes smoothness constraints. In this paper, we consider instead the generalized total-variation (gTV) norm as the sparsity-promoting regularizer. This leads us to propose a new Banach-space framework that justifies the use of generalized…
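
As context for the truncated abstract, it may help to recall the classical RKHS result that this work generalizes; the display below is a sketch in our own notation (the loss E, the strictly increasing function g, and the coefficients a_i are assumptions of this sketch, not symbols taken from the paper), not the paper's statement of its L1 theorem.

% Classical representer theorem (Schoelkopf-Herbrich-Smola):
% any minimizer over an RKHS \mathcal{H} with reproducing kernel k of
\min_{f \in \mathcal{H}} \; \sum_{i=1}^{m} E\bigl(y_i, f(x_i)\bigr) + g\bigl(\|f\|_{\mathcal{H}}\bigr)
% admits the finite kernel expansion
f^{\star}(x) = \sum_{i=1}^{m} a_i \, k(x, x_i).

% A finite-dimensional analogue of swapping the quadratic RKHS norm
% for a sparsity-promoting l1 penalty is the kernel LASSO,
% with Gram matrix K_{ij} = k(x_i, x_j):
\min_{a \in \mathbb{R}^{m}} \; \tfrac{1}{2} \| y - K a \|_{2}^{2} + \lambda \| a \|_{1}.

The l1 penalty drives many coefficients a_i to zero, so only a few kernel atoms survive in the expansion; the paper's gTV framework can be read as a continuous-domain, multiple-kernel counterpart of this idea, and its precise representer theorem is not reproduced here.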
Cited by:
  • TV-based Reconstruction of Periodic Functions
  • Hybrid-Spline Dictionaries for Continuous-Domain Inverse Problems
