Corpus ID: 218889703

Hard Shape-Constrained Kernel Machines

@article{AubinFrankowski2020HardSK,
  title={Hard Shape-Constrained Kernel Machines},
  author={Pierre-Cyril Aubin-Frankowski and Zolt{\'a}n Szab{\'o}},
  journal={ArXiv},
  year={2020},
  volume={abs/2005.12636}
}
Shape constraints (such as non-negativity, monotonicity, convexity) play a central role in a large number of applications, as they usually improve performance for small sample size and help interpretability. However, enforcing these shape requirements in a hard fashion is an extremely challenging problem. Classically, this task is tackled (i) in a soft way (without out-of-sample guarantees), (ii) by specialized transformation of the variables on a case-by-case basis, or (iii) by using highly…
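
The paper's core device can be sketched in a few lines: finitely many tightened constraints at covering points imply the shape constraint everywhere. Below is a minimal, hypothetical Python/cvxpy illustration (not the authors' code) for non-negativity of a kernel ridge fit; the margin eta is a hand-picked placeholder that in the paper is derived from the covering radius and the kernel's modulus of continuity.

    # Minimal sketch (not the authors' implementation) of the SOC tightening
    # idea: enforce f(x_m) >= eta * ||f||_H at finitely many covering points,
    # which guarantees f >= 0 on the whole interval for a suitable eta.
    import numpy as np
    import cvxpy as cp

    def gauss_kernel(A, B, sigma=0.3):
        return np.exp(-((A[:, None] - B[None, :]) ** 2) / (2 * sigma ** 2))

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, 30)                       # training inputs
    y = np.maximum(np.sin(2 * np.pi * x), 0) + 0.1 * rng.standard_normal(30)

    grid = np.linspace(0, 1, 50)                    # virtual covering points
    K = gauss_kernel(x, x)
    K_grid = gauss_kernel(grid, x)
    L = np.linalg.cholesky(K + 1e-8 * np.eye(len(x)))   # K = L @ L.T

    a = cp.Variable(len(x))                         # representer coefficients
    fit = cp.sum_squares(K @ a - y)
    reg = cp.sum_squares(L.T @ a)                   # equals a.T K a = ||f||_H^2
    eta = 0.05                                      # placeholder tightening margin
    soc = [K_grid @ a >= eta * cp.norm(L.T @ a)]    # hard SOC constraint
    cp.Problem(cp.Minimize(fit + 1e-3 * reg), soc).solve()
    f_hat = K_grid @ a.value                        # fitted values on the grid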

Citations

Deep Local Volatility

TLDR
A deep learning approach to the interpolation of European vanilla option prices which jointly yields the full surface of local volatilities, using the Dupire formula to enforce bounds on the local volatility associated with option prices during the network fitting.
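
For context, the Dupire formula this work builds on expresses local variance as a ratio of call-surface derivatives. A hedged numpy sketch follows, with zero rates and dividends assumed and finite differences standing in for the paper's network derivatives:

    # Illustration of the Dupire formula (zero rates/dividends):
    # sigma_loc^2(T, K) = (dC/dT) / (0.5 * K^2 * d2C/dK2).
    # call_surface[i, j] holds prices at maturity T[i] and strike K[j];
    # positivity of both derivatives is the no-arbitrage condition the
    # paper enforces during network fitting.
    import numpy as np

    def dupire_local_vol(call_surface, T, K):
        dC_dT = np.gradient(call_surface, T, axis=0)
        d2C_dK2 = np.gradient(np.gradient(call_surface, K, axis=1), K, axis=1)
        denom = 0.5 * K[None, :] ** 2 * d2C_dK2
        # Clip to keep the square root defined when finite differences are noisy.
        return np.sqrt(np.clip(dC_dT, 1e-12, None) / np.clip(denom, 1e-12, None))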

Linearly Constrained Linear Quadratic Regulator from the Viewpoint of Kernel Methods

TLDR
This study presents how matrix-valued reproducing kernels allow for an alternative viewpoint on the linear quadratic regulator problem, and introduces a strengthened continuous-time convex optimization problem which can be tackled exactly with finite-dimensional solvers and whose solution is interior to the constraints.

Handling Hard Affine SDP Shape Constraints in RKHSs

TLDR
A unified and modular convex optimization framework, relying on second-order cone (SOC) tightening, to encode hard affine SDP constraints on function derivatives, for models belonging to vector-valued reproducing kernel Hilbert spaces (vRKHSs).

Comparing optimistic and pessimistic constraint evaluation in shape-constrained symbolic regression

TLDR
The results indicate that the optimistic approach works better for predicting out-of-domain points (extrapolation), the pessimistic approach works better for high noise levels, and both methods are effective on different problem instances.
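
The optimistic/pessimistic distinction is easy to make concrete: sampling can miss violations between sample points, while interval arithmetic gives conservative but sound bounds. A toy sketch with a hypothetical derivative and a monotonicity constraint f' >= 0:

    # Optimistic check: test f' >= 0 at sampled points only (may miss dips).
    # Pessimistic check: use a guaranteed interval lower bound on f' (sound
    # but conservative). Here f'(x) = (x - 1)^2 - 1e-6 dips below zero only
    # in a tiny region around x = 1 that the sample grid misses.
    import numpy as np

    def optimistic_ok(df, lo, hi, n=100):
        return bool(np.all(df(np.linspace(lo, hi, n)) >= 0))

    df = lambda x: (x - 1.0) ** 2 - 1e-6
    print(optimistic_ok(df, 0.0, 2.0))   # True: every sample is > 0
    df_lower_bound = -1e-6               # exact interval minimum on [0, 2]
    print(df_lower_bound >= 0)           # False: pessimistic check rejects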

Sparse Representations of Positive Functions via First- and Second-Order Pseudo-Mirror Descent

TLDR
First- and second-order variants of stochastic mirror descent employing pseudo-gradients and complexity-reducing projections are developed, which establish trade-offs between the radius of convergence of the expected sub-optimality and the projection budget parameter, as well as non-asymptotic bounds on the model complexity.
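
The use of mirror descent for positivity can be motivated by a simple fact: with the negative-entropy mirror map the update is multiplicative, so iterates never leave the positive orthant. A generic sketch, not the paper's kernelized, pseudo-gradient method:

    # Entropic mirror descent: w <- w * exp(-step * grad) stays positive for
    # any gradient, so positivity needs no explicit projection.
    import numpy as np

    def entropic_mirror_descent(grad, w0, step=0.1, iters=200):
        w = np.asarray(w0, dtype=float)
        for _ in range(iters):
            w = w * np.exp(-step * grad(w))
        return w

    # Example: minimize sum((w - 2)^2) over w > 0; converges to [2., 2.].
    print(entropic_mirror_descent(lambda w: 2 * (w - 2.0), [1.0, 5.0]))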

Using Shape Constraints for Improving Symbolic Regression Models

TLDR
This work describes and analyzes algorithms for shape-constrained symbolic regression, which allows the inclusion of prior knowledge about the shape of the regression function, and implements shape constraints using a soft-penalty approach in which multi-objective algorithms minimize constraint violations and training error.
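
The soft-penalty idea contrasts directly with the hard guarantees of the main paper: violations are measured on a grid and traded off against training error rather than excluded. A scalarized toy version (the cited work keeps violation and error as separate objectives):

    # Scalarized stand-in for the soft-penalty approach: monotonicity
    # violations of the model's derivative on a grid are added to the MSE.
    import numpy as np

    def penalized_loss(f, df, X, y, grid, lam=10.0):
        mse = np.mean((f(X) - y) ** 2)
        violation = np.sum(np.maximum(0.0, -df(grid)))   # how far f' < 0
        return mse + lam * violation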

Learning sparse conditional distribution: An efficient kernel-based approach

TLDR
A novel method to recover the sparse structure of the conditional distribution, which plays a crucial role in subsequent statistical analyses such as prediction, forecasting, and conditional distribution estimation; the method can be efficiently implemented by optimizing its dual form.

A Dimension-free Computational Upper-bound for Smooth Optimal Transport Estimation

TLDR
A statistical estimator of smooth optimal transport is derived which achieves on average a precision ε for a computational cost of Õ(·) as the smoothness increases, hence yielding a dimension-free rate.

References

Showing 1-10 of 68 references

A Computational Framework for Multivariate Convex Regression and Its Variants

TLDR
A scalable algorithmic framework based on the augmented Lagrangian method is proposed to compute the nonparametric least squares estimator (LSE) of a multivariate convex regression function, and a novel approach is developed to obtain smooth convex approximations to the fitted (piecewise-affine) convex LSE.
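
The underlying estimator is a quadratic program over fitted values and subgradients; a small cvxpy stand-in follows (the reference's contribution is an augmented-Lagrangian solver that scales past the n^2 pairwise constraints written out here):

    # Convex least-squares regression: fitted values theta_i and subgradients
    # xi_i must satisfy theta_j >= theta_i + <xi_i, x_j - x_i> for all pairs.
    import numpy as np
    import cvxpy as cp

    def convex_lse(X, y):
        n, d = X.shape
        theta = cp.Variable(n)              # f(x_i)
        xi = cp.Variable((n, d))            # subgradients of f at x_i
        cons = [theta[j] >= theta[i] + xi[i] @ (X[j] - X[i])
                for i in range(n) for j in range(n) if i != j]
        cp.Problem(cp.Minimize(cp.sum_squares(theta - y)), cons).solve()
        return theta.value, xi.value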

Optimization over Nonnegative and Convex Polynomials With and Without Semidefinite Programming

  • G. Hall
  • Computer Science, Mathematics
  • ArXiv
  • 2018
TLDR
This thesis provides the first theoretical framework for constructing converging hierarchies of lower bounds on polynomial optimization problems (POPs) whose computation simply requires the ability to multiply certain fixed polynomials together and to check nonnegativity of the coefficients of their product.
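
A classical one-variable instance of such a hierarchy, offered here as a hypothetical illustration in the spirit of Pólya's theorem rather than the thesis's exact construction: p is certified nonnegative on [0, ∞) if (1 + x)^r · p(x) has only nonnegative coefficients for some level r.

    # Level-r certificate: multiply by the fixed polynomial (1 + x)^r and
    # check coefficient nonnegativity. p(x) = x^2 - x + 1 is positive on
    # [0, inf) but has a negative coefficient; level r = 1 already certifies:
    # (1 + x)(1 - x + x^2) = 1 + x^3.
    import numpy as np

    def certified_nonneg(p_coeffs, r):
        mult = (np.poly1d([1, 1]) ** r).coeffs[::-1]   # (1 + x)^r, low degree first
        return bool(np.all(np.polymul(mult, p_coeffs) >= -1e-12))

    p = [1, -1, 1]                                     # 1 - x + x^2, low degree first
    print([certified_nonneg(p, r) for r in range(4)])  # [False, True, True, True]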

Functional estimation under shape constraints

In the problem of nonparametric regression for a fixed design model, we may want to use additional information about the shape of the regression function, when available, to improve the estimation.

Shape-Constrained Estimation Using Nonnegative Splines

TLDR
A general computational framework, based on polynomial splines, for nonparametric estimation of unknown smooth functions in the presence of restrictions on the shape of the estimator and on its support, together with a simpler approach in which nonnegative splines are approximated by splines whose pieces are polynomials with nonnegative coefficients in a nonnegative basis.

Projected Stochastic Primal-Dual Method for Constrained Online Learning with Kernels

TLDR
A primal-dual method is developed which executes alternating projected primal/dual stochastic descent/ascent on the dual-augmented Lagrangian of the constrained online learning problem with kernels.
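
The generic template behind such methods, stripped of the kernel machinery, alternates a primal gradient step on the Lagrangian with projected dual ascent; a hedged single-constraint sketch:

    # For min_theta L(theta) s.t. g(theta) <= 0: gradient descent on the
    # Lagrangian in theta, then dual ascent projected onto lambda >= 0.
    # The cited work performs these updates over kernel function classes.
    import numpy as np

    def primal_dual(grad_L, g, grad_g, theta, steps=1000, eta=0.01):
        lam = 0.0
        for _ in range(steps):
            theta = theta - eta * (grad_L(theta) + lam * grad_g(theta))
            lam = max(0.0, lam + eta * g(theta))    # projection onto lam >= 0
        return theta, lam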

Isotonic regression in general dimensions

We study the least squares regression function estimator over the class of real-valued functions on $[0,1]^d$ that are increasing in each coordinate. For uniformly bounded signals and with a fixed…
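
In one dimension this estimator reduces to classical isotonic regression, computable by the pool-adjacent-violators algorithm; a compact sketch follows (the paper's focus is the much harder d-dimensional class):

    # Pool-adjacent-violators: maintain blocks with nondecreasing means,
    # merging adjacent blocks whenever monotonicity is violated.
    import numpy as np

    def pava(y):
        vals, wts = [], []                  # block means and block sizes
        for v in map(float, y):
            vals.append(v); wts.append(1)
            while len(vals) > 1 and vals[-2] > vals[-1]:
                w = wts[-2] + wts[-1]
                vals[-2:] = [(wts[-2] * vals[-2] + wts[-1] * vals[-1]) / w]
                wts[-2:] = [w]
            # invariant: vals is nondecreasing
        return np.repeat(vals, wts)

    print(pava([1.0, 3.0, 2.0, 4.0]))       # [1.  2.5 2.5 4. ]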

Generalized additive and index models with shape constraints

We study generalized additive models, with shape restrictions (e.g. monotonicity, convexity and concavity) imposed on each component of the additive prediction function. We show that this framework…

Semiparametric Estimation under Shape Constraints

Shape constrained smoothing using smoothing splines

TLDR
This work proposes a new method for calculating smoothing splines that fulfill shape constraints, such as the underlying regression curve being positive, monotone, convex or concave, and shows that the resulting problem can be solved using the algorithm of Goldfarb and Idnani.
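
A discretized analogue makes the connection explicit: squared second differences as the roughness penalty plus linear monotonicity constraints yield exactly the strictly convex QP that the Goldfarb-Idnani dual active-set method solves; cvxpy stands in for that solver in this sketch.

    # Monotone smoothing as a QP: least squares + curvature penalty subject
    # to nondecreasing fitted values.
    import numpy as np
    import cvxpy as cp

    def monotone_smooth_fit(y, lam=1.0):
        f = cp.Variable(len(y))
        rough = cp.sum_squares(cp.diff(f, 2))   # discrete second derivative
        prob = cp.Problem(cp.Minimize(cp.sum_squares(f - y) + lam * rough),
                          [cp.diff(f) >= 0])    # monotonicity constraint
        prob.solve()
        return f.value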

Multivariate convex regression: global risk bounds and adaptation

TLDR
Two general model selection methods are developed to provide sieved adaptive estimators (SAE) that achieve nearly optimal rates of convergence for particular "regular" classes of convex functions, while maintaining nearly parametric rate-adaptivity to polyhedral functions in arbitrary dimensions.
...