Various regularization techniques are investigated in supervised learning from data. Theoretical features of the associated optimization problems are studied, and sparse suboptimal solutions are searched for. Rates of approximate optimization are estimated for sequences of suboptimal solutions formed by linear combinations of n-tuples of computational …
Keywords: Region-based segmentation; Variational level set method; Active contours; Self-organizing neurons; Region-based prior knowledge. Abstract: Active Contour Models (ACMs) constitute a powerful energy-based minimization framework for image segmentation, based on the evolution of an active contour. Among ACMs, supervised ACMs are able to exploit the …
Fixed-basis and variable-basis approximation schemes are compared for the problems of function approximation and functional optimization (also known as infinite programming). Classes of problems are investigated for which variable-basis schemes with sigmoidal computational units perform better than fixed-basis ones, in terms of the minimum number of …
Approximation properties of some connectionistic models, commonly used to construct approximation schemes for optimization problems with multivariable functions as admissible solutions, are investigated. Such models are made up of linear combinations of computational units with adjustable parameters. The relationship between model complexity (number of …
Neural networks provide a more flexible approximation of functions than traditional linear regression. In the latter, one can only adjust the coefficients in linear combinations of fixed sets of functions, such as orthogonal polynomials or Hermite functions, while for neural networks, one may also adjust the parameters of the functions which are being …
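The contrast described above can be made concrete with a minimal sketch (not from any of the papers; the target function, basis size, and learning rate are illustrative assumptions): a fixed-basis model is fitted by adjusting only the outer coefficients via linear least squares, while a variable-basis sigmoidal model also adjusts the inner weights and biases of its units, so it must be fitted by a nonlinear method such as gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
y = np.sin(3 * x)  # illustrative target function

# Fixed-basis (linear) scheme: only the coefficients of a fixed set of
# basis functions (here, monomials) are adjusted -- a linear least-squares fit.
n = 5
Phi = np.vander(x, n, increasing=True)      # fixed basis: 1, x, ..., x^4
c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
fixed_err = np.max(np.abs(Phi @ c - y))

# Variable-basis scheme: a linear combination of n sigmoidal units whose
# inner parameters (weights w, biases b) are ALSO adjusted, so the model
# depends nonlinearly on its parameters and is fitted iteratively.
sigma = lambda t: 1.0 / (1.0 + np.exp(-t))
w = rng.normal(size=n); b = rng.normal(size=n); a = rng.normal(size=n)
lr = 0.1
for _ in range(5000):
    H = sigma(np.outer(x, w) + b)           # unit outputs, shape (200, n)
    r = H @ a - y                           # residual
    g = H * (1 - H)                         # sigmoid derivative at each unit
    # gradient steps for the outer coefficients AND the inner parameters
    a -= lr * (H.T @ r) / len(x)
    w -= lr * ((g * a).T @ (r * x)) / len(x)
    b -= lr * ((g * a).T @ r) / len(x)
var_err = np.max(np.abs(sigma(np.outer(x, w) + b) @ a - y))
```

The point of the sketch is structural rather than numerical: in the first model the parameters enter linearly (one matrix solve suffices), while in the second the same number of units carries extra adjustable inner parameters, which is the flexibility the abstract refers to.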
The mathematical foundations of a new theory for the design of intelligent agents are presented. The proposed learning paradigm is centered on the concept of constraint, representing the interactions with the environment, and on the parsimony principle. The classical regularization framework of kernel machines is naturally extended to the case in which the …
Approximation capabilities of two types of computational models are explored: dictionary-based models (i.e., linear combinations of n-tuples of basis functions computable by units belonging to a set called "dictionary") and linear ones (i.e., linear combinations of n fixed basis functions). The two models are compared in terms of approximation rates, i.e., …
A variational norm associated with sets of computational units and used in function approximation, learning from data, and infinite-dimensional optimization is investigated. For sets G_K obtained by varying a vector y of parameters in a fixed-structure computational unit K(·, y) (e.g., the set of Gaussians with free centers and widths), upper and lower bounds …
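A small sketch may help fix the objects named above (the specific centers, widths, and coefficients are illustrative assumptions, not from the paper): each unit K(·, y) is a Gaussian whose parameter vector y collects a free center and width, the set G_K is the family of all such units, and an n-term dictionary-based approximant is a linear combination of units drawn from G_K. For units of sup-norm at most 1, the l1 norm of the outer coefficients gives an upper bound on the G_K-variation norm of the combination, which is the kind of quantity such bounds concern.

```python
import numpy as np

def gaussian_unit(x, center, width):
    """One computational unit K(x, y) from the parameterized family G_K:
    a Gaussian with a free center and width, y = (center, width)."""
    return np.exp(-((x - center) ** 2) / (2.0 * width ** 2))

def dictionary_model(x, params, coeffs):
    """n-term linear combination of units drawn from the set G_K."""
    return sum(c * gaussian_unit(x, ctr, w) for c, (ctr, w) in zip(coeffs, params))

x = np.linspace(-2, 2, 100)
params = [(-1.0, 0.5), (0.0, 0.3), (1.0, 0.5)]   # free centers and widths
coeffs = [1.0, -0.5, 1.0]                        # outer coefficients
f = dictionary_model(x, params, coeffs)

# Since each Gaussian unit has sup-norm 1, the l1 norm of the outer
# coefficients upper-bounds the G_K-variation norm of f.
ell1 = sum(abs(c) for c in coeffs)
```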