
- Bruno Pelletier
- 2010

The nonparametric estimation of the regression function of a real-valued random variable Y on a random object X valued in a closed Riemannian manifold M is considered. A regression estimator which generalizes kernel regression estimators on Euclidean sample spaces is introduced. Under classical assumptions on the kernel and the bandwidth sequence, the…
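As a hedged illustration of the kind of estimator described, the sketch below implements a Nadaraya-Watson-style kernel regression on the unit circle S^1 (the simplest closed Riemannian manifold), replacing the Euclidean distance by the geodesic distance. This is a minimal toy under assumed choices (Gaussian kernel, fixed bandwidth); the paper's estimator is more general, and on the circle the volume-density correction it involves is trivial.

```python
import numpy as np

def geodesic_s1(a, b):
    """Geodesic distance between angles a and b on the unit circle S^1."""
    d = np.abs(a - b) % (2 * np.pi)
    return np.minimum(d, 2 * np.pi - d)

def kernel_regression_s1(theta, X, Y, h):
    """Nadaraya-Watson-style estimate of E[Y | X = theta] on S^1,
    with a Gaussian kernel applied to geodesic distances."""
    w = np.exp(-0.5 * (geodesic_s1(theta, X) / h) ** 2)
    return np.sum(w * Y) / np.sum(w)

# Noise-free toy example: Y = sin(X) sampled on a grid of angles.
X = np.linspace(0, 2 * np.pi, 200, endpoint=False)
Y = np.sin(X)
est = kernel_regression_s1(np.pi / 2, X, Y, h=0.2)
```

The estimate at theta = pi/2 is slightly below the true value sin(pi/2) = 1, reflecting the usual smoothing bias of kernel regression near a maximum.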

We consider the problem of estimating the gradient lines of a density, which can be used to cluster points sampled from that density, for example via the mean-shift algorithm of Fukunaga and Hostetler (1975). We prove general convergence bounds that we then specialize to kernel density estimation.
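The mean-shift idea referenced above can be sketched in a few lines: each point is repeatedly moved to the kernel-weighted average of the sample, ascending the estimated density along its gradient lines toward a mode. A minimal, hedged illustration (Gaussian kernel, fixed bandwidth, 1-D data; not the paper's analysis):

```python
import numpy as np

def mean_shift(points, bandwidth, n_iter=50):
    """Mean-shift iteration with a Gaussian kernel: each point moves to
    the kernel-weighted average of the sample, following the estimated
    density gradient toward a mode (Fukunaga & Hostetler, 1975)."""
    shifted = points.astype(float)
    for _ in range(n_iter):
        for i, p in enumerate(shifted):
            w = np.exp(-0.5 * np.sum((points - p) ** 2, axis=1) / bandwidth ** 2)
            shifted[i] = w @ points / w.sum()
    return shifted

# Two well-separated 1-D groups; each group collapses onto its own mode.
pts = np.array([[0.0], [0.2], [-0.1], [10.0], [9.8], [10.1]])
modes = mean_shift(pts, bandwidth=1.0)
```

Points that converge to the same mode are assigned to the same cluster, which is exactly how gradient lines of the density induce a clustering.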

Remote sensing of ocean color from space, a problem that consists in retrieving spectral marine reflectance from spectral top-of-atmosphere reflectance, is considered as a collection of similar inverse problems continuously indexed by the angular variables influencing the observation process. A general solution is proposed in the form of a field of…

- Bruno Pelletier, Robert Frouin
- Applied optics
- 2006

A methodology is presented for retrieving phytoplankton chlorophyll-a concentration from space. The data to be inverted, namely, vectors of top-of-atmosphere reflectance in the solar spectrum, are treated as explanatory variables conditioned by angular geometry. This approach leads to a continuum of inverse problems, i.e., a collection of similar inverse…

- Bruno Pelletier, Robert Frouin
- ESANN
- 2004

In the context of nonlinear regression, we consider the problem of explaining a variable y from a vector x of explanatory variables and from a vector t of conditioning variables, which influences the link function between y and x. A neural-based solution is proposed in the form of a field of nonlinear regression models, by which it is meant that the…

- Ery Arias-Castro, Bruno Pelletier
- Journal of Machine Learning Research
- 2013

Maximum Variance Unfolding is one of the main methods for (nonlinear) dimensionality reduction. We study its large sample limit, providing specific rates of convergence under standard assumptions. We find that it is consistent when the underlying submanifold is isometric to a convex subset, and we provide some simple examples where it fails to be consistent.

We consider the linear inverse problem of reconstructing an unknown finite measure µ from a noisy observation of a generalized moment of µ, defined as the integral of a continuous and bounded operator Φ with respect to µ. Motivated by various applications, we focus on the case where the operator Φ is unknown; instead, only an approximation Φ_m to it is…
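A heavily simplified, hedged sketch of this moment-inversion setting: assume (purely for illustration) that µ is a discrete measure on a known grid, take Φ_k(t) = cos(kt) as the bounded continuous test functions, and recover the weights by least squares from noisy moments computed with an approximate operator. The paper's framework is far more general; every concrete choice below is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: mu is a discrete measure on grid points t_j with weights w_true.
grid = np.linspace(0.0, 1.0, 5)
w_true = np.array([0.0, 0.5, 0.0, 0.3, 0.2])

# Generalized moments y_k = integral of Phi_k d(mu) = sum_j Phi_k(t_j) w_j,
# with Phi_k(t) = cos(k t) as the (continuous, bounded) test functions.
K = 8
Phi = np.cos(np.outer(np.arange(K), grid))            # exact operator
Phi_m = Phi + 1e-3 * rng.standard_normal(Phi.shape)   # only approximately known
y = Phi @ w_true + 1e-3 * rng.standard_normal(K)      # noisy observation

# Least-squares reconstruction using the approximate operator Phi_m.
w_hat, *_ = np.linalg.lstsq(Phi_m, y, rcond=None)
err = np.max(np.abs(w_hat - w_true))
```

With small noise and operator error, the recovered weights stay close to the true ones; the interesting regime studied in such problems is how the error degrades as the approximation Φ_m worsens.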

- Bruno Pelletier
- Journal of Approximation Theory
- 2004

We study the approximation of a continuous function field over a compact set T by a continuous field of ridge approximants over T, named ridge function fields. We first give general density results about function fields, and show how they apply to ridge function fields. We next discuss the parametrization of sets of ridge function fields, and give…

- Bruno Pelletier, Pierre Pudlo
- Journal of Machine Learning Research
- 2011

Following Hartigan [1975], a cluster is defined as a connected component of the t-level set of the underlying density, i.e., the set of points for which the density is greater than t. A clustering algorithm which combines a density estimate with spectral clustering techniques is proposed. Our algorithm is composed of two steps. First, a nonparametric…
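The two-step idea described (estimate the density, then cluster the t-level set) can be illustrated with a toy sketch: keep only the points whose kernel density estimate exceeds t, and take connected components of a neighborhood graph on them. Note the graph components here are an illustrative stand-in for the paper's spectral clustering step, and all parameter choices are assumptions.

```python
import numpy as np

def kde(x, sample, h):
    """Gaussian kernel density estimate at points x from a 1-D sample."""
    z = (x[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * z ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def level_set_clusters(sample, t, h, radius):
    """Cluster the sample points whose estimated density exceeds t by
    taking connected components of a neighborhood graph on them."""
    dens = kde(sample, sample, h)
    keep = np.where(dens > t)[0]
    # Union-find over the retained points.
    parent = {i: i for i in keep}
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in keep:
        for j in keep:
            if abs(sample[i] - sample[j]) <= radius:
                parent[find(i)] = find(j)
    roots = {find(i) for i in keep}
    return [[sample[i] for i in keep if find(i) == r] for r in sorted(roots)]

# Two tight groups plus one isolated low-density point at 2.5.
data = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2, 2.5])
clusters = level_set_clusters(data, t=0.25, h=0.3, radius=0.5)
```

The isolated point falls below the density threshold t and is discarded, so exactly two clusters remain, matching Hartigan's definition of clusters as connected components of the level set.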

- Bruno Pelletier
- NNSP
- 2003