
- Francis R. Bach, Michael I. Jordan
- ICASSP
- 2002

We present a class of algorithms for independent component analysis (ICA) which use contrast functions based on canonical correlations in a reproducing kernel Hilbert space. On the one hand, we show that our contrast functions are related to mutual information and have desirable mathematical properties as measures of statistical dependence. On the other…

- Julien Mairal, Francis R. Bach, Jean Ponce, Guillermo Sapiro
- Journal of Machine Learning Research
- 2010

Sparse coding—that is, modelling data vectors as sparse linear combinations of basis elements—is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on the large-scale matrix factorization problem that consists of learning the basis set in order to adapt it to specific data. Variations of this problem include…
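Dictionary learning alternates a sparse coding step (fit codes with the dictionary fixed) and a dictionary update. As a rough illustration of the coding step only, here is a minimal ISTA sketch for the lasso subproblem; the function name, data sizes, and parameters are my own choices, not the paper's online algorithm:

```python
import numpy as np

def ista_code(D, x, lam=0.1, n_iter=200):
    """Sparse-code one signal x against dictionary D (columns = atoms)
    by minimising 0.5*||x - D a||^2 + lam*||a||_1 with ISTA."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        g = D.T @ (D @ a - x)              # gradient of the smooth part
        z = a - g / L                      # gradient step
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
a_true = np.zeros(50)
a_true[[3, 17]] = [1.5, -2.0]              # 2-sparse ground truth
x = D @ a_true
a = ista_code(D, x, lam=0.05)
print(np.count_nonzero(np.abs(a) > 1e-2))  # only a few active atoms survive
```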

While classical kernel-based classifiers are based on a single kernel, in practice it is often desirable to base classifiers on combinations of multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for the support vector machine (SVM), and showed that the optimization of the coefficients of such a combination reduces to…
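A conic combination of Gram matrices with nonnegative weights is itself a valid kernel, which is what makes the formulation well posed. A minimal sketch of forming such a combination (weights fixed by hand here; learning them is the point of the cited work, and all names below are mine):

```python
import numpy as np

def combine_kernels(kernels, weights):
    """Conic (nonnegative) combination K = sum_i mu_i K_i of Gram matrices.
    Each K_i is positive semidefinite, so any mu_i >= 0 keeps the
    combination a valid kernel matrix."""
    assert all(w >= 0 for w in weights)
    return sum(w * K for w, K in zip(weights, kernels))

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 5))
K_lin = X @ X.T                                    # linear kernel
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-0.5 * sq)                          # Gaussian (RBF) kernel
K = combine_kernels([K_lin, K_rbf], [0.3, 0.7])
print(np.linalg.eigvalsh(K).min())                 # >= 0 up to round-off
```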

- Julien Mairal, Francis R. Bach, Jean Ponce, Guillermo Sapiro
- ICML
- 2009

Sparse coding—that is, modelling data vectors as sparse linear combinations of basis elements—is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on *learning* the basis set, also called dictionary, to adapt it to specific data, an approach that has recently proven to be very effective for signal…

- Matthew D. Hoffman, David M. Blei, Francis R. Bach
- NIPS
- 2010

We develop an online variational Bayes (VB) algorithm for Latent Dirichlet Allocation (LDA). Online LDA is based on online stochastic optimization with a natural gradient step, which we show converges to a local optimum of the VB objective function. It can handily analyze massive document collections, including those arriving in a stream. We study the…
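The online step blends the running variational parameters with a per-minibatch estimate, using a step size that decays fast enough for convergence. Purely as a toy scalar illustration of that averaging pattern (not the actual LDA update; names and constants are mine):

```python
import numpy as np

def rho(t, tau0=1.0, kappa=0.7):
    """Step size (tau0 + t)^(-kappa); kappa in (0.5, 1] satisfies the
    Robbins-Monro conditions, so the stochastic updates converge."""
    return (tau0 + t) ** (-kappa)

# Toy illustration of the online averaging pattern used by online VB:
# blend the running estimate with a noisy per-minibatch estimate.
rng = np.random.default_rng(2)
lam = 0.0                                  # running parameter (scalar toy)
for t in range(2000):
    lam_hat = rng.normal(loc=5.0, scale=1.0)   # noisy minibatch estimate
    lam = (1 - rho(t)) * lam + rho(t) * lam_hat
print(round(lam, 1))                       # settles near the true mean, 5.0
```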

- Y-Lan Boureau, Francis R. Bach, Yann LeCun, Jean Ponce
- 2010 IEEE Computer Society Conference on Computer…
- 2010

Many successful models for scene or object recognition transform low-level descriptors (such as Gabor filter responses, or SIFT descriptors) into richer representations of intermediate complexity. This process can often be broken down into two steps: (1) a coding step, which performs a pointwise transformation of the descriptors into a representation better…
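The two-step pipeline can be sketched with its simplest instantiation: hard vector quantization for coding, followed by max or average pooling over a region. The helper names and sizes below are mine, not the paper's:

```python
import numpy as np

def hard_code(descriptors, codebook):
    """Coding step: assign each descriptor to its nearest codeword
    (hard vector quantization), giving a one-hot code per descriptor."""
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    codes = np.zeros((len(descriptors), len(codebook)))
    codes[np.arange(len(descriptors)), d2.argmin(1)] = 1.0
    return codes

def pool(codes, how="max"):
    """Pooling step: summarise the codes over an image region."""
    return codes.max(0) if how == "max" else codes.mean(0)

rng = np.random.default_rng(3)
descs = rng.standard_normal((100, 8))     # SIFT-like local descriptors (toy)
book = rng.standard_normal((16, 8))       # codebook / dictionary (toy)
codes = hard_code(descs, book)
print(pool(codes, "max").shape, pool(codes, "avg").sum())
```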

- Julien Mairal, Francis R. Bach, Jean Ponce, Guillermo Sapiro, Andrew Zisserman
- 2009 IEEE 12th International Conference on…
- 2009

We propose in this paper to unify two different approaches to image restoration: On the one hand, learning a basis set (dictionary) adapted to sparse signal descriptions has proven to be very effective in image reconstruction and classification tasks. On the other hand, explicitly exploiting the self-similarities of natural images has led to the successful…

We give a probabilistic interpretation of canonical correlation analysis (CCA) as a latent variable model for two Gaussian random vectors. Our interpretation is similar to the probabilistic interpretation of principal component analysis (Tipping and Bishop, 1999, Roweis, 1998). In addition, we can interpret Fisher linear discriminant analysis (LDA) as CCA…
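Assuming the standard probabilistic CCA formulation, the latent variable model being interpreted can be written as (my notation):

```latex
z \sim \mathcal{N}(0, I_d), \qquad
x_1 \mid z \sim \mathcal{N}(W_1 z + \mu_1, \Psi_1), \qquad
x_2 \mid z \sim \mathcal{N}(W_2 z + \mu_2, \Psi_2)
```

with a single shared Gaussian latent variable z generating both views through view-specific loadings W_i and noise covariances Psi_i; the maximum likelihood estimates of W_1 and W_2 then relate to the canonical correlation directions, mirroring the probabilistic PCA case.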

- Armand Joulin, Francis R. Bach, Jean Ponce
- 2010 IEEE Computer Society Conference on Computer…
- 2010

Purely bottom-up, unsupervised segmentation of a single image into foreground and background regions remains a challenging task for computer vision. Co-segmentation is the problem of simultaneously dividing multiple images into regions (segments) corresponding to different object classes. In this paper, we combine existing tools for bottom-up image…

- Francis R. Bach
- Journal of Machine Learning Research
- 2008

We consider the least-square regression problem with regularization by a block 1-norm, i.e., a sum of Euclidean norms over spaces of dimensions larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the 1-norm where all spaces have dimension one, where it is commonly referred to as the Lasso. In this paper, we…
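The block 1-norm (a sum of per-group Euclidean norms) zeroes out entire groups of coefficients at once. A small sketch of the corresponding group soft-thresholding step, i.e. the proximal operator of the penalty; the function name and the proximal-algorithm framing are mine, not necessarily this paper's method:

```python
import numpy as np

def group_soft_threshold(w, groups, lam):
    """Proximal operator of lam * sum_g ||w_g||_2 (the block 1-norm):
    each group is shrunk radially, and set to zero entirely if its norm
    is <= lam, which is what makes the group Lasso select whole groups."""
    out = np.zeros_like(w)
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > lam:
            out[g] = (1 - lam / norm) * w[g]
    return out

w = np.array([3.0, 4.0, 0.1, -0.1])
groups = [[0, 1], [2, 3]]                 # two 2-dimensional groups
res = group_soft_threshold(w, groups, lam=1.0)
print(res)                                # first group shrunk, second zeroed
```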