
- Eric Moulines, Jean-François Cardoso, Elisabeth Gassiat
- ICASSP
- 1997

In this paper, an approximate maximum likelihood method for blind source separation and deconvolution of noisy signals is proposed. This technique relies upon a data augmentation scheme, where the (unobserved) inputs are viewed as the missing data. In the technique described in this contribution, the input signal distribution is modeled by a mixture of…

- Stéphane Boucheron, Aurélien Garivier, Elisabeth Gassiat
- IEEE Transactions on Information Theory
- 2009

This paper describes universal lossless coding strategies for compressing sources on countably infinite alphabets. Classes of memoryless sources defined by an envelope condition on the marginal distribution provide benchmarks for coding techniques originating from the theory of universal coding over finite alphabets. We prove general upper bounds on minimax…

We give two simple inequalities on likelihood ratios. A first application is the consistency of the maximum-penalized-marginal-likelihood estimator of the number of populations in a mixture with Markov regime. The second application is the derivation of the asymptotic power of the likelihood ratio test under loss of identifiability for contiguous…

- Elisabeth Gassiat, Stéphane Boucheron
- IEEE Trans. Information Theory
- 2003

We consider the estimation of the number of hidden states (the order) of a discrete-time finite-alphabet hidden Markov model (HMM). The estimators we investigate are related to code-based order estimators: penalized maximum-likelihood (ML) estimators and penalized versions of the mixture estimator introduced by Liu and Narayan. We prove strong consistency…
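The penalized maximum-likelihood idea in this abstract can be illustrated with a minimal sketch: fit candidate HMM orders, then pick the order maximizing log-likelihood minus a BIC-style penalty. Everything below is illustrative, not the paper's estimator — the log-likelihood values, sample size, and parameter-count formula are all assumptions made up for the example.

```python
import math

# Hypothetical log-likelihoods obtained by fitting HMMs of order k = 1..5
# to n observations (illustrative numbers, not taken from the paper).
log_likelihoods = {1: -1450.0, 2: -1310.0, 3: -1295.0, 4: -1292.0, 5: -1290.5}
n = 1000           # sample size (assumed)
alphabet_size = 4  # size of the observation alphabet (assumed)

def bic_penalty(k, n, m):
    """BIC-style penalty: half the free-parameter count times log n.
    An order-k HMM over an m-letter alphabet has k*(k-1) free transition
    parameters and k*(m-1) free emission parameters."""
    dim = k * (k - 1) + k * (m - 1)
    return 0.5 * dim * math.log(n)

def estimate_order(ll, n, m):
    # Order maximizing the penalized log-likelihood.
    return max(ll, key=lambda k: ll[k] - bic_penalty(k, n, m))

print(estimate_order(log_likelihoods, n, alphabet_size))  # → 2
```

The penalty grows with the order, so the small likelihood gains from orders 3–5 no longer pay for their extra parameters and the estimator settles on order 2.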

- Imre Csiszár, Fabrice Gamboa, Elisabeth Gassiat
- IEEE Trans. Information Theory
- 1999

- Dominique Bontemps, Stéphane Boucheron, Elisabeth Gassiat
- IEEE Transactions on Information Theory
- 2014

This paper sheds light on adaptive coding with respect to classes of memoryless sources over a countable alphabet defined by an envelope function with finite and non-decreasing hazard rate (log-concave envelope distributions). We prove that the auto-censuring (AC) code is adaptive with respect to the collection of such classes. The analysis builds on the…

We address the issue of order identification for HMMs with Poisson and Gaussian emissions. We prove information-theoretic BIC-like mixture inequalities in the spirit of (Finesso, 1991; Liu & Narayan, 1994; Gassiat & Boucheron, 2003). These inequalities lead to consistent penalized estimators that need no prior bound on the order nor on the parameters of the…

Abstract: Motivated by applications in genetics, we propose to estimate the heritability in high-dimensional sparse linear mixed models. The heritability determines how the variance is shared between the different random components of a linear mixed model. The main novelty of our approach is to consider that the random effects can be sparse, that is…
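The notion of heritability described here — the share of the variance of the response carried by the random effects — can be sketched numerically. The simulation below is an illustrative toy, not the paper's high-dimensional estimator: the dimensions, sparsity level, and variances are assumptions chosen for the example, and the "estimate" is a simple plug-in ratio of empirical variances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sparse linear mixed model y = X @ beta + e, where only s of the
# p random effects are non-zero (all sizes are assumed for illustration).
n, p, s = 500, 200, 5          # observations, predictors, non-zero effects
sigma_u, sigma_e = 1.0, 1.0    # effect and noise standard deviations

X = rng.standard_normal((n, p)) / np.sqrt(p)
beta = np.zeros(p)
beta[:s] = rng.standard_normal(s) * sigma_u  # sparse random effects
e = rng.standard_normal(n) * sigma_e
y = X @ beta + e

# Plug-in heritability: variance explained by the random effects,
# divided by the total variance of the response.
genetic_var = np.var(X @ beta)
h2_empirical = genetic_var / np.var(y)
print(f"empirical heritability: {h2_empirical:.3f}")
```

With only 5 of 200 effects active, the genetic variance is small relative to the noise, so the ratio lands well below 1 — the point of sparsity-aware estimators is to remain accurate in exactly this regime.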

Abstract. We establish that for q ≥ 1, the class of convex combinations of q translates of a smooth probability density has local doubling dimension proportional to q. The key difficulty in the proof is to control the local geometric structure of mixture classes. Our local geometry theorem yields a bound on the (bracketing) metric entropy of a class of…

Consider an i.i.d. sequence of random variables whose distribution f lies in one of a nested family of models Mq, q ≥ 1. We obtain a sharp characterization of the pathwise fluctuations of the generalized likelihood ratio statistic under entropy assumptions on the model classes Mq. Moreover, we develop a technique to obtain local entropy bounds from global…