Statistical machine learning algorithms deal with the problem of selecting an appropriate statistical model from a model space Θ based on a training set {x_i}_{i=1}^n ⊂ X or {(x_i, y_i)}_{i=1}^n ⊂ X × Y. In doing…

Sound lateralization can be induced by interaural intensity disparities (IIDs) or by interaural temporal disparities (ITDs). The purpose of this study was to determine whether IIDs and ITDs are…

We describe a new formalism for word morphology. Our model views word generation as a random walk on a trellis of units where each unit is a set of (short) strings. The model naturally incorporates…

We prove what appears to be the first concentration of measure result for hidden Markov processes. Our bound is stated in terms of the contraction coefficients of the underlying Markov process, and…

We prove a strong law of large numbers for a class of strongly mixing processes. Our result rests on recent advances in understanding of concentration of measure. It is simple to apply and gives…

We derive sufficient conditions for a family (S_n, ρ_n, P_n) of metric probability spaces to have the measure concentration property. Specifically, if the sequence {P_n} of probability measures satisfies…
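For background (this is the classical Lévy concentration function, not necessarily the exact condition used in the paper), a family (S_n, ρ_n, P_n) is commonly said to concentrate when

    α_n(r) = sup { 1 − P_n(A^r) : A ⊂ S_n, P_n(A) ≥ 1/2 } → 0  for each fixed r > 0,

where A^r = { x ∈ S_n : ρ_n(x, A) < r } is the open r-neighborhood of A. Gaussian (normal) concentration additionally requires a bound of the form α_n(r) ≤ C exp(−c n r²).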

Over the past decade there has been a flurry of new concentration of measure inequalities; we refer the reader to [4] for an in-depth survey, or [2, 3, 5] for some more recent advances. In [2] the…
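A prototypical example of such an inequality is McDiarmid's bounded-differences bound, stated here for reference (a standard result, not specific to the works cited above): if X_1, …, X_n are independent and f : X^n → R satisfies |f(x) − f(x')| ≤ c_i whenever x and x' differ only in the i-th coordinate, then for all t > 0

    P( |f(X_1, …, X_n) − E f| ≥ t ) ≤ 2 exp( −2 t² / Σ_{i=1}^n c_i² ).

Much of the recent work extends bounds of this type from independent variables to weakly dependent (mixing) processes.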

We prove an apparently novel concentration of measure result for Markov tree processes. The bound we derive reduces to the known bounds for Markov processes when the tree is a chain, thus strictly…

The rate at which dependencies between future and past observations decay in a random process may be quantified in terms of mixing coefficients. The latter in turn appear in strong laws of large…
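As one standard example of such a coefficient (the paper's specific choice of mixing coefficients may differ), the φ-mixing coefficient of a process (X_i) is

    φ(k) = sup { |P(B | A) − P(B)| : A ∈ σ(X_1, …, X_j), P(A) > 0, B ∈ σ(X_{j+k}, X_{j+k+1}, …), j ≥ 1 },

and the process is called φ-mixing when φ(k) → 0 as k → ∞; slower decay of φ(k) translates into weaker concentration and slower convergence rates in the corresponding laws of large numbers.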
