In the past few years we have seen a surge in the theory of finite Markov chains, by way of new techniques for bounding the convergence to stationarity. These include functional techniques such as logarithmic Sobolev and Nash inequalities, refined spectral and entropy techniques, and isoperimetric techniques such as the average and blocking conductance and…

We examine the mixing properties of several Markov chains that sample configurations of a hard-core model. The model is familiar from the statistical physics of the liquid state and consists of a set of n nonoverlapping particles, modeled as balls of radius r* in a d-dimensional hypercube. Starting from an initial configuration, standard Markov chain Monte…
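The single-ball dynamics alluded to above can be sketched as a standard Metropolis-style move: pick a ball, propose a fresh uniform position, and accept only if no overlap results. This is a minimal 2-D illustration, not the paper's exact chains; the radius, box size, and starting configuration are arbitrary choices.

```python
import math
import random

def overlaps(p, q, r):
    """Balls of radius r centered at p and q overlap iff the centers are within 2r."""
    return math.dist(p, q) < 2 * r

def step(config, r, rng):
    """One move of a single-ball chain for the hard-sphere model:
    pick a ball uniformly, propose a uniform new center in the box
    (kept at distance r from the walls), accept iff no overlap results."""
    i = rng.randrange(len(config))
    new = tuple(r + (1 - 2 * r) * rng.random() for _ in range(2))
    if all(not overlaps(new, config[j], r)
           for j in range(len(config)) if j != i):
        config = config[:i] + [new] + config[i + 1:]
    return config

rng = random.Random(0)
# a valid starting configuration of 3 balls of radius r = 0.05
config = [(0.2, 0.2), (0.5, 0.5), (0.8, 0.8)]
for _ in range(2000):
    config = step(config, 0.05, rng)
```

Every accepted move preserves the hard-core constraint, so each visited configuration is a valid state of the model.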

We show a Birthday Paradox for self-intersections of Markov chains with uniform stationary distribution. As an application, we analyze Pollard's Rho algorithm for finding the discrete logarithm in a cyclic group G and find that, if the partition in the algorithm is given by a random oracle, then with high probability a collision occurs in Θ(√|G|) steps.…

The notion of conductance introduced by Jerrum and Sinclair [8] has been widely used to prove rapid mixing of Markov chains. Here we introduce a bound that extends this in two directions. First, instead of measuring the conductance of the worst subset of states, we bound the mixing time by a formula that can be thought of as a weighted average of the…
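As a concrete illustration of the quantity being refined, the worst-case conductance Φ of a small chain can be computed by brute force over subsets. This toy sketch shows only the classical worst-subset definition, not the weighted-average bound that is the paper's contribution; the example chain (a lazy walk on a 4-cycle) is an arbitrary choice.

```python
import itertools
import numpy as np

def conductance(P, pi):
    """Worst-case conductance Phi = min over S with pi(S) <= 1/2 of
    Q(S, S^c) / pi(S), where Q(x, y) = pi(x) P(x, y), by brute force."""
    n = len(pi)
    Q = pi[:, None] * P          # ergodic flow matrix
    best = np.inf
    for k in range(1, n):
        for S in itertools.combinations(range(n), k):
            S = list(S)
            if pi[S].sum() <= 0.5:
                Sc = [i for i in range(n) if i not in S]
                best = min(best, Q[np.ix_(S, Sc)].sum() / pi[S].sum())
    return best

# lazy simple random walk on a 4-cycle; stationary distribution is uniform
n = 4
P = np.zeros((n, n))
for i in range(n):
    P[i, i] = 0.5
    P[i, (i - 1) % n] += 0.25
    P[i, (i + 1) % n] += 0.25
pi = np.ones(n) / n
phi = conductance(P, pi)
```

For this chain the bottleneck cut is a pair of adjacent states, giving Φ = 1/4.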

The discrete logarithm problem asks for the exponent x, given a generator g of a cyclic group G and an element h ∈ G such that g^x = h. We give the first rigorous proof that Pollard's Kangaroo method finds the discrete logarithm in expected time (3+o(1))√(b−a) for the worst value of x ∈ [a,b], and (2+o(1))√(b−a) when…
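A minimal sketch of the kangaroo method in Python, assuming a small prime modulus: a "tame" kangaroo walks from g^b and sets a trap at its endpoint, and a "wild" kangaroo walks from h and is caught if the two paths coalesce. The jump sizes and walk lengths here are illustrative choices, not the tuned parameters of the analysis, and the walk can occasionally miss the trap, in which case one retries with different jumps.

```python
def kangaroo(g, h, p, a, b):
    """Pollard's kangaroo method: solve g^x = h (mod p) given x in [a, b]."""
    jumps = [1, 2, 4, 8]                      # small power-of-two jump sizes
    jump = lambda y: jumps[y % len(jumps)]    # deterministic pseudorandom jump
    # tame walk from g^b, recording the total distance travelled
    y, d = pow(g, b, p), 0
    for _ in range(40):                       # roughly 4*sqrt(b - a) jumps
        s = jump(y); d += s; y = (y * pow(g, s, p)) % p
    trap, trap_dist = y, d
    # wild walk from h = g^x chases the trap
    y, d = h % p, 0
    while d <= (b - a) + trap_dist:
        if y == trap:
            return b + trap_dist - d          # distances match: x recovered
        s = jump(y); d += s; y = (y * pow(g, s, p)) % p
    return None                               # missed the trap; retry in practice

# example in the order-113 subgroup of Z_227^* generated by g = 4
p, g = 227, 4
h = pow(g, 70, p)                             # secret exponent x = 70
result = kangaroo(g, h, p, 0, 112)
```

If the paths coalesce, the returned exponent satisfies g^result = h; the expected-time bounds quoted above concern exactly how long this coalescence takes.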

We analyze a fairly standard idealization of Pollard's rho algorithm for finding the discrete logarithm in a cyclic group G. It is found that, with high probability, a collision occurs in O(√(|G| log|G| log log|G|)) steps, not far from the widely conjectured value of Θ(√|G|). This improves upon a recent result of Miller–Venkatesan which showed an…
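The rho walk being idealized can be sketched with the classic three-way partition and Floyd cycle detection. Here the ad-hoc partition y mod 3 stands in for the random function of the analysis, and the small subgroup (order 113 in Z_227^*) is an arbitrary choice for illustration.

```python
def pollard_rho_dlog(g, h, p, q):
    """Pollard's rho for g^x = h in the order-q subgroup of Z_p^*.
    Tracks y = g^a * h^b; a collision of y-values yields a linear
    congruence for x modulo q."""
    def step(y, a, b):
        r = y % 3                              # crude stand-in for a random partition
        if r == 0:
            return (y * g) % p, (a + 1) % q, b
        if r == 1:
            return (y * y) % p, (2 * a) % q, (2 * b) % q
        return (y * h) % p, a, (b + 1) % q
    for a0 in range(1, 20):                    # retry from fresh starts if degenerate
        x = X = (pow(g, a0, p), a0, 0)
        while True:                            # Floyd: tortoise x, hare X
            x = step(*x)
            X = step(*step(*X))
            if x[0] == X[0]:
                break
        db = (x[2] - X[2]) % q
        if db:                                 # g^a1 h^b1 = g^a2 h^b2 => solve for x
            return ((X[1] - x[1]) * pow(db, -1, q)) % q
    return None

p, q, g = 227, 113, 4                          # g has order 113 modulo 227
x = pollard_rho_dlog(g, pow(g, 57, p), p, q)
```

The number of steps until the tortoise and hare collide is exactly the self-intersection time that the birthday-paradox analysis bounds.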

We show how to bound the mixing time and log-Sobolev constants of Markov chains by bounding the edge-isoperimetry of their underlying graphs. To do this we use two recent techniques, one involving Average Conductance and the other log-Sobolev constants. We show a sort of strong conductance bound on a family of geometric Markov chains, give improved bounds…

We show bounds on total variation and L^∞ mixing times, spectral gap and magnitudes of the complex-valued eigenvalues of a general (non-reversible, non-lazy) Markov chain with a minor expansion property. This leads to the first known bounds for the non-lazy simple and max-degree walks on a (directed) graph, and even in the lazy case they are the first bounds…

This 2002 thesis is concerned with isoperimetric methods for studying the rate at which Markov chains approach their steady state distribution. We begin by proving a new isoperimetric bound on the mixing time using a quantity which we call blocking conductance φ(x), an extension of conductance Φ and average conductance Φ(x). We then look at the…

We show a strict hierarchy among various edge and vertex expansion properties of Markov chains. This gives easy proofs of a range of bounds, both classical and new, on chi-square distance, spectral gap and mixing time. The 2-gradient is then used to give an isoperimetric proof that a random walk on the grid [k]^n mixes in time O*(k^2 n).
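The Θ(1/k²) per-coordinate scaling behind a bound like O*(k² n) can be observed numerically on a single coordinate. This is a sketch using the lazy nearest-neighbour walk on a path of k states (an assumption standing in for one coordinate of the grid walk); for the coordinate-wise product walk on [k]^n the spectral gap is this single-coordinate gap divided by n, consistent with a relaxation time of order k² n.

```python
import numpy as np

def lazy_path_walk(k):
    """Lazy nearest-neighbour walk on the path {0, ..., k-1}:
    hold with probability 1/2, else step to a uniform neighbour,
    with the missing mass held at the two endpoints."""
    P = 0.5 * np.eye(k)
    for i in range(k - 1):
        P[i, i + 1] = P[i + 1, i] = 0.25
    P[0, 0] += 0.25              # reflected mass at the boundary
    P[k - 1, k - 1] += 0.25
    return P

def spectral_gap(P):
    """1 minus the second-largest eigenvalue of a symmetric transition matrix."""
    ev = np.sort(np.linalg.eigvalsh(P))
    return 1.0 - ev[-2]

# doubling k should divide the gap by roughly 4, i.e. gap = Theta(1/k^2)
g8 = spectral_gap(lazy_path_walk(8))
g16 = spectral_gap(lazy_path_walk(16))
```

The ratio g16/g8 lands near 1/4, the signature of the quadratic dependence on k.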