We describe estimators χ_n(X_0, X_1, …, X_n) which, when applied to an unknown stationary process taking values in a countable alphabet X, converge almost surely to k if the process is a k-th order Markov chain, and to infinity otherwise.

For a stationary stochastic process {X_n} with values in some set A, a finite word w ∈ A^K is called a memory word if the conditional probability of X_0 given the past is constant on the cylinder set defined by X_{−K}^{−1} = w. It is called a minimal memory word if no proper suffix of w is also a memory word. For example, in a K-step Markov process all words… (More)

Let {X_n} be a stationary and ergodic time series taking values in a finite or countably infinite set X, and let f(X) be a function of the process with finite second moment. Assume that the distribution of the process is otherwise unknown. We construct a sequence of stopping times λ_n along which we will be able to estimate the conditional expectation E(f… (More)

For a binary stationary time series, define σ_n to be the number of consecutive ones up to the first zero encountered after time n, and consider the problem of estimating the conditional distribution and conditional expectation of σ_n after one has observed the first n outputs. We present a sequence of stopping times and universal estimators for these… (More)
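The run-length variable σ_n has a direct computational reading. Below is a minimal sketch of computing it from a fully observed binary sequence; it assumes the convention that counting starts at time n + 1, which the abstract leaves ambiguous (the estimation problem in the paper, of course, concerns predicting σ_n from only the first n outputs, which this helper does not address):

```python
def sigma(x, n):
    """Number of consecutive ones appearing after time n in the binary
    sequence x, counted up to (not including) the first zero.

    Assumes counting starts at index n + 1; this convention is an
    assumption, not stated explicitly in the abstract.
    """
    count = 0
    for bit in x[n + 1:]:
        if bit == 0:
            break
        count += 1
    return count


# Example: after time 0 the sequence 0,1,1,1,0,1 shows a run of three ones.
print(sigma([0, 1, 1, 1, 0, 1], 0))  # 3
```

In the paper's setting x is not fully observed, so σ_n must be estimated from X_0, …, X_n alone; the helper above only fixes the target quantity being estimated.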