
Markov Chains

Also known as: Markov process, Markov chain 
A stochastic process such that the conditional probability distribution for a state at any future instant, given the present state, is unaffected by… 
National Institutes of Health
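
The definition above can be sketched in code. The following is a minimal illustration (with a hypothetical two-state weather chain, not drawn from any paper listed here) of the Markov property: the next state is sampled from a distribution that depends only on the current state, never on earlier history.

```python
import random

# Hypothetical transition structure: for each state, the possible next
# states and their probabilities. Rows sum to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state given ONLY the current state (Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n, seed=0):
    """Generate a trajectory of n transitions from the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

path = simulate("sunny", 10)
```

Note that `step` receives no history: conditioning on the present state alone is exactly the property the NIH definition describes.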

Papers overview

Semantic Scholar uses AI to extract papers important to this topic.
2012
Objectives: To describe associations between different summaries of adherence in the first year on antiretroviral therapy (ART… 
2009
Protein complexes are responsible for most vital biological processes within the cell. Understanding the machinery behind… 
2007
2007
In this paper we shall derive asymptotic expansions of the Green function and the transition probabilities of Markov additive (MA… 
Review
2002
In this survey we present a unified treatment of both singular and regular perturbations in finite Markov chains and decision… 
2001
  • M. Nicas
  • Corpus ID: 24770300
Turbulent eddy diffusion models are used to describe a continuous concentration gradient with distance from an in-room… 
1988
If p is a polynomial of degree at most n such that |p(x)| ≤ √(1 − x²) for −1 ≤ x ≤ 1, then for each k, max |p^(k)(x)| on [−1, 1] is… 
1974
To clarify the stochastic properties of the neuronal impulse sequences, we have proposed a measure of statistical dependency di(T… 
1973
Markov processes in special-relativistic position-velocity phase space are proved to have converging velocities as t…