Synonyms of Markov chain

Noun

1. Markov chain, Markoff chain, Markov process, Markoff process

usage: a Markov process for which the parameter is discrete time values

WordNet 3.0 Copyright © 2006 by Princeton University. All rights reserved.
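
The gloss above is terse, so here is a minimal editorial illustration (not part of the WordNet entry) of what "the parameter is discrete time values" means: the process advances in integer steps n = 0, 1, 2, ..., and the next state depends only on the current one. The states and transition probabilities below are invented for the example.

    import random

    # A minimal sketch of a discrete-time Markov chain. The "parameter"
    # is the step counter n; the next state depends only on the current
    # state (the Markov property). States and probabilities are
    # illustrative only.
    TRANSITIONS = {
        "sunny": [("sunny", 0.8), ("rainy", 0.2)],
        "rainy": [("sunny", 0.4), ("rainy", 0.6)],
    }

    def step(state):
        # Sample the next state from the current state's transition row.
        outcomes, weights = zip(*TRANSITIONS[state])
        return random.choices(outcomes, weights=weights)[0]

    state = "sunny"
    for n in range(5):  # discrete time parameter: n = 0, 1, 2, 3, 4
        print(n, state)
        state = step(state)

A continuous-time Markov process, by contrast, would let transitions occur at arbitrary real-valued times rather than at integer steps.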