Synonyms and Antonyms for Markoff chain

1. Markoff chain (n.)

a Markov process whose time parameter takes discrete values
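The definition above can be illustrated with a minimal sketch of a discrete-time Markov chain: at each discrete time step, the next state is sampled from a distribution that depends only on the current state. The two weather states and their transition probabilities here are illustrative assumptions, not part of the entry.

```python
import random

# Illustrative transition table: P(next state | current state).
# Each row's probabilities sum to 1.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Sample the next state from the current state's transition row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps discrete time steps; return the visited states."""
    random.seed(seed)
    state = start
    path = [state]
    for _ in range(n_steps):
        state = step(state)
        path.append(state)
    return path
```

Because the chain is memoryless, `step` needs only the current state, never the full history; that is exactly the Markov property restricted to discrete time values.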

Synonyms:

Markov chain