Synonyms and Antonyms for Markov chain

1. Markov chain (n.)

a Markov process in which the parameter takes discrete time values
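
To illustrate the definition, here is a minimal sketch of a discrete-time Markov chain in Python. The two-state weather model and its transition probabilities are hypothetical, chosen purely for illustration.

```python
import random

# Hypothetical two-state chain: transition probabilities between
# "sunny" and "rainy" are illustrative, not from the source.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Advance the chain by one discrete time step.

    The next state depends only on the current state (the Markov
    property), and time advances in whole steps (the discrete
    parameter the definition refers to).
    """
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

state = "sunny"
for t in range(5):
    state = step(state)
    print(t, state)
```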

Synonyms: