Synonyms and Antonyms for Markov chain

1. Markov chain (n.)

a Markov process for which the parameter takes discrete time values

Synonyms:

Markoff chain
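To make the definition concrete, here is a minimal Python sketch of a discrete-time Markov chain. The two-state weather model, the state names, and the transition probabilities are hypothetical illustrations, not part of the dictionary entry; the point is only that the process moves between states at discrete time steps and the next state depends solely on the current one.

```python
import random

# Hypothetical two-state chain: each state maps to a list of
# (next_state, probability) pairs forming a row of the transition matrix.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state: str) -> str:
    """Sample the next state from the current state's transition row."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start: str, n_steps: int) -> list[str]:
    """Walk the chain for n_steps discrete time values."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 10))
```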