Random Walks, Directed Cycles, and Markov Chains

Bibliographic Details
Published in: The American Mathematical Monthly, 2023-02, Vol. 130(2), pp. 127-144
Main Authors: Gingell, Kate, Mendivil, Franklin
Format: Article
Language:English
Summary: A Markov chain is a random process that iteratively travels around its state space, with each transition depending only on the current position and not on the past. When the state space is discrete, we can think of a Markov chain as a special type of random walk on a directed graph. Although a Markov chain normally never settles down but keeps moving around, it does usually have a well-defined limiting behavior in a statistical sense. A given finite directed graph can potentially support many different random walks or Markov chains, and each one could have one or more invariant (stationary) distributions. In this paper we explore the question of characterizing the set of all possible invariant distributions. The answer turns out to be quite simple and very natural, and involves the cycles on the graph.
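As a minimal illustration of the setup the abstract describes (this sketch is not code from the paper; the transition probabilities and function names are invented for demonstration): a Markov chain on a directed graph whose edges form a 3-cycle with self-loops, together with a power-iteration approximation of an invariant distribution pi satisfying pi P = pi.

```python
# Illustrative sketch (not from the paper): a 3-state Markov chain whose
# directed graph is the cycle 0 -> 1 -> 2 -> 0 with a self-loop at each state.
# P[i][j] is the probability of stepping from state i to state j; these
# probabilities are made up for demonstration.
P = [
    [0.5, 0.5, 0.0],
    [0.0, 0.5, 0.5],
    [0.5, 0.0, 0.5],
]

def stationary(P, iters=10_000):
    """Approximate an invariant distribution by power iteration:
    repeatedly apply pi <- pi P starting from the uniform distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)
# This particular P is doubly stochastic (rows and columns each sum to 1),
# so the invariant distribution is uniform: pi ≈ [1/3, 1/3, 1/3].
```

Changing the edge weights on the same directed graph generally changes the invariant distribution, which is the phenomenon the paper characterizes in terms of the graph's cycles.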
ISSN: 0002-9890, 1930-0972
DOI: 10.1080/00029890.2022.2144088