Summary
A Markov process is a random process in which the future is independent of the past, given the present. Thus, Markov processes are the natural stochastic analogs of the deterministic processes described by differential and difference equations. They form one of the most important classes of random processes.
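The Markov property described above — the next state depends only on the current state, not on earlier history — can be sketched with a minimal simulation. The two-state transition matrix `P` below is a hypothetical example, not one of the chains studied in this chapter.

```python
import random

# Hypothetical transition matrix: P[i][j] is the probability of
# moving from state i to state j in one step.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """One step of the chain. Note that the next state is drawn using
    only the current state (the Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(start, n, seed=0):
    """Run the chain for n steps from the given start state."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate(0, 10))
```

Because each call to `step` receives only the current state, the simulated path automatically satisfies the defining independence property: the distribution of the future path depends on the history only through the present state.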
Basic Topics
- Introduction
- Recurrence and Transience
- Periodicity
- Invariant and Limiting Distributions
- Time Reversal
Special Models
- The Ehrenfest Chains
- The Bernoulli-Laplace Chain
- Reliability Chains
- The Branching Chain
- Queuing Chains
- Birth-Death Chains
- Random Walks on Graphs
External Resources
The study of Markov chains is a core topic in every textbook on stochastic processes.
Quote
- When in disgrace with Fortune and men's eyes
  I all alone beweep my outcast state ...
  --Shakespeare, Sonnet 29