Markov Chain

Uncategorized

  1. Equations for absorption probabilities
    • unique solution of the system $a_s = 1$ for the target absorbing state $s$, $a_i = 0$ for every other absorbing state, and $a_i = \sum_{j=1}^{M} p_{ij} a_j$ for transient states $i$ (where $p_{ij}$ is the one-step transition probability from $i$ to $j$)
  2. Equations for the expected time to absorption
    • $\mu_i = 1 + \sum_{j=1}^{M} p_{ij}\mu_{j}$ for transient states and $\mu_i = 0$ for absorbing states
  3. Gambler’s ruin problem
  4. Dice question
    • which comes first: a single roll summing to 12, or two consecutive rolls summing to 7?
  5. Coin triplets
  6. Color balls (very difficult)
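The two systems of equations in items 1–2 can be checked numerically. A minimal sketch, using the fair gambler's-ruin chain on states $0,\dots,N$ (both endpoints absorbing) as the worked example; the function names are mine, and exact `Fraction` arithmetic is used so the closed forms $a_i = i/N$ and $\mu_i = i(N-i)$ can be verified exactly:

```python
from fractions import Fraction

def solve(A, b):
    """Gaussian elimination with exact fractions; returns x with A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

def gamblers_ruin(N):
    """Fair gambler's ruin on 0..N (0 and N absorbing).

    Returns (a, mu) indexed by transient state i = 1..N-1:
      a[i-1]  = P(absorbed at N | start at i)   from a_i = sum_j p_ij a_j
      mu[i-1] = E[time to absorption | start i] from mu_i = 1 + sum_j p_ij mu_j
    Boundary values a_0 = 0, a_N = 1, mu_0 = mu_N = 0 are folded into
    the right-hand sides.
    """
    half = Fraction(1, 2)
    n = N - 1                       # number of transient states
    A = [[Fraction(0)] * n for _ in range(n)]
    bA = [Fraction(0)] * n          # RHS for absorption probabilities
    bM = [Fraction(1)] * n          # RHS for expected absorption times
    for k in range(n):
        i = k + 1
        A[k][k] = Fraction(1)
        if i - 1 >= 1:
            A[k][k - 1] = -half     # term (1/2) a_{i-1}
        if i + 1 <= N - 1:
            A[k][k + 1] = -half     # term (1/2) a_{i+1}
        else:
            bA[k] += half           # a_N = 1 contributes 1/2 to the RHS
    return solve(A, bA), solve(A, bM)
```

For $N = 10$ this recovers $a_i = i/10$ and $\mu_i = i(10-i)$, e.g. $a_5 = 1/2$ and $\mu_5 = 25$.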

Martingale and Random Walk

  1. Wald’s Equality
    • $E[X_1 + \cdots + X_N] = E[N]\,E[X_1]$ for i.i.d. $X_i$ and a stopping time $N$ with $E[N] < \infty$; more generally, a martingale stopped at a stopping time is a martingale
  2. Drunk man
  3. Dice game revisit
  4. Ticket line
    • ticket price of \$5; $2n$ people in line, $n$ holding \$5 bills and $n$ holding \$10 bills
  5. Coin sequence
    • $n$ heads in a row (typing-monkey problem)
    • We generalize the problem to a biased coin with probability $1/6$ of heads (rolling a six on a die) and $5/6$ of tails (any other face): what is the expected number of tosses until a given sequence, say $HHTTHHHTT$, first appears?
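The generalized coin-sequence question yields to the stopped-martingale ("team of gamblers" / ABRACADABRA) argument: the expected waiting time is the sum of $1/P(\text{prefix of length } k)$ over every $k$ at which the pattern's length-$k$ suffix equals its length-$k$ prefix. A sketch under that formula (the function name and probability-map convention are mine):

```python
from fractions import Fraction

def expected_tosses(pattern, p):
    """Expected number of tosses until `pattern` first appears.

    `p` maps each symbol to its per-toss probability (Fractions for
    exact results). Martingale argument: sum 1/P(prefix of length k)
    over every k where the length-k suffix of the pattern equals its
    length-k prefix (the full pattern, k = n, always counts).
    """
    n = len(pattern)
    total = Fraction(0)
    for k in range(1, n + 1):
        if pattern[:k] == pattern[n - k:]:      # self-overlap of length k
            prob = Fraction(1)
            for c in pattern[:k]:
                prob *= p[c]
            total += 1 / prob
    return total
```

Sanity checks against classic values: with a fair coin, $HH$ takes $2 + 4 = 6$ expected tosses and $HT$ takes $4$. For $HHTTHHHTT$ with $P(H) = 1/6$, the only self-overlaps are $k = 4$ ($HHTT$) and $k = 9$, giving $6^4/5^2 + 6^9/5^4 = 10110096/625 \approx 16176.15$.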

    </summary> </details>