Advanced topics: continuous-time MC, compound Poisson, SDEs, branching processes, and limit theorems
A continuous-time Markov chain on {0, 1} has generator matrix:
(1) Find the transition probability matrix P(t) = e^(Qt).
(2) Find the stationary distribution.
(3) If X(0) = 0, find P(X(1) = 1).
Work from the generator or holding-time structure and solve the forward or backward equations for transition probabilities.
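The two-layer structure can be checked numerically. The problem's generator matrix is not reproduced above, so the sketch below assumes illustrative rates a = 1 (for 0 → 1) and b = 2 (for 1 → 0); the closed form P₀₁(t) = a/(a+b)·(1 − e^(−(a+b)t)) and the stationary distribution π = (b, a)/(a+b) hold for any two-state generator of this shape.

```python
import numpy as np
from scipy.linalg import expm

# Assumed illustrative rates (the problem's actual Q is not shown here):
# Q = [[-a, a], [b, -b]] with a = rate 0 -> 1, b = rate 1 -> 0.
a, b = 1.0, 2.0
Q = np.array([[-a, a], [b, -b]])

def P(t):
    """Transition matrix P(t) = exp(Qt) via the matrix exponential."""
    return expm(Q * t)

# Closed form for a two-state chain: P_01(t) = a/(a+b) * (1 - exp(-(a+b) t)).
t = 1.0
p01_closed = a / (a + b) * (1 - np.exp(-(a + b) * t))
print(P(t)[0, 1], p01_closed)    # the two values agree

# Stationary distribution solves pi Q = 0:  pi = (b, a) / (a + b).
pi = np.array([b, a]) / (a + b)
print(pi)                        # [2/3, 1/3] for these rates
```

With X(0) = 0, part (3) is just the entry P(1)[0, 1] above.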
Claims arrive at an insurance company according to a Poisson process with rate λ = 10 per day. Each claim amount is uniformly distributed on [0, 1000] dollars, independent of arrival times and other claims.
(1) What is the expected total claim amount in one day?
(2) Find the variance of the total claim amount in one day.
(3) What is the probability that total claims exceed 6000 dollars in one day?
Decompose the model into a Poisson count and i.i.d. jump sizes so moments and distributions can be built in two layers.
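The two layers give E[S] = λ·E[X] and Var(S) = λ·E[X²] for a compound Poisson sum S. A short Monte Carlo sketch (simulation parameters are my own choice, not part of the problem) checks both moments and estimates the tail probability in part (3):

```python
import numpy as np

rng = np.random.default_rng(0)
lam, lo, hi = 10.0, 0.0, 1000.0           # claims/day, uniform claim range

# Compound Poisson moments for S = X_1 + ... + X_N, N ~ Poisson(lam):
# E[S] = lam * E[X],  Var(S) = lam * E[X^2].
EX  = (lo + hi) / 2                        # 500
EX2 = (hi**2 + hi * lo + lo**2) / 3        # E[X^2] = 1000^2 / 3 for Uniform(0, 1000)
mean_S = lam * EX                          # 5000
var_S  = lam * EX2                         # 1e7 / 3, about 3.33e6

# Monte Carlo check of the moments and of P(S > 6000).
n_sims = 100_000
N = rng.poisson(lam, size=n_sims)
S = np.array([rng.uniform(lo, hi, size=n).sum() for n in N])
print(S.mean(), mean_S)
print(S.var(), var_S)
print((S > 6000).mean())                   # tail probability P(S > 6000)
```

A normal approximation with these moments gives a z-score of (6000 − 5000)/√(10⁷/3) ≈ 0.55 for the tail, which the simulation should roughly confirm.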
Consider the SDE: dX(t) = μX(t)dt + σX(t)dB(t) with X(0) = x₀.
(1) Use Ito's lemma to find the SDE for Y(t) = ln(X(t)).
(2) Solve for X(t) (geometric Brownian motion).
(3) Find E[X(t)] and Var(X(t)).
Apply Ito's formula carefully, keeping both the drift and quadratic-variation terms that ordinary calculus would miss.
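Ito's lemma on Y = ln X yields dY = (μ − σ²/2)dt + σ dB, hence the exact solution X(t) = x₀·exp((μ − σ²/2)t + σB(t)) with B(t) ~ N(0, t). The sketch below (parameter values x₀, μ, σ are my own illustrative choices) samples this exact solution and compares the empirical moments with E[X(t)] = x₀e^(μt) and Var(X(t)) = x₀²e^(2μt)(e^(σ²t) − 1):

```python
import numpy as np

rng = np.random.default_rng(1)
x0, mu, sigma, t = 1.0, 0.05, 0.2, 1.0    # illustrative parameters

# Exact GBM solution: X(t) = x0 * exp((mu - sigma^2/2) t + sigma B(t)),
# where B(t) ~ N(0, t); this is what Ito's lemma on Y = ln X delivers.
B_t = rng.normal(0.0, np.sqrt(t), size=1_000_000)
X_t = x0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * B_t)

mean_theory = x0 * np.exp(mu * t)                                   # x0 e^{mu t}
var_theory  = x0**2 * np.exp(2 * mu * t) * (np.exp(sigma**2 * t) - 1)
print(X_t.mean(), mean_theory)
print(X_t.var(), var_theory)
```

Note the drift of ln X is μ − σ²/2, not μ: the −σ²/2 term is exactly the quadratic-variation correction that ordinary calculus misses.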
A gambler starts with $5. Each round, they win $1 with probability p = 0.48 or lose $1 with probability q = 0.52.
They play until reaching $0 (ruin) or $10 (target).
(1) Find the probability of ruin starting from $5.
(2) Find the expected number of rounds until the game ends.
Use the standard ruin recursion or closed-form formula, then distinguish clearly between ruin probability and expected duration.
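With r = q/p, the closed forms are P(reach N before 0 | start i) = (1 − rⁱ)/(1 − rᴺ) and E[T] = i/(q − p) − (N/(q − p))·P(reach N first). A sketch evaluating both for this problem, with an added Monte Carlo sanity check (the simulation is my own addition, not part of the problem):

```python
import numpy as np

p, q = 0.48, 0.52
i, N = 5, 10
r = q / p                                  # ratio in the asymmetric closed forms

# P(reach N before 0 | start i) = (1 - r^i) / (1 - r^N), ruin is the complement.
p_win = (1 - r**i) / (1 - r**N)
p_ruin = 1 - p_win                         # about 0.599

# Expected duration for p != q: E[T] = i/(q - p) - (N/(q - p)) * p_win.
ET = i / (q - p) - (N / (q - p)) * p_win   # about 24.7 rounds
print(p_ruin, ET)

# Monte Carlo sanity check.
rng = np.random.default_rng(2)
n_sims = 20_000
ruins, steps = 0, 0
for _ in range(n_sims):
    x, t = i, 0
    while 0 < x < N:
        x += 1 if rng.random() < p else -1
        t += 1
    ruins += (x == 0)
    steps += t
print(ruins / n_sims, steps / n_sims)
```

The two quantities answer different questions: p_ruin is where the game ends, ET is how long it takes; neither determines the other.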
In a Galton-Watson branching process, each individual produces k offspring with probability pₖ:
(1) Find the mean offspring m.
(2) Will the population eventually go extinct?
(3) If extinction occurs, find the extinction probability starting with Z₀ = 1.
Compute the offspring mean first, then solve the generating-function fixed-point equation for extinction probability.
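The fixed-point recipe is easy to automate. The problem's offspring probabilities pₖ are not reproduced above, so the sketch below assumes an illustrative distribution p₀ = 0.25, p₁ = 0.25, p₂ = 0.5; it computes m = Σ k·pₖ and iterates qₙ₊₁ = φ(qₙ) from q₀ = 0, which converges to the smallest fixed point of the generating function in [0, 1]:

```python
# Assumed illustrative offspring distribution (the problem's p_k are not shown):
p = {0: 0.25, 1: 0.25, 2: 0.5}

m = sum(k * pk for k, pk in p.items())     # offspring mean = 1.25 > 1 (supercritical)

def phi(s):
    """Offspring generating function phi(s) = sum_k p_k s^k."""
    return sum(pk * s**k for k, pk in p.items())

# Extinction probability = smallest root of phi(s) = s in [0, 1];
# iterate q <- phi(q) from q = 0 (monotone convergence to that root).
q = 0.0
for _ in range(200):
    q = phi(q)
print(m, q)                                # m = 1.25, q = 0.5 for this distribution
```

Since m > 1 here, extinction is not certain, and the extinction probability starting from Z₀ = 1 is the sub-unit root q = 0.5; if instead m ≤ 1 the iteration converges to 1 (extinction is certain, barring the trivial p₁ = 1 case).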
Let {Xₙ, n ≥ 0} be a stationary stochastic process with mean μ and autocovariance function γ(h) = Cov(Xₙ, Xₙ₊ₕ).
(1) Show that γ(h) = γ(-h).
(2) If γ(h) = σ²ρ^|h| for |ρ| < 1, find the variance of the sample mean X̄ₙ = (X₁ + ⋯ + Xₙ)/n.
(3) Under what conditions does X̄ₙ → μ as n → ∞?
Separate the definitions of stationarity and ergodicity, then test whether time averages and ensemble averages align.
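For part (2), expanding Cov(X̄ₙ, X̄ₙ) gives Var(X̄ₙ) = (1/n²)Σⱼ,ₖ γ(j − k) = (1/n²)Σₕ (n − |h|)γ(h), summing h from −(n − 1) to n − 1. A short sketch (σ², ρ, n are my own illustrative values) verifies the two forms agree:

```python
sigma2, rho, n = 1.0, 0.6, 50              # illustrative values

def gamma(h):
    """Autocovariance gamma(h) = sigma^2 * rho^|h|; even in h, so gamma(h) = gamma(-h)."""
    return sigma2 * rho ** abs(h)

# Var(X_bar_n) = (1/n^2) sum_{j,k} gamma(j - k)
#              = (1/n^2) sum_{h=-(n-1)}^{n-1} (n - |h|) gamma(h)
brute    = sum(gamma(j - k) for j in range(n) for k in range(n)) / n**2
weighted = sum((n - abs(h)) * gamma(h) for h in range(-(n - 1), n)) / n**2
print(brute, weighted)                     # the two forms are identical
```

Because Σₕ |γ(h)| < ∞ here, Var(X̄ₙ) → 0 and X̄ₙ → μ in mean square, which is the mean-ergodicity condition part (3) is after.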
Let B(t) be standard Brownian motion. Define the stopping time T = inf{t: B(t) = 2}.
(1) Show that E[B(T)] = 2.
(2) Find E[T].
(3) Is E[B(T)²] = E[T]?
Use the definition of the stopping time together with careful optional-stopping conditions so you do not confuse path identities with expected values.
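A sketch of the three parts, using continuity of paths and the reflection principle:

```latex
\[
B(T) = 2 \text{ almost surely (continuity of paths)}
\;\Rightarrow\; \mathbb{E}[B(T)] = 2,
\]
\[
\mathbb{P}(T > t) = \mathbb{P}\Big(\max_{s \le t} B(s) < 2\Big)
                  = 2\,\Phi\!\big(2/\sqrt{t}\big) - 1
                  \sim \frac{4}{\sqrt{2\pi t}} \quad (t \to \infty),
\]
\[
\mathbb{E}[T] = \int_0^\infty \mathbb{P}(T > t)\,dt = \infty
\quad \text{(the tail decays only like } t^{-1/2}\text{)},
\]
\[
\mathbb{E}[B(T)^2] = 4 \;\neq\; \mathbb{E}[T] = \infty.
\]
```

So part (1) is a pathwise identity, not an optional-stopping statement, and part (3) fails: Wald's second identity E[B(T)²] = E[T] requires E[T] < ∞, which the heavy-tailed hitting time violates.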
Consider an irreducible, aperiodic Markov chain with stationary distribution π.
(1) State the ergodic theorem for Markov chains.
(2) If f is a function on the state space, what does the time average (f(X₁) + ⋯ + f(Xₙ))/n converge to?
(3) Describe the Central Limit Theorem for Markov chains.
Check the chain assumptions behind convergence, then connect transition powers or ergodic averages to the limiting distribution.
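The ergodic theorem says the time average of f converges almost surely to the ensemble average Σᵢ πᵢ f(i). A sketch with an assumed illustrative 3-state chain (the chain and the function f below are my own choices, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed illustrative chain: 3 states, irreducible and aperiodic.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])
f = np.array([1.0, 4.0, 9.0])        # an arbitrary function on the states

# Stationary distribution: normalized left eigenvector for eigenvalue 1.
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

# Ergodic theorem: (1/n) sum_{k=1}^n f(X_k) -> sum_i pi_i f(i) almost surely.
n = 50_000
x, total = 0, 0.0
for _ in range(n):
    x = rng.choice(3, p=P[x])
    total += f[x]
print(total / n, pi @ f)             # the two numbers should be close
```

The Markov-chain CLT then refines this: √n·((1/n)Σ f(Xₖ) − π·f) converges to a normal limit whose asymptotic variance also accounts for the autocovariances along the chain, not just Var_π(f).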