
Stochastic Process Fundamentals

Master the core concepts: definitions, classification, sample functions, and numerical characteristics

Definitions & Structure - The Stochastic Process

Let $(\Omega, \mathcal{F}, P)$ be a probability space and $T$ be a parameter set (usually time). A stochastic process is a collection of random variables:

$$\{X(t, \omega) : t \in T,\ \omega \in \Omega\}$$

It can be viewed in two ways:

  • Fixed $t$: $X(t, \cdot)$ is a random variable on $\Omega$.
  • Fixed $\omega$: $X(\cdot, \omega)$ is a deterministic function of $t$, called a sample function or trajectory (see the sketch below).
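
A minimal NumPy sketch of these two viewpoints, using a simple symmetric random walk purely as an illustration (the process, seed, and grid sizes are assumptions, not taken from the text): each row of the array is one trajectory (fixed $\omega$), each column is one random variable (fixed $t$).

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Illustrative process: a symmetric random walk X_n = Y_1 + ... + Y_n
# with i.i.d. steps Y_i in {-1, +1}.
n_steps = 50       # time index set T = {1, ..., 50}
n_outcomes = 5     # number of outcomes omega we draw

steps = rng.choice([-1, 1], size=(n_outcomes, n_steps))
paths = steps.cumsum(axis=1)   # paths[i, j] = X(t_j, omega_i)

# Fixed omega (one row): a deterministic function of t -- a sample function.
print("trajectory of the first outcome:", paths[0])

# Fixed t (one column): a random variable on Omega -- one value per outcome.
print("X(10, .) across outcomes:", paths[:, 9])
```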
Classification of Processes

| Type | Time $T$ | State space $S$ | Example |
| --- | --- | --- | --- |
| Discrete time, discrete state | Discrete | Discrete | Bernoulli process, random walk |
| Continuous time, discrete state | Continuous | Discrete | Poisson process |
| Continuous time, continuous state | Continuous | Continuous | Brownian motion |
Numerical Characteristics - Mean Function
$$\mu_X(t) = E[X(t)]$$

Represents the average behavior of the process at time $t$.

Variance Function
$$\sigma_X^2(t) = \text{Var}(X(t)) = E\big[(X(t) - \mu_X(t))^2\big]$$

Measures the spread around the mean at time $t$.

Autocorrelation Function
$$R_X(t_1, t_2) = E[X(t_1)X(t_2)]$$

Describes the linear relationship between the values of the process at two different times.

Autocovariance Function
$$C_X(t_1, t_2) = E\big[(X(t_1) - \mu_X(t_1))(X(t_2) - \mu_X(t_2))\big]$$

Relationship: $C_X(t_1, t_2) = R_X(t_1, t_2) - \mu_X(t_1)\,\mu_X(t_2)$
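
These characteristics can be estimated by averaging over many simulated trajectories. A minimal sketch, again using the symmetric random walk as an illustrative process (the times, seed, and sample sizes are arbitrary choices), which also checks the relationship $C_X = R_X - \mu_X(t_1)\mu_X(t_2)$ numerically:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Illustrative process: symmetric random walk, many independent trajectories.
n_steps, n_paths = 30, 100_000
paths = rng.choice([-1, 1], size=(n_paths, n_steps)).cumsum(axis=1)

i1, i2 = 9, 19   # zero-based indices for the times t_1 = 10, t_2 = 20

mu1 = paths[:, i1].mean()                     # estimate of mu_X(t_1)
mu2 = paths[:, i2].mean()                     # estimate of mu_X(t_2)
R = (paths[:, i1] * paths[:, i2]).mean()      # estimate of R_X(t_1, t_2)
C = np.cov(paths[:, i1], paths[:, i2])[0, 1]  # estimate of C_X(t_1, t_2)

# C_X = R_X - mu_X(t_1) * mu_X(t_2), up to Monte Carlo error.
print(R - mu1 * mu2, "vs", C)
# For this walk C_X(t_1, t_2) = min(t_1, t_2) = 10, so both should be near 10.
```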

Stationarity - Strict-Sense Stationarity (SSS)

All statistical properties are invariant under time shifts. For any $\tau$, $n$, and $t_1, \dots, t_n$:

$$F_{X(t_1+\tau), \dots, X(t_n+\tau)}(x_1, \dots, x_n) = F_{X(t_1), \dots, X(t_n)}(x_1, \dots, x_n)$$
Wide-Sense Stationarity (WSS)

A process $X(t)$ is WSS if:

  1. $E[X(t)] = \mu$ (constant in $t$)
  2. $R_X(t_1, t_2) = R_X(t_1 - t_2)$ (depends only on the lag $t_1 - t_2$; see the remark below)
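
Remark (a standard fact, not stated explicitly above): a strictly stationary process with finite second moments ($E[X(t)^2] < \infty$) is automatically WSS, because shift invariance of the one- and two-dimensional distributions forces the first two moments to be shift invariant:

$$E[X(t)] = E[X(t+\tau)] \ \text{for all } \tau \quad\Rightarrow\quad E[X(t)] = \mu \ \text{(constant)},$$

$$E[X(t_1)X(t_2)] = E[X(t_1+\tau)X(t_2+\tau)] \ \text{for all } \tau \quad\Rightarrow\quad R_X(t_1, t_2) \ \text{depends only on } t_1 - t_2.$$

The converse fails in general; the FAQ below notes the Gaussian exception.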
Gaussian Processes - Definition & Properties

A process $\{X(t), t \in T\}$ is a Gaussian Process if for any finite set of times $t_1, \dots, t_n$, the random vector $(X(t_1), \dots, X(t_n))$ has a multivariate normal distribution.

Complete Characterization

A Gaussian process is completely determined by its mean function $\mu_X(t)$ and autocovariance function $C_X(t_1, t_2)$.
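
Because of this, sampling a Gaussian process on a finite grid $t_1, \dots, t_n$ reduces to sampling a multivariate normal with mean vector $(\mu_X(t_i))$ and covariance matrix $(C_X(t_i, t_j))$. A minimal sketch; the zero mean and squared-exponential covariance below are illustrative choices, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# Illustrative specification: zero mean and squared-exponential covariance
# C(t1, t2) = exp(-(t1 - t2)^2 / 2).
def mean_fn(t):
    return np.zeros_like(t)

def cov_fn(s, t):
    return np.exp(-0.5 * (s[:, None] - t[None, :]) ** 2)

t = np.linspace(0.0, 5.0, 50)              # finite grid t_1, ..., t_n
mu = mean_fn(t)                            # mean vector
K = cov_fn(t, t) + 1e-10 * np.eye(len(t))  # covariance matrix (+ jitter for stability)

samples = rng.multivariate_normal(mu, K, size=3)  # three sample functions on the grid
print(samples.shape)  # (3, 50)
```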

Fundamental Theorems - Kolmogorov Extension Theorem

For any consistent family of finite-dimensional distributions $F_{t_1, \dots, t_n}(x_1, \dots, x_n)$, there exists a probability space $(\Omega, \mathcal{F}, P)$ and a stochastic process $\{X(t)\}$ such that its finite-dimensional distributions are exactly the given family.

Significance: This guarantees that we can define a process simply by specifying its finite-dimensional distributions.

Worked Examples - Example 1: Random Phase Cosine Wave

Problem:

Let $X(t) = A \cos(\omega t + \Theta)$, where $A, \omega$ are constants and $\Theta \sim U(0, 2\pi)$. Find $\mu_X(t)$ and $R_X(t_1, t_2)$.

Solution:

Mean: $\mu_X(t) = E[A \cos(\omega t + \Theta)] = 0$ (by symmetry).

Autocorrelation: Using trigonometric identities, $R_X(t_1, t_2) = \frac{A^2}{2} \cos(\omega(t_1 - t_2))$.
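
Filling in the trigonometric step: with the product-to-sum identity $\cos a \cos b = \tfrac{1}{2}[\cos(a-b) + \cos(a+b)]$,

$$R_X(t_1, t_2) = A^2 E[\cos(\omega t_1 + \Theta)\cos(\omega t_2 + \Theta)] = \frac{A^2}{2}\, E\big[\cos(\omega(t_1 - t_2)) + \cos(\omega(t_1 + t_2) + 2\Theta)\big] = \frac{A^2}{2} \cos(\omega(t_1 - t_2)),$$

since $\cos(\omega(t_1 + t_2) + 2\Theta)$ averages to zero over the uniform phase.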

Key Insight: This process is Wide-Sense Stationary (WSS).

Example 2: Poisson Process Characteristics

Problem:

Let $\{N(t), t \geq 0\}$ be a Poisson process with rate $\lambda$. Find $\mu_N(t)$ and $C_N(t_1, t_2)$.

Solution:

Mean: $\mu_N(t) = E[N(t)] = \lambda t$.

Autocovariance: For $t_1 \le t_2$, independent increments give $C_N(t_1, t_2) = \lambda t_1$; hence, in general, $C_N(t_1, t_2) = \lambda \min(t_1, t_2)$.
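
Writing out the independent-increments step for $t_1 \le t_2$:

$$C_N(t_1, t_2) = \text{Cov}\big(N(t_1),\, N(t_1) + [N(t_2) - N(t_1)]\big) = \text{Var}(N(t_1)) + \text{Cov}\big(N(t_1),\, N(t_2) - N(t_1)\big) = \lambda t_1 + 0 = \lambda \min(t_1, t_2).$$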

Key Insight: The Poisson process is not stationary because its mean depends on time.
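
A quick Monte Carlo check of both formulas; the rate and the two times below are arbitrary illustrative values, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(seed=4)

# Illustrative parameters: rate lam = 2.0, times t1 = 1.5 and t2 = 3.0.
lam, t1, t2, n_paths = 2.0, 1.5, 3.0, 200_000

# Simulate (N(t1), N(t2)) jointly via independent increments:
# N(t1) ~ Poisson(lam*t1), and N(t2) = N(t1) + an independent Poisson(lam*(t2 - t1)).
N1 = rng.poisson(lam * t1, size=n_paths)
N2 = N1 + rng.poisson(lam * (t2 - t1), size=n_paths)

print("mean of N(t1):", N1.mean(), "  theory lam*t1:", lam * t1)
print("Cov(N(t1), N(t2)):", np.cov(N1, N2)[0, 1], "  theory lam*min(t1,t2):", lam * t1)
```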

Example 3: Simple Random Walk

Problem:

Let $X_n = \sum_{i=1}^n Y_i$, where the $Y_i$ are i.i.d. with $P(Y_i = 1) = p$ and $P(Y_i = -1) = q = 1 - p$. Find $\mu_X(n)$ and $\sigma_X^2(n)$.

Solution:

Mean: $\mu_X(n) = n(p - q)$, since $E[Y_i] = p - q$.

Variance: $\sigma_X^2(n) = 4npq$, since $\text{Var}(Y_i) = E[Y_i^2] - (E[Y_i])^2 = 1 - (p - q)^2 = 4pq$.
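
A quick simulation check of both formulas; the values of $p$ and $n$ below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

# Illustrative parameters: p = 0.6, n = 25 steps.
p, n, n_paths = 0.6, 25, 200_000
q = 1 - p

steps = rng.choice([1, -1], size=(n_paths, n), p=[p, q])
X_n = steps.sum(axis=1)

print("empirical mean:", X_n.mean(), "   theory n(p - q):", n * (p - q))
print("empirical variance:", X_n.var(), "   theory 4npq:", 4 * n * p * q)
```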

Practice Quiz

  1. What is the fundamental definition of a stochastic process?
  2. What is the difference between a random variable and a stochastic process?
  3. What are finite-dimensional distributions?
  4. What is the mean function $\mu_X(t)$ of a stochastic process?
  5. What is the autocorrelation function $R_X(t_1, t_2)$?
  6. What is the difference between strict-sense and wide-sense stationarity?
  7. What is a Gaussian process?
  8. What is a sample function (or trajectory)?
  9. What does the Kolmogorov Extension Theorem guarantee?
  10. What is an independent increment process?

Frequently Asked Questions

What is the difference between a random variable and a stochastic process?

A random variable maps outcomes to numbers (a single value per trial). A stochastic process maps outcomes to functions of time (a whole trajectory per trial). You can think of a stochastic process as a 'dynamic' random variable that evolves over time.

Why are finite-dimensional distributions so important?

According to the Kolmogorov Extension Theorem, the family of all finite-dimensional distributions completely characterizes the stochastic process (under mild consistency conditions). They contain all the statistical information about the process.

What is the difference between strict-sense and wide-sense stationarity?

Strict-sense stationarity requires all statistical properties (distributions) to be invariant under time shifts. Wide-sense (or weak) stationarity only requires the first two moments (mean and autocorrelation) to be invariant. Wide-sense is much easier to verify and is sufficient for many applications like signal processing.

Can a process be wide-sense stationary but not strict-sense?

Yes. For example, a process with a constant mean and autocorrelation but whose higher-order moments change over time would be wide-sense but not strict-sense stationary. However, for Gaussian processes, the two concepts are equivalent.

What is a sample function?

A sample function (or realization/trajectory) is the specific time function x(t) observed for a single outcome ω of the random experiment. Once the experiment is performed, the sample function is deterministic.
