
Stochastic Process Fundamentals

Master the core concepts of stochastic processes: definitions, classification, sample functions, finite-dimensional distributions, and numerical characteristics

6-10 hours · Intermediate to Advanced · 8 lessons
Learning Objectives
  • Understand the rigorous definition of stochastic processes and their mathematical structure
  • Master classification of processes by parameter sets and state spaces
  • Analyze sample functions and their properties in stochastic modeling
  • Apply finite-dimensional distribution theory for process characterization
  • Calculate and interpret numerical characteristics of stochastic processes
  • Understand Gaussian processes and their special properties

1. Stochastic Process Definition & Structure

Core Definition

Let $S$ be the sample space of a random experiment (containing all possible outcomes $\omega$), and let $T$ be a parameter set (usually representing time, where $T \subset \mathbb{R}$, such as $T = \{1,2,\ldots\}$ or $T = [0,+\infty)$).

Definition:

If for every $t \in T$ there is a random variable $X(t,\omega)$ defined on $S$, then the family of two-parameter functions

$$\{X(t,\omega);\ t \in T,\ \omega \in S\}$$

is called a stochastic process, denoted $\{X(t);\ t \in T\}$.

State & State Space

For fixed $t$, the value $X(t)$ is called the state at time $t$. The set of all possible states forms the state space $I$.

Examples:

  • Target hits: $I = \{0,1,2,\ldots,n\}$
  • Voltage signal: $I = [-a,a]$

Sample Functions

For fixed $\omega$ (one trial outcome), $X(t,\omega)$ is a deterministic function of $t$, called a sample function or realization, denoted $x(t) = X(t,\omega_0)$.

Physical Meaning:

One complete observation of the random process over time
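The two viewpoints (a random variable at fixed $t$, a deterministic sample function at fixed $\omega$) can be seen in a small simulation. The process below, $X(t,\omega) = A(\omega)\cos t$ with random amplitude $A$, is an invented toy example for illustration, not one from this lesson:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy process (an assumption, not from the lesson):
# X(t, omega) = A(omega) * cos(t), where the random amplitude A ~ N(0, 1)
# stands in for the outcome omega.
t_grid = np.linspace(0.0, 2 * np.pi, 100)    # discretized parameter set T
amplitudes = rng.normal(size=500)            # 500 outcomes omega from S

# Fixing omega gives a deterministic sample function x(t) = X(t, omega_0).
sample_function = amplitudes[0] * np.cos(t_grid)

# Fixing t gives a random variable X(t_0): one value per outcome omega.
t0 = t_grid[25]
state_at_t0 = amplitudes * np.cos(t0)

assert sample_function.shape == (100,)   # indexed by t
assert state_at_t0.shape == (500,)       # indexed by omega
```

Reading across a row (fixed $\omega$) gives a realization; reading down a column (fixed $t$) gives the state, a random variable.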

2. Classification of Stochastic Processes

Four-Category Classification
Based on parameter set $T$ (discrete/continuous) and state space $I$ (discrete/continuous):

| Category | Parameter set $T$ | State space $I$ | Example |
| --- | --- | --- | --- |
| 1. Discrete time, discrete state | At most countable (e.g., $\{1,2,\ldots\}$) | At most countable | Binomial process $S_n$ (hits in first $n$ trials) |
| 2. Discrete time, continuous state | At most countable | Interval (e.g., $\mathbb{R}$) | Temperature sequence $X_n$ (daily temperature, $I = [-30,40]$) |
| 3. Continuous time, discrete state | Real interval (e.g., $T = [0,+\infty)$) | At most countable | Claim count process $N(t)$ (claims in $(0,t]$, $I = \{0,1,2,\ldots\}$) |
| 4. Continuous time, continuous state | Real interval | Interval | Random phase cosine $X(t) = a\cos(\omega t + \Theta)$, voltage fluctuation |
Classic Examples with Analysis

Example 1: Binomial Process

Independent target shooting with hit probability $p$. Let $S_n$ denote the total number of hits in the first $n$ trials.

Parameter set: $T = \{1,2,\ldots\}$ (discrete time)

State space: $I = \{0,1,\ldots,n\}$ (discrete state)

Sample functions: non-decreasing step functions

Property: $S_n - S_{n-1} \in \{0,1\}$ (at most one hit per trial)
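A minimal simulation of one sample path of $S_n$; the parameters $p = 0.3$ and $n = 20$ are illustrative choices, not values from the lesson:

```python
import numpy as np

rng = np.random.default_rng(1)

# One sample function of the binomial process S_n (illustrative parameters).
p, n_trials = 0.3, 20
hits = rng.random(n_trials) < p          # Bernoulli(p) indicator for each trial
S = np.cumsum(hits)                      # S_n = total hits in the first n trials

# Sample functions are non-decreasing step functions with jumps of at most 1.
increments = np.diff(S)
assert np.all((increments == 0) | (increments == 1))
```

Each run of the generator produces a different realization, but every realization is a non-decreasing step function.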

Example 2: Random Phase Cosine Wave

$X(t) = a\cos(\omega t + \Theta)$, where $t \in \mathbb{R}$, $a, \omega > 0$ are constants, and $\Theta \sim U(0, 2\pi)$.

Parameter set: $T = \mathbb{R}$ (continuous time)

State space: $I = [-a,a]$ (continuous state)

Sample functions: $x(t) = a\cos(\omega t + \theta_0)$ (cosine curves with different phases)

Property: all realizations have the same frequency but random phase
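The "same frequency, random phase" property is easy to see by generating several realizations; the constants $a = 2$ and $\omega = 1.5$ are arbitrary sample values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random phase cosine wave X(t) = a*cos(w*t + Theta), Theta ~ U(0, 2*pi).
a, w = 2.0, 1.5                          # illustrative constants
t = np.linspace(0.0, 10.0, 400)
thetas = rng.uniform(0.0, 2 * np.pi, size=5)

# Each fixed theta gives one realization: same amplitude and frequency,
# shifted horizontally by the random phase.
realizations = np.array([a * np.cos(w * t + th) for th in thetas])

# Every sample function stays inside the state space I = [-a, a].
assert np.all(np.abs(realizations) <= a + 1e-12)
```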

3. Finite-Dimensional Distribution Family

Mathematical Foundation
Complete characterization of stochastic processes through finite-dimensional distributions

One-Dimensional Distribution

For fixed $t \in T$, the distribution function of the random variable $X(t)$:

$$F_X(x;t) = P\{X(t) \leq x\}, \quad x \in \mathbb{R}$$

The collection $\{F_X(x;t);\ t \in T\}$ forms the one-dimensional distribution family.

n-Dimensional Distribution

For any $n \geq 2$ and distinct times $t_1, t_2, \ldots, t_n \in T$:

$$F_X(x_1,x_2,\ldots,x_n;\,t_1,t_2,\ldots,t_n) = P\{X(t_1) \leq x_1,\ X(t_2) \leq x_2,\ \ldots,\ X(t_n) \leq x_n\}$$

where $x_1,\ldots,x_n \in \mathbb{R}$.

Compatibility Conditions

The finite-dimensional distribution family must satisfy:

  • Consistency: lower-dimensional distributions are marginals of higher-dimensional ones
  • Permutation invariance: the distribution is unchanged when the time points and their arguments are reordered together
  • Example: $\lim_{x_2 \to +\infty} F_X(x_1,x_2;t_1,t_2) = F_X(x_1;t_1)$
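The consistency condition can be checked by Monte Carlo. The process below, $X(t) = Zt + W$ with independent standard normals $Z, W$, is an invented example used only for this check:

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative process (an assumption): X(t) = Z*t + W, Z, W iid N(0, 1).
n = 100_000
Z, W = rng.normal(size=n), rng.normal(size=n)
t1, t2 = 1.0, 2.0
X1, X2 = Z * t1 + W, Z * t2 + W

x1 = 0.5
F1 = np.mean(X1 <= x1)                           # empirical F_X(x1; t1)
F12_limit = np.mean((X1 <= x1) & (X2 <= 1e6))    # empirical F_X(x1, x2; t1, t2) with x2 -> +inf

# Sending the second argument to +infinity recovers the one-dimensional marginal.
assert abs(F1 - F12_limit) < 1e-12
```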

4. Numerical Characteristics of Stochastic Processes

Essential Functions
Simplified statistical description when finite-dimensional distributions are complex

Mean Function

$$\mu_X(t) = E[X(t)]$$

The average level of the process at time $t$, representing the "central tendency" of the sample functions.

Variance Function

$$\sigma_X^2(t) = \mathrm{Var}[X(t)] = E\big[(X(t) - \mu_X(t))^2\big]$$

The degree of fluctuation around the mean at time $t$.

Autocorrelation Function

$$R_X(t_1,t_2) = E[X(t_1)X(t_2)], \quad t_1,t_2 \in T$$

Describes the linear correlation between the process values at times $t_1$ and $t_2$.

Special case: when $t_1 = t_2 = t$, $R_X(t,t) = E[X^2(t)]$ (the mean square value).

Autocovariance Function

$$C_X(t_1,t_2) = \mathrm{Cov}(X(t_1),X(t_2)) = R_X(t_1,t_2) - \mu_X(t_1)\mu_X(t_2)$$

Linear correlation after removing mean effects.

Key relationships:

  • $\sigma_X^2(t) = C_X(t,t)$ (variance as a special case)
  • $\sigma_X^2(t) = R_X(t,t) - [\mu_X(t)]^2$
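These characteristics, and the relationships between them, can be estimated by Monte Carlo for the random phase cosine wave, whose theoretical values are $\mu_X(t) = 0$, $R_X(t_1,t_2) = \frac{a^2}{2}\cos\omega(t_1 - t_2)$, and $\sigma_X^2(t) = a^2/2$ (the constants and time points below are arbitrary sample values):

```python
import numpy as np

rng = np.random.default_rng(4)

# Random phase cosine wave X(t) = a*cos(w*t + Theta), Theta ~ U(0, 2*pi).
a, w, n = 1.0, 2.0, 200_000
theta = rng.uniform(0.0, 2 * np.pi, size=n)

t1, t2 = 0.7, 1.3
X1 = a * np.cos(w * t1 + theta)
X2 = a * np.cos(w * t2 + theta)

mu1 = X1.mean()                       # mean function mu_X(t1), theory: 0
R12 = np.mean(X1 * X2)                # autocorrelation, theory: (a^2/2) cos(w (t1 - t2))
var1 = X1.var()                       # variance function, theory: a^2/2
C12 = R12 - X1.mean() * X2.mean()     # autocovariance C_X = R_X - mu_X mu_X

assert abs(mu1) < 0.01
assert abs(R12 - (a**2 / 2) * np.cos(w * (t1 - t2))) < 0.01
assert abs(var1 - a**2 / 2) < 0.01
```

The last key relationship, $\sigma_X^2(t) = R_X(t,t) - [\mu_X(t)]^2$, holds exactly for the empirical estimates as well.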
Second-Order Processes

Definition

If for every $t \in T$ the mean square value is finite:

$$E[X^2(t)] < \infty$$

then $\{X(t);\ t \in T\}$ is called a second-order moment process.

Properties

  • The mean function, autocorrelation function, and autocovariance function all exist
  • Guaranteed by the Cauchy-Schwarz inequality: $\big(E|X(t_1)X(t_2)|\big)^2 \leq E[X^2(t_1)]\,E[X^2(t_2)] < \infty$
  • Example: the random phase cosine wave $X(t) = a\cos(\omega t + \Theta)$ is second-order, since $E[X^2(t)] = a^2/2 < \infty$
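The Cauchy-Schwarz bound and the mean square value $a^2/2$ can both be checked numerically on the same cosine-wave example (the constants below are arbitrary sample values):

```python
import numpy as np

rng = np.random.default_rng(5)

# Random phase cosine wave with illustrative constants a = 3, w = 1.
a, w, n = 3.0, 1.0, 100_000
theta = rng.uniform(0.0, 2 * np.pi, size=n)
t1, t2 = 0.2, 1.1
X1 = a * np.cos(w * t1 + theta)
X2 = a * np.cos(w * t2 + theta)

lhs = np.mean(np.abs(X1 * X2)) ** 2        # [E|X(t1) X(t2)|]^2
rhs = np.mean(X1**2) * np.mean(X2**2)      # E[X^2(t1)] E[X^2(t2)]

# Cauchy-Schwarz holds for the empirical averages as well.
assert lhs <= rhs + 1e-9
# Mean square value, theory: a^2 / 2 = 4.5.
assert abs(np.mean(X1**2) - a**2 / 2) < 0.1
```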

5. Gaussian (Normal) Processes

Definition and Key Properties

Definition

If for any $n \geq 1$ and any times $t_1, t_2, \ldots, t_n \in T$, the random vector $(X(t_1), X(t_2), \ldots, X(t_n))$ follows an $n$-dimensional normal distribution, then $\{X(t);\ t \in T\}$ is called a Gaussian process or normal process.

Key Properties

  • 1. Complete characterization by second-order characteristics:
    All statistical properties are determined by the mean function $\mu_X(t)$ and the autocovariance function $C_X(t_1,t_2)$
  • 2. Linear transformation invariance:
    If $\{X(t)\}$ is Gaussian, then $Y = \sum_{i=1}^k a_i X(t_i)$ is also normally distributed for any constants $a_1, \ldots, a_k$
  • 3. Uncorrelatedness is equivalent to independence:
    For Gaussian processes, $C_X(t_1,t_2) = 0 \Leftrightarrow X(t_1)$ and $X(t_2)$ are independent (uncorrelated ⟺ independent)

Example: Distribution Calculation

Let $\{X(t);\ t \geq 0\}$ be a Gaussian process with $\mu_X(t) = t$ and $C_X(t_1,t_2) = t_1 t_2 + 1$. Find the distribution of $X(2)$.

Solution:

  • Mean: $\mu_X(2) = 2$
  • Variance: $\sigma_X^2(2) = C_X(2,2) = 2 \times 2 + 1 = 5$
  • Therefore: $X(2) \sim N(2, 5)$
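A sketch verifying this numerically: sample the finite-dimensional distribution at a few time points with `numpy` (the time grid is an arbitrary choice) and check the moments of the coordinate at $t = 2$:

```python
import numpy as np

rng = np.random.default_rng(6)

# Gaussian process with mu_X(t) = t and C_X(t1, t2) = t1*t2 + 1,
# observed at an arbitrary grid of time points.
times = np.array([0.5, 1.0, 2.0])
mean = times                              # mu_X(t) = t
cov = np.outer(times, times) + 1.0        # C_X(t1, t2) = t1*t2 + 1 (positive semidefinite)

samples = rng.multivariate_normal(mean, cov, size=200_000)

# The coordinate at t = 2 should follow N(2, 5).
X2 = samples[:, 2]
assert abs(X2.mean() - 2.0) < 0.05
assert abs(X2.var() - 5.0) < 0.15
```

Any finite-dimensional vector of the process can be sampled the same way, since its distribution is fully determined by $\mu_X$ and $C_X$.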
Next Steps & Applications

Practice Problems

Reinforce understanding with classification exercises, numerical characteristics calculations, and Gaussian process problems.


Formula Reference

Quick access to key formulas for stochastic process fundamentals and numerical characteristics.


Advanced Topics

Continue with stationary processes, Markov chains, and specialized process types.

