
Independent Increment Processes Formulas

Essential mathematical formulas for independent increment processes and their applications

1. Core Definition & Properties
Fundamental mathematical structure of independent increment processes

Definition

A stochastic process $\{X(t); t \geq 0\}$ with state space $\mathbb{R}$ or $\mathbb{Z}$ is an independent increment process if, for any positive integer $n$ and any time points $0 = t_0 < t_1 < t_2 < \cdots < t_n$, the increments

$$X(t_1) - X(t_0), \quad X(t_2) - X(t_1), \quad \ldots, \quad X(t_n) - X(t_{n-1})$$

are mutually independent.

Key Assumption: Often $X(0) = 0$ for simplicity
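
To make the definition concrete, here is a minimal NumPy sketch that builds such a process on a uniform time grid by cumulatively summing independently drawn increments. The Gaussian increment law, the grid, and the seed are illustrative choices, not part of the definition.

```python
import numpy as np

# Sketch: construct a process with independent increments on a uniform grid
# by cumulatively summing increments drawn independently of one another.
rng = np.random.default_rng(0)

t = np.linspace(0.0, 1.0, 101)              # time points t_0 = 0 < t_1 < ... < t_n
dt = np.diff(t)                             # interval lengths t_i - t_{i-1}
increments = rng.normal(0.0, np.sqrt(dt))   # independent increments X(t_i) - X(t_{i-1})

X = np.concatenate(([0.0], np.cumsum(increments)))   # X(0) = 0, X(t_i) = sum of increments

# Increments over disjoint intervals, e.g. X(t_10) - X(t_0) and X(t_20) - X(t_10),
# are sums over disjoint sets of the independent draws, hence independent.
print(X[:5])
```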

Markov Property

Independent increment processes are necessarily Markov processes:

$$P\{X(t) \leq y \mid X(u),\ u \leq s\} = P\{X(t) \leq y \mid X(s)\}$$

Intuition: Future depends only on present, not on past
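
The Markov property can be illustrated empirically on a simple random walk (a discrete independent increment process). The sketch below compares an estimate of $P\{S_4 \leq 0 \mid S_2 = 0\}$ with one that also conditions on $S_1$; the chosen times, threshold, and sample size are illustrative.

```python
import numpy as np

# Empirical check: conditioning on the full past (S_1, S_2) gives the same
# conditional distribution for S_4 as conditioning on the present S_2 alone.
rng = np.random.default_rng(1)

steps = rng.choice([-1, 1], size=(200_000, 4))   # i.i.d. +/-1 steps, p = 1/2
S = np.cumsum(steps, axis=1)                     # columns are S_1, S_2, S_3, S_4

given_present = S[S[:, 1] == 0]                    # paths with S_2 = 0
given_past = S[(S[:, 0] == 1) & (S[:, 1] == 0)]    # paths with S_1 = 1 and S_2 = 0

print(np.mean(given_present[:, 3] <= 0))   # estimate of P(S_4 <= 0 | S_2 = 0)
print(np.mean(given_past[:, 3] <= 0))      # estimate of P(S_4 <= 0 | S_1 = 1, S_2 = 0)
# Both are close to 0.75 = P(S_4 - S_2 <= 0).
```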

2. Mean Functions & Additivity
Properties of expectation functions for independent increment processes

Mean Function Additivity

For any $t > s \geq 0$:

$$E[X(t)] = E[X(s)] + E[X(t) - X(s)]$$

Derivation: $X(t) = X(s) + [X(t) - X(s)]$ and linearity of expectation

Increment Expectation

Rearranging the additivity property:

$$E[X(t) - X(s)] = E[X(t)] - E[X(s)]$$

Application: Compute expected increments directly from the mean function values
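
A quick Monte Carlo check of the additivity identity on a biased random walk; the value $p = 0.7$, the time indices $s = 3$ and $t = 8$, the sample size, and the seed are illustrative choices.

```python
import numpy as np

# Check E[X(t)] = E[X(s)] + E[X(t) - X(s)] on a biased random walk, using the
# closed-form increment mean (t - s)(p - q) for the second term.
rng = np.random.default_rng(2)

p, q = 0.7, 0.3
steps = rng.choice([1, -1], size=(100_000, 8), p=[p, q])
S = np.cumsum(steps, axis=1)                 # columns are S_1, ..., S_8

lhs = np.mean(S[:, 7])                       # estimate of E[S_8]
rhs = np.mean(S[:, 2]) + 5 * (p - q)         # estimate of E[S_3] plus E[S_8 - S_3]
print(lhs, rhs)                              # both close to 8 * (p - q) = 3.2
```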

Mean Square Value

Relationship between mean square value and variance:

$$E[X^2(t)] = D_X(t) + [E[X(t)]]^2$$

Note: $D_X(t) = \text{Var}[X(t)]$ is the variance function
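
A small numeric verification of the mean square relation at a fixed time, using $n = 10$ steps of a biased random walk; all parameter values are illustrative.

```python
import numpy as np

# Check E[X^2(t)] = D_X(t) + (E[X(t)])^2 by Monte Carlo at a single time point.
rng = np.random.default_rng(3)

p, q, n = 0.6, 0.4, 10
steps = rng.choice([1, -1], size=(100_000, n), p=[p, q])
S_n = steps.sum(axis=1)

lhs = np.mean(S_n**2)                    # estimate of E[X^2(t)]
rhs = np.var(S_n) + np.mean(S_n) ** 2    # estimate of D_X(t) + (E[X(t)])^2
print(lhs, rhs)                          # both close to 4*n*p*q + (n*(p - q))**2 = 13.6
```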

3. Covariance Functions & Simplification
Covariance structure and simplification properties

Covariance Simplification

For any $s, t \geq 0$ (assuming $X(0) = 0$):

$$\text{Cov}[X(s), X(t)] = D_X(\min(s, t))$$

Key Result: Covariance depends only on the minimum time

Derivation

For $s < t$, write $X(t) = X(s) + [X(t) - X(s)]$:

$$\text{Cov}[X(s), X(t)] = \text{Cov}[X(s), X(s)] + \text{Cov}[X(s), X(t) - X(s)] = D_X(s) + 0 = D_X(s)$$

Reasoning: Since $X(0) = 0$, $X(s) = X(s) - X(0)$ is itself an increment, so the increment $X(t) - X(s)$ is independent of $X(s)$ and the second covariance term vanishes

Correlation Coefficient

The correlation coefficient (normalized covariance) of the process is:

$$\rho_X(s, t) = \frac{\text{Cov}[X(s), X(t)]}{\sqrt{D_X(s)\, D_X(t)}} = \frac{D_X(\min(s, t))}{\sqrt{D_X(s)\, D_X(t)}}$$

Property: For $s < t$ this equals $\sqrt{D_X(s) / D_X(t)}$, so the correlation depends only on the ratio of the variances
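
Both the covariance simplification and the resulting correlation coefficient can be checked by simulation on a fair random walk; the times $s = 5$, $t = 20$ and the sample size are illustrative choices.

```python
import numpy as np

# Check Cov[X(s), X(t)] = D_X(min(s, t)) and rho = sqrt(D_X(s) / D_X(t)) for s < t.
rng = np.random.default_rng(4)

steps = rng.choice([-1, 1], size=(200_000, 20))
S = np.cumsum(steps, axis=1)
X_s, X_t = S[:, 4], S[:, 19]                 # S_5 and S_20

cov_st = np.cov(X_s, X_t)[0, 1]
print(cov_st, np.var(X_s))                   # both close to D[S_5] = 5 for a fair walk

rho = cov_st / np.sqrt(np.var(X_s) * np.var(X_t))
print(rho, np.sqrt(5 / 20))                  # both close to 0.5
```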

4. Random Walk Formulas & Applications
Mathematical properties of simple random walks

Simple Random Walk Definition

Let $X_1, X_2, \ldots$ be i.i.d. with $P(X_i = 1) = p$ and $P(X_i = -1) = q = 1 - p$:

$$S_n = X_1 + X_2 + \cdots + X_n \quad (n = 1, 2, \ldots)$$

Process: $\{S_n; n = 1, 2, \ldots\}$ is a discrete-time independent increment process
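
A minimal simulation of a single random walk path; the step probability, horizon, and seed are illustrative choices.

```python
import numpy as np

# Simple random walk: i.i.d. steps X_i in {+1, -1} with P(X_i = 1) = p,
# partial sums S_n = X_1 + ... + X_n.
rng = np.random.default_rng(5)

p, n_steps = 0.55, 50
X = rng.choice([1, -1], size=n_steps, p=[p, 1 - p])   # steps X_1, ..., X_n
S = np.cumsum(X)                                      # S_1, ..., S_n

# Increments over disjoint index blocks, e.g. S_10 and S_30 - S_10, are sums
# of disjoint groups of the X_i and therefore independent.
print(S[:10])
```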

Mean Function

Expected value after $n$ steps:

$$E[S_n] = n(p - q)$$

Special Case: Fair game ($p = q = 0.5$) gives $E[S_n] = 0$

Variance Function

Variance after $n$ steps:

$$D[S_n] = 4npq$$

Special Case: Fair game gives $D[S_n] = n$
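
Both moment formulas are easy to verify by Monte Carlo; the choices $n = 30$, $p = 0.7$, and 100,000 sample paths below are illustrative.

```python
import numpy as np

# Check E[S_n] = n(p - q) and D[S_n] = 4npq against sample moments.
rng = np.random.default_rng(6)

p, q, n = 0.7, 0.3, 30
steps = rng.choice([1, -1], size=(100_000, n), p=[p, q])
S_n = steps.sum(axis=1)

print(np.mean(S_n), n * (p - q))     # both close to 12.0
print(np.var(S_n), 4 * n * p * q)    # both close to 25.2
```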

Covariance Function

For $n \leq m$:

$$\text{Cov}[S_n, S_m] = 4npq$$

Result: $\text{Cov}[S_n, S_m] = D[S_n]$ by covariance simplification

Increment Probability

For independent increments:

$$P\{S_1 = a, S_4 = b\} = P\{S_1 = a\}\, P\{S_4 - S_1 = b - a\}$$

Application: Calculate joint probabilities using independence
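
As a worked example for a fair walk ($p = q = 0.5$, with the specific values $a = 1$, $b = 2$ chosen for illustration), the factorization can be evaluated exactly with binomial probabilities and cross-checked by simulation.

```python
import numpy as np
from math import comb

# Exact evaluation of P(S_1 = 1, S_4 = 2) = P(S_1 = 1) * P(S_4 - S_1 = 1);
# S_4 - S_1 has the same distribution as a 3-step walk.
p = 0.5

def walk_pmf(n, k, p):
    """P(S_n = k): the walk needs (n + k) / 2 up-steps out of n."""
    if (n + k) % 2 or abs(k) > n:
        return 0.0
    ups = (n + k) // 2
    return comb(n, ups) * p**ups * (1 - p) ** (n - ups)

print(walk_pmf(1, 1, p) * walk_pmf(3, 1, p))    # 0.5 * 0.375 = 0.1875

# Monte Carlo cross-check of the joint probability.
rng = np.random.default_rng(7)
steps = rng.choice([1, -1], size=(200_000, 4), p=[p, 1 - p])
S = np.cumsum(steps, axis=1)
print(np.mean((S[:, 0] == 1) & (S[:, 3] == 2)))
```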

5. Advanced Properties & Theorems
Advanced mathematical properties and theoretical results

Normal Distribution Properties

If $X(t) \sim N(\mu_t, \sigma_t^2)$ and $X(s) \sim N(\mu_s, \sigma_s^2)$ for $t > s \geq 0$, with independent increments and $X(0) = 0$:

$$X(t) - X(s) \sim N(\mu_t - \mu_s,\ \sigma_t^2 - \sigma_s^2)$$

Note: $X(s)$ and the increment $X(t) - X(s)$ are independent, so their variances add to $\sigma_t^2$; the increment variance is therefore the difference $\sigma_t^2 - \sigma_s^2$, not the sum
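
The variance difference can be seen directly by simulating a Gaussian process with independent increments (standard Brownian motion here, with $s = 1$ and $t = 3$ as illustrative choices).

```python
import numpy as np

# For a Gaussian process with independent increments, Var[X(t) - X(s)] equals
# sigma_t^2 - sigma_s^2; construct X(t) as X(s) plus an independent increment.
rng = np.random.default_rng(8)

s, t, n = 1.0, 3.0, 200_000
X_s = rng.normal(0.0, np.sqrt(s), n)               # X(s) ~ N(0, s)
X_t = X_s + rng.normal(0.0, np.sqrt(t - s), n)     # add an independent increment

print(np.var(X_t - X_s), t - s)    # increment variance ~ 2.0 = sigma_t^2 - sigma_s^2
print(np.var(X_t), t)              # marginal variance of X(t) ~ 3.0
```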

Stationary Increments

If the increments are also stationary (the distribution of $X(t) - X(s)$ depends only on $t - s$):

$$X(t) - X(s) \stackrel{d}{=} X(t - s) - X(0)$$

Implication: Process has time-homogeneous properties

Characteristic Function

For $t > s$ and $X(0) = 0$, the characteristic function of $X(t)$ factors over independent increments:

$$\phi_{X(t)}(u) = \phi_{X(s)}(u) \cdot \phi_{X(t) - X(s)}(u)$$

Application: Analyze distribution properties through characteristic functions
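
The factorization can be checked with empirical characteristic functions on a fair random walk; the times $s = 4$, $t = 10$ and the frequency $u = 0.3$ are illustrative choices.

```python
import numpy as np

# Compare phi_{X(t)}(u) with phi_{X(s)}(u) * phi_{X(t)-X(s)}(u) empirically.
rng = np.random.default_rng(9)

steps = rng.choice([-1, 1], size=(200_000, 10))
S = np.cumsum(steps, axis=1)
X_s, X_t = S[:, 3], S[:, 9]                  # S_4 and S_10
u = 0.3

def ecf(z):
    """Empirical characteristic function at frequency u."""
    return np.mean(np.exp(1j * u * z))

print(ecf(X_t))                     # ~ cos(u)**10 for a fair walk
print(ecf(X_s) * ecf(X_t - X_s))    # product of the two factors, approximately equal
```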

6. Connection to Advanced Processes
Relationship to Poisson processes and Brownian motion

Poisson Process

A counting process $\{N(t); t \geq 0\}$ with:

  • Independent increments
  • Stationary increments
  • $N(t) - N(s) \sim \text{Poisson}(\lambda(t - s))$

Property: $E[N(t)] = \lambda t$ and $D[N(t)] = \lambda t$
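
A short simulation via i.i.d. exponential interarrival times confirms the mean and variance; the rate, horizon, and sample size are illustrative choices.

```python
import numpy as np

# Simulate N(t) by counting arrivals with Exp(lambda) interarrival gaps and
# check E[N(t)] = lambda * t and D[N(t)] = lambda * t.
rng = np.random.default_rng(10)

lam, t, n_paths = 2.0, 5.0, 50_000
# Draw comfortably more gaps than can fit in [0, t], then count arrivals.
gaps = rng.exponential(1.0 / lam, size=(n_paths, int(5 * lam * t) + 50))
arrival_times = np.cumsum(gaps, axis=1)
N_t = np.sum(arrival_times <= t, axis=1)     # N(t) for each path

print(np.mean(N_t), lam * t)    # both close to 10
print(np.var(N_t), lam * t)     # both close to 10
```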

Brownian Motion

A continuous process $\{B(t); t \geq 0\}$ with:

  • Independent increments
  • Stationary increments
  • $B(t) - B(s) \sim N(0, \sigma^2(t - s))$

Property: $E[B(t)] = 0$ and $D[B(t)] = \sigma^2 t$
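
Brownian motion can likewise be simulated by summing independent Gaussian increments on a grid; $\sigma = 1.5$, the horizon, and the grid size below are illustrative choices.

```python
import numpy as np

# Build B(t) from independent N(0, sigma^2 * dt) increments and check
# E[B(t)] = 0 and D[B(t)] = sigma^2 * t at the final time.
rng = np.random.default_rng(11)

sigma, t_end, n_steps, n_paths = 1.5, 2.0, 200, 20_000
dt = t_end / n_steps
increments = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_paths, n_steps))
B_t = increments.sum(axis=1)                 # B(t_end) for each path

print(np.mean(B_t))                          # close to 0
print(np.var(B_t), sigma**2 * t_end)         # both close to 4.5
```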

General Form

For an independent increment process with $X(0) = 0$ and any partition $0 = t_0 < t_1 < \cdots < t_n = t$:

$$X(t) = \sum_{i=1}^n [X(t_i) - X(t_{i-1})]$$

Decomposition: The process can be decomposed into a sum of mutually independent increments
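
The decomposition is a telescoping sum, so cumulative increments reconstruct the process exactly; the sampled values below are arbitrary illustrative numbers.

```python
import numpy as np

# With X(0) = 0, summing consecutive increments over any partition of [0, t]
# recovers X(t) exactly (a telescoping sum).
X = np.array([0.0, 0.4, -0.1, 1.3, 2.0])   # X(t_0), ..., X(t_4), with X(t_0) = 0
increments = np.diff(X)                     # X(t_i) - X(t_{i-1})
print(increments.sum(), X[-1])              # both equal X(t_n)
```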