MathIsimple

Stochastic Processes – Problem 19: Prove that the density function of $X_{(i)}$ is $f_{X_{(i)}}(x)=\frac{n!}{(i-1)!(n-i)!}(F(x))^{i-1}(\overline{F}(x))^{n-i}f(x)$, where $F(x)=\int_{-\infty}^{x}f(y)\,dy$, $\overline{F}(x)=1-F(x)$

Question

Let $X_1, X_2, \ldots, X_n$ be independent continuous random variables with common density function $f$. Denote by $X_{(i)}$ the $i$-th smallest among $X_1, X_2, \ldots, X_n$.

(a) Prove that the density function of $X_{(i)}$ is
$$f_{X_{(i)}}(x)=\frac{n!}{(i-1)!(n-i)!}\,(F(x))^{i-1}(\overline{F}(x))^{n-i}f(x),$$
where $F(x)=\int_{-\infty}^{x}f(y)\,dy$ and $\overline{F}(x)=1-F(x)$.

(b) Show that $P(X_{(i)}\leqslant x)=\sum_{k=i}^{n}\binom{n}{k}[F(x)]^{k}[\overline{F}(x)]^{n-k}$.

(c) Using the preceding two parts, prove the probability identity
$$\sum_{k=i}^{n}\binom{n}{k}y^{k}(1-y)^{n-k}=\int_{0}^{y}\frac{n!}{(i-1)!(n-i)!}\,x^{i-1}(1-x)^{n-i}\,dx.$$

(d) Let $T_i$ denote the arrival time of the $i$-th event in a Poisson process $\{N(t),\,t\geqslant 0\}$. Find $E[T_i \mid N(t)=n]$ (consider the cases $i\leqslant n$ and $i>n$ separately).

(e) Compute the conditional density function of $T_1, T_2, \dots, T_{n-1}$ given $T_n=t$.

Step-by-step solution

(a) Step 1. Consider the event $A = \{x < X_{(i)} \le x+dx\}$. Up to $o(dx)$ terms, this event requires the $n$ i.i.d. variables to split into three groups: $i-1$ fall in $(-\infty, x]$, exactly one falls in $(x, x+dx]$, and $n-i$ fall in $(x+dx, \infty)$.

Step 2. By independence and the multinomial formula,
$$P(A) = \frac{n!}{(i-1)!\,1!\,(n-i)!}\,[F(x)]^{i-1}\,[f(x)\,dx + o(dx)]\,[1-F(x+dx)]^{n-i}.$$

Step 3. Dividing by $dx$ and letting $dx \to 0^+$ gives
$$f_{X_{(i)}}(x) = \frac{n!}{(i-1)!(n-i)!}\,[F(x)]^{i-1}\,[1-F(x)]^{n-i}\,f(x).$$

(b) The event $\{X_{(i)} \le x\}$ occurs if and only if at least $i$ of the observations are $\le x$. Let $Y = \#\{j : X_j \le x\}$; then $Y \sim \mathrm{Binomial}(n, F(x))$, so
$$P(X_{(i)} \le x) = P(Y \ge i) = \sum_{k=i}^{n} \binom{n}{k}\,[F(x)]^{k}\,[1-F(x)]^{n-k}.$$
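As a quick numerical sanity check (not part of the proof), the sketch below compares the binomial tail sum from (b) against a Monte Carlo estimate of $P(X_{(i)} \le x)$ for $\mathrm{Uniform}(0,1)$ variables, where $F(x)=x$; the values $n=5$, $i=3$, $x=0.4$ are arbitrary choices for illustration.

```python
import random
from math import comb

random.seed(0)

n, i, x = 5, 3, 0.4          # sample size, order index, evaluation point
trials = 200_000

# Exact value from part (b): F(x) = x for Uniform(0,1) variables.
exact = sum(comb(n, k) * x**k * (1 - x)**(n - k) for k in range(i, n + 1))

# Monte Carlo estimate of P(X_(i) <= x): the i-th smallest is index i-1.
hits = 0
for _ in range(trials):
    sample = sorted(random.random() for _ in range(n))
    if sample[i - 1] <= x:
        hits += 1
estimate = hits / trials

print(f"binomial sum: {exact:.5f}, Monte Carlo: {estimate:.5f}")
```

The two numbers should agree to within Monte Carlo noise (about $10^{-3}$ at this sample size).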

(c) Step 1. From (a),
$$F_{X_{(i)}}(x) = \int_{-\infty}^{x} \frac{n!}{(i-1)!(n-i)!}\,[F(t)]^{i-1}\,[1-F(t)]^{n-i}\,f(t)\,dt.$$

Step 2. Substituting $u = F(t)$ (so $du = f(t)\,dt$):
$$F_{X_{(i)}}(x) = \int_0^{F(x)} \frac{n!}{(i-1)!(n-i)!}\,u^{i-1}(1-u)^{n-i}\,du.$$

Step 3. Setting $y = F(x)$ and equating this with the binomial sum from (b) yields the identity. (As $x$ ranges over $\mathbb{R}$, $y=F(x)$ covers $(0,1)$, and both sides are continuous in $y$, so the identity holds for all $y \in [0,1]$.)
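The identity can also be checked numerically. The following sketch evaluates both sides for one illustrative choice of parameters ($n=6$, $i=2$, $y=0.7$, chosen arbitrarily), approximating the incomplete-Beta integral with a midpoint rule:

```python
from math import comb, factorial

n, i, y = 6, 2, 0.7

# Left side: binomial tail sum.
lhs = sum(comb(n, k) * y**k * (1 - y)**(n - k) for k in range(i, n + 1))

# Right side: integral of the order-statistic density, midpoint rule.
coef = factorial(n) / (factorial(i - 1) * factorial(n - i))
steps = 100_000
h = y / steps
rhs = h * sum(coef * ((j + 0.5) * h)**(i - 1) * (1 - (j + 0.5) * h)**(n - i)
              for j in range(steps))

print(f"sum = {lhs:.8f}, integral = {rhs:.8f}")
```

With $10^5$ midpoint steps the discretization error is far below $10^{-6}$, so the two sides match to printed precision.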

(d) Step 1. When $i \le n$: given $N(t)=n$, $T_i \stackrel{d}{=} t \cdot U_{(i)}$, where $U_{(i)}$ is the $i$-th order statistic of $n$ i.i.d. $\mathrm{Uniform}(0,1)$ variables. Using the Beta function, $E[U_{(i)}] = i/(n+1)$, so $E[T_i \mid N(t)=n] = it/(n+1)$.

Step 2. When $i > n$: given $N(t)=n$, the $i$-th event occurs after time $t$. By the memoryless property, the waiting time beyond $t$ until the $i$-th event is a sum of $i-n$ i.i.d. $\mathrm{Exp}(\lambda)$ interarrival times, independent of the history up to $t$. Thus $E[T_i \mid N(t)=n] = t + (i-n)/\lambda$.
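The $i \le n$ case can be verified by simulation (a sanity check, not a proof). The sketch below uses rejection sampling: it simulates the Poisson process on $[0,t]$, keeps only runs with exactly $n$ events, and averages the $i$-th arrival time; the parameters $\lambda=1$, $t=4$, $n=4$, $i=2$ are arbitrary illustrative choices.

```python
import random
from math import log

random.seed(1)

lam, t, n, i = 1.0, 4.0, 4, 2   # rate, horizon, conditioning count, event index
target = i * t / (n + 1)        # claimed value it/(n+1) = 1.6

# Rejection sampling: keep only runs of the process with exactly n
# events in [0, t], then record the i-th arrival time.
total, count = 0.0, 0
while count < 30_000:
    arrivals, s = [], 0.0
    while True:
        s += -log(random.random()) / lam   # Exp(lam) interarrival time
        if s > t:
            break
        arrivals.append(s)
    if len(arrivals) == n:
        total += arrivals[i - 1]
        count += 1

estimate = total / count
print(f"conditional mean of T_{i} ~ {estimate:.4f}, formula: {target:.4f}")
```

Note how the estimate lands near $it/(n+1) = 1.6$ rather than the equipartition guess $it/n = 2$, the error flagged in the marking scheme below.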

(e) Step 1. Given $T_n = t$, the vector $(T_1,\dots,T_{n-1})$ is distributed as the order statistics of $n-1$ i.i.d. $\mathrm{Uniform}(0,t)$ variables.

Step 2. Their joint conditional density is $\frac{(n-1)!}{t^{n-1}}$ on the region $0 < t_1 < \cdots < t_{n-1} < t$.
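As a consistency check, the constant density $\frac{(n-1)!}{t^{n-1}}$ integrates to 1 exactly when the ordered region $\{0 < t_1 < \cdots < t_{n-1} < t\}$ has volume $t^{n-1}/(n-1)!$. The sketch below estimates that volume as the fraction of uniform points in the cube $[0,t]^{n-1}$ that happen to be sorted (with the arbitrary illustrative values $n=5$, $t=3$):

```python
import random
from math import factorial

random.seed(2)

n, t = 5, 3.0                    # condition on T_n = t with n = 5
trials = 200_000

# Volume of {0 < t_1 < ... < t_{n-1} < t} relative to the cube [0, t]^(n-1)
# should be 1/(n-1)!: every unordered point sorts into exactly one ordering.
sorted_hits = 0
for _ in range(trials):
    point = [random.uniform(0, t) for _ in range(n - 1)]
    if all(point[j] < point[j + 1] for j in range(n - 2)):
        sorted_hits += 1

fraction = sorted_hits / trials
print(f"fraction sorted: {fraction:.5f}, 1/(n-1)! = {1 / factorial(n - 1):.5f}")
```

The estimated fraction should be close to $1/4! \approx 0.04167$, confirming the normalization.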

Final answer

(a) QED. (b) $\sum_{k=i}^{n}\binom{n}{k}[F(x)]^k[\overline{F}(x)]^{n-k}$. (c) QED. (d) $\frac{it}{n+1}$ for $i \le n$; $t + \frac{i-n}{\lambda}$ for $i > n$. (e) $\frac{(n-1)!}{t^{n-1}}$, for $0 < t_1 < \cdots < t_{n-1} < t$.

Marking scheme

1. Checkpoints (max 7 pts total)

Notes: Points must be awarded per the logical chains below. Merely copying formulas or listing known conditions earns no credit. For parts with multiple solution paths, score the best path; do not combine.

  • (a) Order statistic density (1 pt): core derivation via the infinitesimal method or CDF differentiation; intermediate steps must be shown. [1 pt]
  • (b) CDF as binomial sum (1 pt): identify $\{X_{(i)} \le x\}$ as "at least $i$ observations $\le x$" and write the binomial sum. [1 pt]
  • (c) Integral identity (1 pt): connect (a) and (b) via the substitution $u=F(x)$. [1 pt]
  • (d) Poisson conditional expectation (2 pts):
  • Case $i \le n$: identify uniform order statistics and compute $it/(n+1)$. [1 pt]
  • Case $i > n$: use the memoryless property to get $t + (i-n)/\lambda$. [1 pt]
  • (e) Conditional density (2 pts):
  • Method: identify uniform order statistics, or compute the joint/marginal density ratio. [1 pt]
  • Result: $\frac{(n-1)!}{t^{n-1}}$ with domain $0 < t_1 < \cdots < t_{n-1} < t$. [1 pt]

Total (max 7)


2. Zero-credit items

  • In (a), only writing the final formula with no derivation.
  • In (c), only verifying a special case.
  • In (d), giving the unconditional expectation $i/\lambda$, ignoring the condition $N(t)=n$.
  • In (d), incorrectly getting $it/n$ (equipartition error).

3. Deductions

  • Logical error in the $i>n$ case of (d) (mixing the conditional expectation of $T_n$ with unconditional increments): that sub-part earns 0.
  • Missing domain in (e): -1 pt.
  • Confusing $N(t)=n$ with $T_n=t$: that part earns 0.

Total: ______ / 7
