Compressive Sensing Solutions

Master optimization techniques for compressive sensing. Learn how L0 norm minimization is converted to L1 norm minimization (Basis Pursuit De-Noising), understand convex optimization approaches, and implement proximal gradient descent for signal recovery.

Module 7 of 8
Intermediate to Advanced
120-150 min

The L0 Minimization Problem

The ideal recovery problem is to find the sparsest solution:

Original Problem

$$\min \|s\|_0 \quad \text{s.t. } y = As$$

where $\|s\|_0$ counts the number of non-zero elements in $s$.

Why L0 is Hard

  • NP-Hard: Requires combinatorial search over all possible support sets
  • Non-convex: L0 "norm" is not a true norm (not continuous, not convex)
  • Computationally infeasible: For large $n$, exhaustive search is impossible
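To get a feel for the combinatorial blow-up, note that an exhaustive L0 search must examine $\binom{n}{k}$ candidate support sets. A quick illustration (the sizes are arbitrary):

```python
from math import comb

# Number of candidate support sets an exhaustive L0 search must examine:
# choose k non-zero positions out of n coordinates.
n, k = 1000, 10
print(f"C({n}, {k}) = {comb(n, k):.3e} candidate supports")
# Even for this modest size, the count exceeds 10^23 -- far beyond
# anything a combinatorial search could enumerate.
```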

L0 to L1 Conversion: The Key Insight

Under RIP conditions, L0 minimization is equivalent to L1 minimization, which is convex and efficiently solvable!

Fundamental Theorem

If the sensing matrix $A$ satisfies the RIP of order $2k$ with $\delta_{2k} < \sqrt{2} - 1$, then the unique solution to:

$$\min \|s\|_1 \quad \text{s.t. } y = As$$

is exactly the k-sparse solution to the L0 problem!

Why L1 Works

  • Convex relaxation: L1 is the tightest convex relaxation of L0 (convex hull of L0 ball)
  • RIP guarantee: Under RIP, L1 recovers the exact sparse solution
  • Efficient solvers: L1 minimization is a convex optimization problem with polynomial-time algorithms
  • Robust: Works even with noise (with appropriate modifications)
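To make the "efficient solvers" point concrete: the equality-constrained L1 problem can be posed as a linear program by splitting $s = u - v$ with $u, v \geq 0$. A minimal sketch using SciPy's `linprog`; the problem sizes and random seed are illustrative choices:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 10, 20, 3                       # measurements, dimension, sparsity (illustrative)
A = rng.standard_normal((m, n))
s_true = np.zeros(n)
s_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ s_true

# Basis pursuit: min ||s||_1 s.t. As = y, rewritten as a linear program
# with s = u - v, u >= 0, v >= 0, and objective sum(u) + sum(v).
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
s_l1 = res.x[:n] - res.x[n:]

# The LP solution is feasible and has L1 norm no larger than the sparse
# ground truth (which is itself feasible).
print("feasible:", np.linalg.norm(A @ s_l1 - y) < 1e-6)
print("||s_l1||_1 <= ||s_true||_1:", np.abs(s_l1).sum() <= np.abs(s_true).sum() + 1e-6)
```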

Basis Pursuit De-Noising (BPDN)

In practice, measurements are noisy. BPDN handles noise by relaxing the equality constraint:

Noisy Measurements

Real measurements include noise: $y = As + e$, where $e$ is noise.

$$\min \|s\|_1 \quad \text{s.t. } \|y - As\|_2 \leq \epsilon$$

where $\epsilon$ bounds the noise level.

LASSO Formulation

BPDN is equivalent to LASSO (unconstrained form):

$$\min \|y - As\|_2^2 + \lambda \|s\|_1$$

This is exactly the LASSO problem! We can use Proximal Gradient Descent (PGD).

Proximal Gradient Descent for Compressive Sensing

Apply PGD to solve the LASSO formulation of compressive sensing:

Step 1

Gradient Step

Compute the gradient of the smooth part $f(s) = \|y - As\|_2^2$:

$$\nabla f(s) = -2A^T(y - As)$$

Take gradient step:

$$z = s_k - \frac{1}{L} \nabla f(s_k)$$

where $L = 2\|A^T A\|_2$ is the Lipschitz constant of $\nabla f$.
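The gradient step can be sketched directly in NumPy; the array sizes below are illustrative, and a one-coordinate finite-difference check confirms the gradient formula:

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 30, 60
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)
s = rng.standard_normal(n)

def grad_f(s):
    # Gradient of the smooth part f(s) = ||y - As||_2^2
    return -2.0 * A.T @ (y - A @ s)

# Lipschitz constant of grad f: L = 2 ||A^T A||_2 (largest eigenvalue)
L = 2.0 * np.linalg.norm(A.T @ A, 2)

# One gradient step
z = s - grad_f(s) / L

# Sanity check against a central finite difference in coordinate i
f = lambda s: np.sum((y - A @ s) ** 2)
eps, i = 1e-6, 0
e_i = np.zeros(n)
e_i[i] = eps
fd = (f(s + e_i) - f(s - e_i)) / (2 * eps)
print(abs(fd - grad_f(s)[i]))   # should be tiny
```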

Step 2

Soft Thresholding

Apply proximal operator (soft thresholding) for L1 norm:

$$s_{k+1}^i = \begin{cases} z^i - \frac{\lambda}{L} & \text{if } z^i > \frac{\lambda}{L} \\ 0 & \text{if } |z^i| \leq \frac{\lambda}{L} \\ z^i + \frac{\lambda}{L} & \text{if } z^i < -\frac{\lambda}{L} \end{cases}$$

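The three-branch formula above is the soft-thresholding (shrinkage) operator, which has a compact vectorized form:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward zero
    # by t and zeroes out entries with |z_i| <= t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

print(soft_threshold(np.array([1.5, -0.3, 0.0, -2.0]), 0.5))
# elementwise: [1.0, 0.0, 0.0, -1.5]
```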
Step 3

Iterate Until Convergence

Repeat until $\|s_{k+1} - s_k\|_2 < \tau$ (convergence threshold). The recovered $\hat{s}$ is sparse and reconstructs the signal.
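Putting the three steps together gives ISTA (iterative soft thresholding), the proximal gradient method applied to this LASSO. A minimal self-contained sketch; the problem sizes and $\lambda$ are illustrative choices, not values from the text:

```python
import numpy as np

def ista(A, y, lam, tol=1e-6, max_iter=1000):
    """Proximal gradient descent for min ||y - As||_2^2 + lam * ||s||_1."""
    L = 2.0 * np.linalg.norm(A.T @ A, 2)        # Lipschitz constant of the gradient
    s = np.zeros(A.shape[1])
    for _ in range(max_iter):
        z = s + (2.0 / L) * A.T @ (y - A @ s)   # gradient step: s - (1/L) grad f(s)
        s_next = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
        if np.linalg.norm(s_next - s) < tol:    # convergence check
            return s_next
        s = s_next
    return s

# Illustrative problem: recover a 5-sparse vector from 40 random measurements.
rng = np.random.default_rng(42)
m, n, k = 40, 100, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
s_true = np.zeros(n)
s_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ s_true

s_hat = ista(A, y, lam=0.01)
print("relative error:", np.linalg.norm(s_hat - s_true) / np.linalg.norm(s_true))
```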

Practical Example: Sensor Network

In wireless sensor networks, we want to recover sparse sensor readings from few transmissions.

Problem Setup

A network of 1000 sensors, of which only 200 are active at a time, so the reading vector is sparse in the spatial domain. Bandwidth limits us to receiving 300 transmissions.

Recovery Process

Apply PGD with $\lambda = 0.1$:

  • Recovered 198 out of 200 active sensors (99% accuracy)
  • Reconstruction error: $< 2\%$
  • Computation time: $< 1$ second

Benefits

Enables 3x bandwidth reduction while maintaining signal quality. Critical for energy-constrained sensor networks.
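A scaled-down synthetic stand-in for this scenario can be run with the ISTA iteration derived above. The 1000/200/300 figures come from the text; the smaller sizes, noise level, $\lambda$, and detection threshold below are illustrative choices so the sketch runs quickly:

```python
import numpy as np

# Scaled-down synthetic sensor network: sparse readings observed through
# random Gaussian "transmissions" (rows of A), with a little noise.
rng = np.random.default_rng(7)
n_sensors, n_active, n_meas = 200, 10, 80
A = rng.standard_normal((n_meas, n_sensors)) / np.sqrt(n_meas)
s_true = np.zeros(n_sensors)
active = rng.choice(n_sensors, n_active, replace=False)
s_true[active] = 1.0 + rng.random(n_active)          # active readings in [1, 2)
y = A @ s_true + 0.01 * rng.standard_normal(n_meas)  # noisy measurements

# ISTA (proximal gradient descent) as in the steps above
lam = 0.05
L = 2.0 * np.linalg.norm(A.T @ A, 2)
s = np.zeros(n_sensors)
for _ in range(2000):
    z = s + (2.0 / L) * A.T @ (y - A @ s)
    s = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

found = set(np.flatnonzero(np.abs(s) > 0.5))  # detect "active" sensors
print("active sensors recovered:", len(found & set(active)), "of", n_active)
print("compression ratio:", n_sensors / n_meas)  # fewer transmissions than sensors
```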

Key Takeaways

L0 minimization is NP-hard, but under RIP conditions, it's equivalent to L1 minimization, which is convex and efficiently solvable.

Basis Pursuit De-Noising (BPDN) handles noisy measurements by relaxing the equality constraint to an inequality.

BPDN is equivalent to LASSO, enabling the use of Proximal Gradient Descent for efficient sparse recovery.

PGD alternates between gradient steps on the smooth reconstruction error and soft thresholding for sparsity.

The regularization parameter $\lambda$ balances reconstruction accuracy and sparsity level.

Compressive sensing solutions enable practical applications in MRI, sensor networks, and data compression with significant efficiency gains.