
Expectation-Maximization Gaussian-Mixture Approximate Message Passing


Abstract:

When recovering a sparse signal from noisy compressive linear measurements, the distribution of the signal's non-zero coefficients can have a profound effect on recovery mean-squared error (MSE). If this distribution were a priori known, then one could use computationally efficient approximate message passing (AMP) techniques for nearly minimum-MSE (MMSE) recovery. In practice, however, the distribution is unknown, motivating the use of robust algorithms like LASSO (which is nearly minimax optimal) at the cost of significantly larger MSE for non-least-favorable distributions. As an alternative, we propose an empirical-Bayesian technique that simultaneously learns the signal distribution while MMSE-recovering the signal, according to the learned distribution, using AMP. In particular, we model the non-zero distribution as a Gaussian mixture and learn its parameters through expectation maximization, using AMP to implement the expectation step. Numerical experiments on a wide range of signal classes confirm the state-of-the-art performance of our approach, in both reconstruction error and runtime, in the high-dimensional regime, for most (but not all) sensing operators.
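To make the expectation-maximization idea concrete, the following is a minimal from-scratch EM fit of a two-component 1-D Gaussian mixture. This is a toy illustration only: in the paper the mixture models the signal's non-zero coefficients and the E-step is implemented via AMP on compressive measurements, whereas here the samples are observed directly. All names, dimensions, and mixture parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def em_gmm_1d(x, L=2, iters=100):
    """Fit an L-component 1-D Gaussian mixture to directly observed samples x
    by expectation maximization. Toy sketch; the paper's E-step instead runs
    AMP because the signal is only seen through compressive measurements."""
    # Initialize: uniform weights, means at spread-out quantiles, common variance.
    w = np.full(L, 1.0 / L)
    mu = np.quantile(x, np.linspace(0.1, 0.9, L))
    var = np.full(L, np.var(x))
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each sample.
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
                 / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances from responsibilities.
        nl = resp.sum(axis=0)
        w = nl / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nl
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nl
    return w, mu, var

# Example: recover the parameters of a known two-component mixture.
rng = np.random.default_rng(1)
samples = np.concatenate([rng.normal(-2.0, 0.5, 3000),
                          rng.normal(3.0, 1.0, 7000)])
w_hat, mu_hat, var_hat = em_gmm_1d(samples)
```

The estimated weights and means should land near the generating values (0.3, 0.7) and (-2, 3), illustrating how EM learns mixture parameters from data alone.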
Published in: IEEE Transactions on Signal Processing (Volume 61, Issue 19, October 2013)
Page(s): 4658 - 4672
Date of Publication: 10 July 2013

I. Introduction

We consider estimating a K-sparse (or compressible) signal x ∈ R^N from M < N linear measurements y = Ax + w, where A ∈ R^{M×N} is known and w is additive white Gaussian noise (AWGN). For this problem, accurate (relative to the noise variance) signal recovery is known to be possible with polynomial-complexity algorithms when x is sufficiently sparse and A satisfies certain restricted isometry properties [4], or when A is large with i.i.d. zero-mean sub-Gaussian entries [5], as discussed below.
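The measurement model above can be sketched numerically as follows. The dimensions (N = 1000, M = 400, K = 50) and the Gaussian choice for the sensing matrix are illustrative assumptions, not values taken from the paper; a Gaussian matrix is one instance of the i.i.d. zero-mean sub-Gaussian class mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1000, 400, 50  # assumed dimensions for illustration

# K-sparse signal: K non-zero coefficients at random locations.
x = np.zeros(N)
support = rng.choice(N, K, replace=False)
x[support] = rng.normal(size=K)

# Sensing matrix with i.i.d. zero-mean (sub-)Gaussian entries,
# scaled so columns have roughly unit norm.
A = rng.normal(scale=1.0 / np.sqrt(M), size=(M, N))

# Noisy compressive measurements: y = A x + w with AWGN w.
sigma = 0.01
y = A @ x + rng.normal(scale=sigma, size=M)
```

With M < N the system is underdetermined, and it is the K-sparsity of x that makes accurate recovery possible.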

