
Joint Sparse Recovery Using Signal Space Matching Pursuit



Abstract:

In this paper, we put forth a new joint sparse recovery algorithm called signal space matching pursuit (SSMP). The key idea of the proposed SSMP algorithm is to sequentially investigate the support of jointly sparse vectors to minimize the subspace distance to the residual space. Our performance guarantee analysis indicates that SSMP accurately reconstructs any row K-sparse matrix of rank r in the full row rank scenario if the sampling matrix A satisfies krank(A) ≥ K + 1, which meets the fundamental minimum requirement on A to ensure exact recovery. We also show that SSMP guarantees exact reconstruction in at most K − r + ⌈r/L⌉ iterations, provided that A satisfies the restricted isometry property (RIP) of order L(K − r) + r + 1 with \begin{equation*} \delta _{L(K-r)+r+1} < \max \left \{ \frac {\sqrt {r}}{\sqrt {K + \frac {r}{4}} + \sqrt {\frac {r}{4}}}, \frac {\sqrt {L}}{\sqrt {K} + 1.15 \sqrt {L}} \right \},\end{equation*} where L is the number of indices chosen in each iteration. This implies that the requirement on the RIP constant becomes less restrictive as r increases. Such behavior seems natural but has not been reported for most conventional methods. We also show that if r = 1, then by running more than K iterations, the performance guarantee of SSMP can be improved to δ_⌈7.8K⌉ ≤ 0.155. Furthermore, we show that under a suitable RIP condition, the reconstruction error of SSMP is upper bounded by a constant multiple of the noise power, which demonstrates the robustness of SSMP to measurement noise. Finally, from extensive numerical experiments, we show that SSMP outperforms conventional joint sparse recovery algorithms in both noiseless and noisy scenarios.
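For reference, the restricted isometry property invoked above is the standard one: a matrix A satisfies the RIP of order s if there exists a constant δ_s ∈ [0, 1) such that \begin{equation*} (1 - \delta _{s}) \| \mathbf {x} \|_{2}^{2} \le \| \mathbf {A} \mathbf {x} \|_{2}^{2} \le (1 + \delta _{s}) \| \mathbf {x} \|_{2}^{2} \end{equation*} holds for all s-sparse vectors x; the RIP constant δ_s denotes the smallest such constant.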
Published in: IEEE Transactions on Information Theory ( Volume: 66, Issue: 8, August 2020)
Page(s): 5072 - 5096
Date of Publication: 13 April 2020


I. Introduction

In recent years, sparse signal recovery has received considerable attention in image processing, seismology, data compression, source localization, wireless communication, and machine learning, to name just a few areas [2]–[5]. The main goal of sparse signal recovery is to reconstruct a high dimensional K-sparse vector x (‖x‖₀ ≤ K, where ‖x‖₀ denotes the number of nonzero elements in x) from its compressed linear measurements \begin{equation*} \mathbf {y} = \mathbf {A} \mathbf {x},\tag{1}\end{equation*}

where A ∈ ℝ^{m×n} (m < n) is the sampling (sensing) matrix. In various applications, such as wireless channel estimation [5], [6], sub-Nyquist sampling of multiband signals [7], [8], angles of departure and arrival (AoD and AoA) estimation in mmWave communication systems [9], and brain imaging [10], we encounter a situation where multiple measurement vectors (MMV) of a group of jointly sparse vectors are available. By jointly sparse vectors, we mean multiple sparse vectors having a common support (the index set of nonzero entries). In this situation, one can dramatically improve the reconstruction accuracy by recovering all the desired sparse vectors simultaneously [11], [12]. The problem of reconstructing a group of jointly K-sparse vectors x_1, …, x_r is often referred to as the joint sparse recovery problem [17]. Let y_i be the measurement vector of x_i acquired through the sampling matrix A. Then the system model describing the MMV can be expressed as \begin{equation*} \mathbf {Y} = \mathbf {A} \mathbf {X},\tag{2}\end{equation*}
where Y = [y_1 ⋯ y_r] ∈ ℝ^{m×r} and X = [x_1 ⋯ x_r] ∈ ℝ^{n×r}.

In the sequel, we assume that the sparse vectors x_1, …, x_r are linearly independent.
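To make the MMV model (2) concrete, the following NumPy sketch builds a row K-sparse matrix X whose columns share a common support and forms the measurements Y = AX. The greedy loop at the end is a simplified SOMP-style selection (one index per iteration, chosen by residual correlation), included only to illustrate how joint sparsity aids support identification; it is not the SSMP subspace-distance rule described in the abstract, and all dimensions and names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative dimensions: m measurements, ambient dimension n,
# sparsity level K, and r jointly sparse vectors
m, n, K, r = 32, 128, 5, 3

# Gaussian sampling matrix A with unit-norm columns
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)

# row K-sparse X: all r columns share one common support
support = rng.choice(n, size=K, replace=False)
X = np.zeros((n, r))
X[support] = rng.standard_normal((K, r))

# MMV model (2): Y = A X
Y = A @ X

# simplified SOMP-style greedy support identification (NOT SSMP):
# pick the column most correlated with the residual, then project
S = []
R = Y.copy()
for _ in range(K):
    # aggregate correlation of each column with all r residual columns
    j = int(np.argmax(np.linalg.norm(A.T @ R, axis=1)))
    S.append(j)
    # least-squares fit on the chosen columns, update the residual
    X_hat_S, *_ = np.linalg.lstsq(A[:, S], Y, rcond=None)
    R = Y - A[:, S] @ X_hat_S

print("estimated support:", sorted(S))
print("true support:     ", sorted(support.tolist()))
```

Because the residual is re-orthogonalized against the chosen columns at every step, no index is selected twice; sharing a single support across the r measurement vectors is what lets the correlations of the true columns accumulate over all columns of the residual.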

