Epigraphically-Relaxed Linearly-Involved Generalized Moreau-Enhanced Model for Layered Mixed Norm Regularization


Abstract:

This paper proposes an epigraphically-relaxed linearly-involved generalized Moreau-enhanced (ER-LiGME) model for layered mixed norm regularization. Group-sparse and low-rank (GSpLr)-aware modeling using ℓ1/nuclear-norm-based layered mixed norms has succeeded in precise high-dimensional signal recovery, e.g., of images and videos. Our previous work significantly expanded the potential of GSpLr-aware modeling through epigraphical relaxation (ER), which handles even non-proximable deeply-layered mixed norm minimization by decoupling it into a norm and multiple epigraphical constraints, provided each proximity operator is available. One problem with typical SpLr modeling is that the ℓ1 and nuclear-norm regularization causes an underestimation effect. To circumvent this problem, LiGME penalty functions have been proposed; they modify conventional sparsity- and low-rankness-promoting convex functions into nonconvex ones while preserving the overall convexity of the objective. In this work, we integrate the ER technique with the LiGME model to realize deeply-layered (possibly non-proximable) mixed norm regularization and show its effectiveness in denoising and compressed sensing reconstruction.
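As a minimal sketch of the Moreau-enhancement idea behind the LiGME penalties mentioned above (the exact layered construction is given in the paper; the symbols here are illustrative): given a seed convex regularizer $\Psi$ (e.g., the $\ell_1$ norm or the nuclear norm) and a tuning matrix $B$, the enhanced penalty subtracts a generalized Moreau envelope,

```latex
\Psi_B(x) \;:=\; \Psi(x) \;-\; \min_{z}\left\{ \Psi(z) + \tfrac{1}{2}\|B(x - z)\|_2^2 \right\},
```

which is nonconvex in general (reducing the underestimation bias of $\Psi$), yet the overall objective $\tfrac{1}{2}\|y - Ax\|_2^2 + \lambda \Psi_B(x)$ remains convex under a positive-semidefiniteness condition linking $A$, $\lambda$, and $B$ (roughly, $A^\top A - \lambda B^\top B \succeq O$ in the simplest setting); see the LiGME literature for the precise conditions.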
Date of Conference: 08-11 October 2023
Date Added to IEEE Xplore: 11 September 2023
Conference Location: Kuala Lumpur, Malaysia

1. INTRODUCTION

Group-sparsity- and low-rankness (GSpLr)-aware regularization modeled by composite convex functions (e.g., mixed norms) has been a fundamental tool for many tasks in high-dimensional signal processing and machine learning, e.g., signal/data recovery, regression, and classification [1]–[17]. Typical examples are the total variation (TV) [9]–[13] and the structure-tensor TV [14]–[17]. One may attempt to introduce a more deeply-layered mixed norm to model various aspects of the group sparsity or low-rankness of signals; however, such a norm may not admit an efficient computation of its proximity operator (i.e., it is non-proximable). Our previous work [18] introduced epigraphical relaxation (ER), which handles deeply-layered mixed norm minimization problems by decoupling the composite function into a norm and epigraphical constraints and then computing each proximity operator and each projection onto an epigraph [19]; as a practical realization, it presented a non-proximable three-layered mixed norm called the decorrelated structure-tensor TV (DSTV).
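A key building block of ER schemes like the one above is the projection onto the epigraph of a norm. As an illustrative sketch (not the paper's implementation), the following computes the well-known closed-form projection onto the epigraph of the ℓ2 norm, i.e., the second-order cone {(z, t) : ‖z‖₂ ≤ t}; the function name is ours.

```python
import numpy as np

def project_epi_l2(v, s):
    """Project the pair (v, s) onto the epigraph {(z, t): ||z||_2 <= t}.

    Standard closed-form second-order-cone projection, used here as an
    illustrative epigraph-projection step for epigraphical relaxation.
    """
    v = np.asarray(v, dtype=float)
    nv = np.linalg.norm(v)
    if nv <= s:                      # (v, s) already lies in the epigraph
        return v, float(s)
    if nv <= -s:                     # projects onto the cone's apex
        return np.zeros_like(v), 0.0
    # otherwise project onto the cone's boundary ||z||_2 = t
    c = 0.5 * (1.0 + s / nv)
    return c * v, c * nv
```

For example, projecting (v, s) = ([3, 4], 1) gives ([1.8, 2.4], 3.0), which lies exactly on the cone's boundary. In an ER-decoupled problem, one such projection is applied per epigraphical constraint within a proximal splitting iteration.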

