C-ISTA: Iterative Shrinkage-Thresholding Algorithm for Sparse Covariance Matrix Estimation


Abstract:

Covariance matrix estimation is a fundamental task in many fields related to data analysis. As the dimension of the covariance matrix becomes large, it is desirable to obtain a sparse estimator and an efficient algorithm to compute it. In this paper, we consider the covariance matrix estimation problem by minimizing a Gaussian negative log-likelihood loss function with an ℓ1 penalty, which is a constrained non-convex optimization problem. We propose to compute the covariance estimator via a simple iterative shrinkage-thresholding algorithm (C-ISTA) with provable convergence. Numerical simulations with comparison to the benchmark methods demonstrate the computational efficiency and good estimation performance of C-ISTA.
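To make the approach concrete, the following is a minimal sketch of an ISTA-style proximal-gradient iteration for the ℓ1-penalized Gaussian negative log-likelihood described above. It is an illustration only, not the paper's exact C-ISTA update: the function name, step size, diagonal handling, and the positive-definiteness safeguard are all assumptions.

    import numpy as np

    def soft_threshold(A, tau):
        """Elementwise soft-thresholding (proximal operator of the l1 norm)."""
        return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

    def ista_sparse_covariance(S, lam=0.1, step=0.01, n_iter=500, eps=1e-3):
        """Sketch of an ISTA-style iteration for
            minimize  log det(Sigma) + tr(Sigma^{-1} S) + lam * ||off-diag(Sigma)||_1
        over positive-definite Sigma.  The true C-ISTA update, step-size rule,
        and convergence guarantees are given in the paper; this is hypothetical.
        """
        p = S.shape[0]
        Sigma = np.diag(np.diag(S)) + eps * np.eye(p)   # simple positive-definite start
        for _ in range(n_iter):
            Sigma_inv = np.linalg.inv(Sigma)
            # Gradient of the smooth part log det(Sigma) + tr(Sigma^{-1} S)
            grad = Sigma_inv - Sigma_inv @ S @ Sigma_inv
            # Gradient step followed by soft-thresholding of the off-diagonal entries
            Z = Sigma - step * grad
            Sigma_new = soft_threshold(Z, step * lam)
            np.fill_diagonal(Sigma_new, np.diag(Z))      # leave the diagonal unpenalized (an assumption)
            # Crude eigenvalue floor to keep the iterate positive definite
            # (an assumption, not the paper's mechanism)
            w, V = np.linalg.eigh(Sigma_new)
            Sigma = (V * np.maximum(w, eps)) @ V.T
        return Sigma

    # Example usage on synthetic data (hypothetical settings):
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 20))
    S = np.cov(X, rowvar=False)
    Sigma_hat = ista_sparse_covariance(S, lam=0.2)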
Date of Conference: 02-05 July 2023
Date Added to IEEE Xplore: 09 August 2023
Conference Location: Hanoi, Vietnam

I. Introduction

Estimation of a covariance matrix is one of the fundamental problems in many research areas, e.g., biology [1], finance [2]–[4], signal processing [5], [6], machine learning [7], etc. The sample covariance matrix (SCM) is perhaps the most commonly used estimator: it is computationally simple and coincides with the Gaussian maximum likelihood estimator (MLE). In covariance estimation, the number of parameters to be estimated grows as the square of the variable dimension, so when the dimension is high the SCM can perform poorly with a limited number of observations [8].
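As a quick numerical illustration of this degradation (a sketch, not an experiment from the paper), one can compare the SCM eigenvalues with the true ones when the dimension is close to the sample size:

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 100, 80                           # sample size close to the dimension
    X = rng.standard_normal((n, p))          # data whose true covariance is the identity
    S = (X.T @ X) / n                        # sample covariance matrix (SCM)
    eigvals = np.linalg.eigvalsh(S)
    # Although every true eigenvalue is 1, the SCM eigenvalues spread widely,
    # in line with the Marchenko-Pastur law cited as [8].
    print(f"smallest SCM eigenvalue: {eigvals.min():.3f}")
    print(f"largest SCM eigenvalue:  {eigvals.max():.3f}")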

References
1. J. Schäfer and K. Strimmer, "A shrinkage approach to large-scale covariance matrix estimation and implications for functional genomics", Statistical Applications in Genetics and Molecular Biology, vol. 4, no. 1, 2005.
2. Z. Zhao and D. P. Palomar, "Mean-reverting portfolio with budget constraint", IEEE Transactions on Signal Processing, vol. 66, no. 9, pp. 2342-2357, 2018.
3. Z. Zhao, R. Zhou and D. P. Palomar, "Optimal mean-reverting portfolio with leverage constraint for statistical arbitrage in finance", IEEE Transactions on Signal Processing, vol. 67, no. 7, pp. 1681-1695, 2019.
4. Z. Zhang and Z. Zhao, "Vast portfolio selection with submodular norm regularizations", 2021 29th European Signal Processing Conference (EUSIPCO), pp. 2099-2103, 2021.
5. A. Aubry, A. De Maio and L. Pallotta, "A geometric approach to covariance matrix estimation and its applications to radar problems", IEEE Transactions on Signal Processing, vol. 66, no. 4, pp. 907-922, 2017.
6. B. Wang, H. Zhang, Z. Zhao and Y. Sun, "Globally convergent algorithms for learning multivariate generalized Gaussian distributions", 2021 IEEE Statistical Signal Processing Workshop (SSP), pp. 336-340, 2021.
7. Q. Wei and Z. Zhao, "Large covariance matrix estimation with oracle statistical rate", IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023.
8. V. A. Marčenko and L. A. Pastur, "Distribution of eigenvalues for some sets of random matrices", Mathematics of the USSR-Sbornik, vol. 1, no. 4, p. 457, 1967.
9. A. P. Dempster, "Covariance selection", Biometrics, vol. 28, no. 1, pp. 157-175, 1972.
10. J. Bien and R. J. Tibshirani, "Sparse estimation of a covariance matrix", Biometrika, vol. 98, no. 4, pp. 807-820, 2011.
11. S. Chaudhuri, M. Drton and T. S. Richardson, "Estimation of a covariance matrix with zeros", Biometrika, vol. 94, no. 1, pp. 199-216, 2007.
12. P. J. Bickel and E. Levina, "Covariance regularization by thresholding", The Annals of Statistics, vol. 36, no. 6, pp. 2577-2604, 2008.
13. N. El Karoui, "Operator norm consistent estimation of large-dimensional sparse covariance matrices", The Annals of Statistics, vol. 36, no. 6, pp. 2717-2756, 2008.
14. A. J. Rothman, E. Levina and J. Zhu, "Generalized thresholding of large covariance matrices", Journal of the American Statistical Association, vol. 104, no. 485, pp. 177-186, 2009.
15. T. Cai and W. Liu, "Adaptive thresholding for sparse covariance matrix estimation", Journal of the American Statistical Association, vol. 106, no. 494, pp. 672-684, 2011.
16. L. Xue, S. Ma and H. Zou, "Positive-definite ℓ1-penalized estimation of large covariance matrices", Journal of the American Statistical Association, vol. 107, no. 500, pp. 1480-1491, 2012.
17. A. J. Rothman, "Positive definite estimators of large covariance matrices", Biometrika, vol. 99, no. 3, pp. 733-740, 2012.
18. A. Kyrillidis, R. K. Mahabadi, Q. T. Dinh and V. Cevher, "Scalable sparse covariance estimation via self-concordance", Proceedings of the AAAI Conference on Artificial Intelligence, vol. 28, no. 1, 2014.
19. J. Fan, Y. Liao and H. Liu, "An overview of the estimation of large covariance and precision matrices", The Econometrics Journal, vol. 19, no. 1, pp. C1-C32, 2016.
20. J. Z. Huang, N. Liu, M. Pourahmadi and L. Liu, "Covariance matrix selection and estimation via penalised normal likelihood", Biometrika, vol. 93, no. 1, pp. 85-98, 2006.
21. C. Lam and J. Fan, "Sparsistency and rates of convergence in large covariance matrix estimation", Annals of Statistics, vol. 37, no. 6B, p. 4254, 2009.
22. D. N. Phan, H. A. Le Thi and T. P. Dinh, "Sparse covariance matrix estimation by DCA-based algorithms", Neural Computation, vol. 29, no. 11, pp. 3040-3077, 2017.
23. J. Xu and K. Lange, "A proximal distance algorithm for likelihood-based sparse covariance estimation", Biometrika, vol. 109, no. 4, pp. 1047-1066, 2022.
24. J. Goes, G. Lerman and B. Nadler, "Robust sparse covariance estimation by thresholding Tyler's M-estimator", The Annals of Statistics, vol. 48, no. 1, pp. 86-110, 2020.
25. H. Liu, L. Wang and T. Zhao, "Sparse covariance matrix estimation with eigenvalue constraints", Journal of Computational and Graphical Statistics, vol. 23, no. 2, pp. 439-459, 2014.
26. Y. Cui, C. Leng and D. Sun, "Sparse estimation of high-dimensional correlation matrices", Computational Statistics & Data Analysis, vol. 93, pp. 390-403, 2016.
27. R. Tibshirani, "Regression shrinkage and selection via the lasso", Journal of the Royal Statistical Society: Series B (Methodological), vol. 58, no. 1, pp. 267-288, 1996.
28. A. Beck, First-Order Methods in Optimization, SIAM, vol. 25, 2017.
29. A. Beck and M. Teboulle, "A fast iterative shrinkage-thresholding algorithm for linear inverse problems", SIAM Journal on Imaging Sciences, vol. 2, no. 1, pp. 183-202, 2009.
30. A. J. Rothman, P. J. Bickel, E. Levina and J. Zhu, "Sparse permutation invariant covariance estimation", Electronic Journal of Statistics, vol. 2, pp. 494-515, 2008.