Randomized Two-Sided Subspace Iteration for Low-rank Matrix and Tensor Decomposition

Digital Signal Processing (2024)

Abstract
The low-rank approximation of big data matrices and tensors plays a pivotal role in many modern applications. Although a truncated singular value decomposition (SVD) furnishes the best approximation, its computation is challenging on modern multicore architectures. Recently, randomized subspace iteration has proven to be a powerful tool for approximating large-scale matrices. In this paper, we present a two-sided variant of randomized subspace iteration. Novel in our work is the use of the unpivoted QR factorization, rather than the SVD, to factorize the compressed matrix; our algorithm is therefore a randomized rank-revealing URV decomposition. We prove that the algorithm is rank-revealing by establishing bounds on the singular values as well as on the other blocks of the compressed matrix. We further bound the error of the resulting low-rank approximations in both the 2-norm and the Frobenius norm. In addition, we employ the proposed algorithm to efficiently compute low-rank tensor decompositions: we present two randomized algorithms, one for the truncated higher-order SVD (HOSVD) and the other for the tensor SVD. Numerical tests on (i) various classes of matrices and (ii) synthetic tensors and real datasets demonstrate the efficacy of the proposed algorithms.
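To make the described pipeline concrete, here is a minimal NumPy sketch of a two-sided randomized subspace iteration that factors the compressed matrix with an unpivoted QR, as the abstract outlines. The function name, the oversampling parameter p, and the power-iteration count q are illustrative assumptions, not the paper's notation or its exact algorithm.

```python
import numpy as np

def randomized_two_sided_urv(A, k, q=1, p=10, rng=None):
    """Illustrative sketch (not the paper's exact method) of a
    randomized two-sided subspace iteration.

    Returns U, R, V with A ~= U @ R @ V.T, where U and V have
    orthonormal columns and R is upper triangular from an unpivoted
    QR of the compressed matrix, i.e. a URV-type decomposition.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    l = min(k + p, min(m, n))        # oversampled sketch size

    # Right Gaussian sketch, then q subspace (power) iterations to
    # sharpen the decay of the singular spectrum. (In floating point,
    # re-orthonormalizing Y between passes improves stability.)
    Y = A @ rng.standard_normal((n, l))
    for _ in range(q):
        Y = A @ (A.T @ Y)
    Q1, _ = np.linalg.qr(Y)          # orthonormal basis for range(A)

    # Second side: a coupled orthonormal basis for the row space.
    Q2, _ = np.linalg.qr(A.T @ Q1)   # orthonormal basis for range(A.T)

    # Compress A on both sides; factor the small l-by-l core with an
    # UNPIVOTED QR -- this is what yields the URV form.
    B = Q1.T @ (A @ Q2)
    Qb, R = np.linalg.qr(B)

    return Q1 @ Qb, R, Q2            # A ~= (Q1 @ Qb) @ R @ Q2.T
```

A rank-k approximation then follows from the leading blocks, `A_k = U[:, :k] @ R[:k, :] @ V.T`; the rank-revealing property claimed in the abstract is what justifies discarding the trailing block of R.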
Key words
Low-rank approximation, Pivoted QLP, randomized algorithm, tensor decomposition, Tucker format, tensor SVD
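The abstract also applies the matrix routine to Tucker-format tensor compression. The following sketch shows one plausible way a randomized truncated HOSVD could reuse the URV routine above, taking each mode's factor matrix from the corresponding unfolding; unfold, mode_product, and randomized_truncated_hosvd are hypothetical helper names, not the paper's implementation.

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` unfolding: that axis becomes the rows."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """n-mode product T x_mode M, where M has shape (J, I_mode)."""
    out = np.tensordot(M, np.moveaxis(T, mode, 0), axes=(1, 0))
    return np.moveaxis(out, 0, mode)

def randomized_truncated_hosvd(T, ranks, q=1, p=10, rng=None):
    """Truncated HOSVD in Tucker format, with each mode's factor
    taken from the randomized URV sketch above (illustrative only)."""
    factors = []
    for mode, r in enumerate(ranks):
        U, _, _ = randomized_two_sided_urv(unfold(T, mode), r,
                                           q=q, p=p, rng=rng)
        factors.append(U[:, :r])            # leading r vectors per mode
    core = T
    for mode, U in enumerate(factors):
        core = mode_product(core, U.T, mode)  # project onto each factor
    return core, factors
```

Reconstruction follows by applying mode_product(core, U, mode) per mode with the untransposed factors, i.e. T ~= core x_0 U0 x_1 U1 ....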