Mixed norm regularized models for low-rank tensor completion

Information Sciences (2024)

Abstract
Recent advances in low-rank representation have achieved promising performance for tensor completion in the area of information sciences. However, current low-rank tensor completion (LRTC) models capture only global low-rankness and overlook the local subspace low-rank structures of the underlying tensor objects. As a result, they may fall short in low-sampling-rate cases. To this end, we develop a novel tensor completion scheme that integrates global low-rankness and local subspace low-rankness priors into a unified framework. More specifically, through theoretical analysis we propose two mixed-norm tensor penalties that describe local subspace low-rank structures for tensor completion. We further show that the mixed norm on the factor subspaces ensures the non-convex global low-rankness of the tensor objects. We design a block coordinate descent algorithm with a proximal technique to solve the models, which is guaranteed to converge to coordinate-wise minimizers. Notably, our methods are more tractable than existing tensor rank minimization methods and have lower computational complexity. Finally, extensive experiments on three types of tensor datasets validate the superiority of the proposed methods, especially at extremely low sampling rates.
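To illustrate the general scheme the abstract describes, the following is a minimal sketch of proximal block coordinate descent for tensor completion via parallel matrix factorization, with an L2,1-style mixed-norm penalty applied to the factor matrices. It is not the authors' implementation: the specific penalties, mode weights, update order, and parameter names (lam, alpha, prox_l21, ranks) are illustrative assumptions.

```python
# Hedged sketch: proximal block coordinate descent for tensor completion via
# parallel matrix factorization (one factor pair per mode-n unfolding) with a
# row-wise L2,1 mixed-norm penalty on the factors. All hyperparameters and
# helper names are assumptions, not the paper's exact formulation.
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def fold(M, mode, shape):
    """Inverse of unfold: rebuild the tensor from its mode-n unfolding."""
    rest = [s for i, s in enumerate(shape) if i != mode]
    return np.moveaxis(M.reshape([shape[mode]] + rest), 0, mode)

def prox_l21(M, tau):
    """Row-wise group soft-thresholding: prox of tau * sum_i ||M[i, :]||_2."""
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return scale * M

def mixed_norm_completion(T_obs, mask, ranks, lam=0.1, alpha=None, n_iters=200):
    """Fill missing entries (mask == False) of T_obs by alternating proximal
    updates on factor pairs (X_n, Y_n) for each mode-n unfolding."""
    shape, N = T_obs.shape, T_obs.ndim
    alpha = alpha if alpha is not None else np.ones(N) / N
    rng = np.random.default_rng(0)
    X = [rng.standard_normal((shape[n], ranks[n])) for n in range(N)]
    Y = [rng.standard_normal((ranks[n], int(np.prod(shape)) // shape[n]))
         for n in range(N)]
    T = np.where(mask, T_obs, 0.0)

    for _ in range(n_iters):
        for n in range(N):
            Tn = unfold(T, n)
            # Least-squares block update for X_n, followed by the proximal
            # step enforcing the mixed-norm penalty on the factor subspace.
            X[n] = Tn @ Y[n].T @ np.linalg.pinv(Y[n] @ Y[n].T)
            X[n] = prox_l21(X[n], lam)
            Y[n] = np.linalg.pinv(X[n].T @ X[n]) @ X[n].T @ Tn
        # Re-estimate the tensor as a weighted combination of the mode-wise
        # fits, then re-impose the observed entries.
        T = sum(alpha[n] * fold(X[n] @ Y[n], n, shape) for n in range(N))
        T = np.where(mask, T_obs, T)
    return T
```

As a usage illustration, one could complete a 30x30x30 tensor observed at a 10% sampling rate by calling mixed_norm_completion(T_obs, mask, ranks=(5, 5, 5)); the choice of per-mode ranks and the penalty weight lam would need tuning in practice.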
Keywords
Tensor completion, Low-rankness, Mixed norm, Parallel matrix factorization