Column Subset Selection and Nyström Approximation via Continuous Optimization.

Winter Simulation Conference (2023)

Abstract
We propose a continuous optimization algorithm for the Column Subset Selection Problem (CSSP) and Nyström approximation. The CSSP and Nyström method construct low-rank approximations of matrices based on a predetermined subset of columns. It is well known that choosing the best column subset of size k is a difficult combinatorial problem. In this work, we show how one can approximate the optimal solution by defining a penalized continuous loss function that is minimized via stochastic gradient descent. We show that the gradients of this loss function can be estimated efficiently using matrix-vector products with a data matrix X in the case of the CSSP or a kernel matrix K in the case of the Nyström approximation. We provide numerical results for a number of real datasets showing that this continuous optimization is competitive against existing methods.
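The abstract builds on the standard Nyström construction: given a subset S of k columns of a kernel matrix K, one forms the low-rank approximation K ≈ C W⁺ Cᵀ, where C = K[:, S] and W = K[S, S]. A minimal NumPy sketch of that building block (not the paper's penalized continuous optimization; the function name is my own):

```python
import numpy as np

def nystrom_approximation(K, subset):
    """Nystrom low-rank approximation of a PSD kernel matrix K
    from a column subset S: K ~= C W^+ C^T, where C = K[:, S]
    holds the selected columns and W = K[S, S] is the k x k
    intersection block (^+ denotes the Moore-Penrose pseudoinverse)."""
    C = K[:, subset]                 # n x k: the chosen columns
    W = K[np.ix_(subset, subset)]    # k x k: rows and columns in S
    return C @ np.linalg.pinv(W) @ C.T

# Usage: on an exactly rank-k kernel matrix, any k columns spanning
# the column space recover K exactly.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
K = X @ X.T                          # rank-3 PSD kernel matrix
K_hat = nystrom_approximation(K, [0, 1, 2])
print(np.linalg.norm(K - K_hat))     # near machine precision
```

The CSSP variant is analogous, replacing K by a data matrix X and measuring the Frobenius-norm error of projecting X onto the span of the selected columns; choosing which k columns to select is the combinatorial problem the paper relaxes.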
Keywords
Subset Of Columns, Column Subset Selection, Gradient Descent, Stochastic Gradient Descent, Combinatorial Problem, Null Space, Continuous Loss, Subset Size, Low-rank Approximation, Matrix-vector Product, Unbiased, Sampling Method, Diagonal Matrix, Column Vector, Linear System, Kernel Function, Binary Vector, Singular Value Decomposition, Matrix Multiplication, Conjugate Gradient, Corner Points, Optimal Course, Greedy Selection, Frobenius Norm Of A Matrix, Element-wise Multiplication, Projection Matrix, Interior Point, Exact Problem, Protein Dataset, Extension Of Problem