Parametric kernel low-rank approximations using tensor train decomposition

Abraham Khan, Arvind K. Saibaba

CoRR (2024)

Abstract
Computing low-rank approximations of kernel matrices is an important problem with many applications in scientific computing and data science. We propose methods to efficiently approximate and store low-rank approximations to kernel matrices that depend on certain hyperparameters. The main idea behind our method is to use multivariate Chebyshev function approximation along with the tensor train decomposition of the coefficient tensor. The computations proceed in two stages: an offline stage, which dominates the computational cost and is parameter-independent, and an online stage, which is inexpensive and instantiated for specific hyperparameters. A variation of this method addresses the case in which the kernel matrix is symmetric and positive semi-definite. The resulting algorithms have linear complexity in the sizes of the kernel matrices. We investigate the efficiency and accuracy of our method on parametric kernel matrices induced by various kernels, such as the Matérn kernel, through numerical experiments. Our methods achieve speedups of up to 200× in online time compared to other methods of similar complexity and comparable accuracy.
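To make the offline/online split concrete, the following is a minimal sketch, not the authors' implementation: it uses plain Chebyshev interpolation in a single hyperparameter (the length scale of an exponential kernel) and omits both the low-rank factorization and the tensor-train compression of the coefficient tensor that the paper employs for multiple hyperparameters. All names (kernel, evaluate, theta) and the interval [0.5, 2] are illustrative assumptions.

```python
# Sketch of a parametric kernel approximation with an offline/online split.
# Offline: evaluate the kernel matrix at Chebyshev nodes in the hyperparameter
# and form parameter-independent coefficient matrices. Online: sum the
# Chebyshev series at a specific hyperparameter value (cheap).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))          # source points (assumed data)
Y = rng.uniform(0, 1, size=(150, 2))          # target points (assumed data)
D = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)  # pairwise distances

def kernel(theta):
    """Exponential (Matern-1/2) kernel with length scale theta."""
    return np.exp(-D / theta)

# --- Offline stage (parameter independent, dominates the cost) ---
n_cheb = 16
a, b = 0.5, 2.0                                # hyperparameter interval (assumed)
k = np.arange(n_cheb)
t = np.cos((2 * k + 1) * np.pi / (2 * n_cheb))  # Chebyshev points on [-1, 1]
nodes = 0.5 * (a + b) + 0.5 * (b - a) * t       # mapped to [a, b]
K_nodes = np.stack([kernel(th) for th in nodes])  # coefficient tensor, (n_cheb, m, n)
# Chebyshev coefficients via discrete orthogonality: c_j = (2/N) sum_k f(t_k) T_j(t_k)
T = np.cos(np.outer(k, (2 * k + 1) * np.pi / (2 * n_cheb)))  # T_j(t_k)
C = (2.0 / n_cheb) * np.einsum('jk,kmn->jmn', T, K_nodes)
C[0] /= 2.0

# --- Online stage (inexpensive, instantiated per hyperparameter) ---
def evaluate(theta):
    s = (2 * theta - (a + b)) / (b - a)        # map theta back to [-1, 1]
    Tj = np.cos(k * np.arccos(s))              # T_j(s) for all series terms
    return np.einsum('j,jmn->mn', Tj, C)       # sum the Chebyshev series

theta = 1.3
err = np.linalg.norm(evaluate(theta) - kernel(theta)) / np.linalg.norm(kernel(theta))
print(f"relative error at theta={theta}: {err:.2e}")
```

In the paper's setting with several hyperparameters, the stacked tensor of coefficient matrices grows exponentially with the number of parameters, which is where the tensor train decomposition (and a low-rank factorization of the spatial modes) keeps storage and online cost linear in the matrix sizes.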