
Consistent online Gaussian process regression without the sample complexity bottleneck

2019 American Control Conference (ACC) (2021)

Cited by 13 | Views 30
Abstract
Gaussian processes provide a framework for nonlinear nonparametric Bayesian inference widely applicable across science and engineering. Unfortunately, their computational burden scales cubically with the training sample size, which in the case that samples arrive in perpetuity, approaches infinity. This issue necessitates approximations for use with streaming data, which to date mostly lack convergence guarantees. Thus, we develop the first online Gaussian process approximation that preserves convergence to the population posterior, i.e., asymptotic posterior consistency, while ameliorating its intractable complexity growth with the sample size. We propose an online compression scheme that, following each a posteriori update, fixes an error neighborhood with respect to the Hellinger metric centered at the current posterior, and greedily tosses out past kernel dictionary elements until its boundary is hit. We call the resulting method Parsimonious Online Gaussian Processes (POG). For diminishing error radius, asymptotic statistical stationarity is achieved (Theorem 1(i)) at the cost of unbounded memory in the limit. On the other hand, for constant error radius, POG converges to a neighborhood of stationarity (Theorem 1(ii)) but with finite memory at worst determined by the metric entropy of the feature space (Theorem 2). Here stationarity refers to the distributional distance between sequential marginal posteriors approaching null with the time index. Experimental results are presented on several nonlinear regression problems which illuminate the merits of this approach as compared with alternatives that fix the subspace dimension defining the history of past points.
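To make the compression rule described above concrete, the following is a minimal sketch of a POG-style update step, assuming an RBF kernel, a fixed reference set X_ref on which posterior perturbation is measured, and a per-step radius eps. The function names, the greedy single-deletion search, and the use of the worst-case marginal Hellinger distance as a surrogate for the full Hellinger metric are illustrative assumptions, not the authors' exact implementation.

```python
# Hypothetical sketch of the POG compression loop (assumed names/choices,
# not the authors' reference code).
import numpy as np

def rbf_kernel(X, Z, lengthscale=1.0):
    """Squared-exponential kernel matrix between row-stacked inputs X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X, y, X_star, noise=0.1):
    """Standard GP posterior mean and marginal variance at X_star given (X, y)."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    Ks = rbf_kernel(X_star, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T)) + noise**2
    return mu, np.maximum(var, 1e-12)

def hellinger_marginal(mu1, var1, mu2, var2):
    """Worst-case Hellinger distance between Gaussian marginals, pointwise."""
    h2 = 1.0 - np.sqrt(2.0 * np.sqrt(var1 * var2) / (var1 + var2)) * \
         np.exp(-0.25 * (mu1 - mu2)**2 / (var1 + var2))
    return np.sqrt(np.max(np.maximum(h2, 0.0)))

def pog_step(X_dict, y_dict, x_new, y_new, X_ref, eps):
    """One POG-style step: absorb the new sample, then greedily delete
    dictionary points until the Hellinger error neighborhood boundary is hit."""
    X = np.vstack([X_dict, x_new[None, :]])
    y = np.append(y_dict, y_new)
    mu_full, var_full = gp_posterior(X, y, X_ref)  # center of the neighborhood
    while len(X) > 1:
        # find the single deletion that perturbs the current posterior least
        best_i, best_d = None, np.inf
        for i in range(len(X)):
            mu_i, var_i = gp_posterior(np.delete(X, i, 0), np.delete(y, i), X_ref)
            d = hellinger_marginal(mu_full, var_full, mu_i, var_i)
            if d < best_d:
                best_i, best_d = i, d
        if best_d > eps:  # boundary of the error neighborhood reached
            break
        X, y = np.delete(X, best_i, 0), np.delete(y, best_i)
    return X, y
```

Under this sketch, a constant eps caps the dictionary size (mirroring Theorem 2's metric-entropy bound), while letting eps shrink toward zero recovers the uncompressed posterior at the cost of growing memory.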
Keywords
Bayesian inference, Gaussian process, Nonparametric statistics, Online learning