Efficient Algorithms for Outlier-Robust Regression
CoRR (2018)
Abstract
We give the first polynomial-time algorithm for performing linear or
polynomial regression resilient to adversarial corruptions in both examples and
labels.
Given a sufficiently large (polynomial-size) training set drawn i.i.d. from
distribution D and subsequently corrupted on some fraction of points, our
algorithm outputs a linear function whose squared error is close to the squared
error of the best-fitting linear function with respect to D, assuming that the
marginal distribution of D over the input space is certifiably
hypercontractive. This natural property is satisfied by many well-studied
distributions, such as Gaussian distributions, strongly log-concave
distributions, and the uniform distribution on the hypercube, among others. We
also give a simple statistical lower bound showing that some distributional
assumption is necessary to succeed in this setting.
These results are the first of their kind and were not known to be even
information-theoretically possible prior to our work.
Our approach is based on the sum-of-squares (SoS) method and is inspired by
the recent applications of the method for parameter recovery problems in
unsupervised learning. Our algorithm can be seen as a natural convex relaxation
of the following conceptually simple non-convex optimization problem: find a
linear function and a large subset of the input corrupted sample such that the
least squares loss of the function over the subset is minimized over all
possible large subsets.
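To make the underlying non-convex objective concrete, here is a minimal illustrative sketch in Python. It uses a simple alternating-minimization heuristic (alternately fitting least squares on a candidate subset and re-selecting the points with smallest residuals); this heuristic, the function name `robust_fit`, and all parameter choices are assumptions for illustration only — the paper's actual algorithm instead solves a sum-of-squares convex relaxation of this objective, not this heuristic.

```python
import numpy as np

def robust_fit(X, y, inlier_frac=0.8, iters=20, seed=0):
    """Heuristic for the non-convex objective described above: find a
    linear function w and a large subset S (|S| = inlier_frac * n) such
    that the least-squares loss of w over S is small.
    NOTE: illustrative alternating minimization only, not the paper's
    sum-of-squares relaxation."""
    n = len(y)
    k = int(inlier_frac * n)
    rng = np.random.default_rng(seed)
    S = rng.choice(n, size=k, replace=False)  # start from a random subset
    for _ in range(iters):
        # fit ordinary least squares on the current candidate subset
        w, *_ = np.linalg.lstsq(X[S], y[S], rcond=None)
        # re-select the k points best explained by the current fit
        resid = (X @ w - y) ** 2
        S = np.argsort(resid)[:k]
    return w, S

# demo: 200 clean points from y = 2x + small noise, 20 corrupted labels
rng = np.random.default_rng(1)
X = rng.normal(size=(220, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=220)
y[200:] = 50.0  # adversarially corrupted labels
w, S = robust_fit(X, y)
print(w)  # recovered slope should be close to 2.0
```

The alternating structure mirrors the two coupled unknowns in the objective (the linear function and the subset); each step decreases the subset loss, but the joint problem remains non-convex, which is precisely why the paper turns to a convex SoS relaxation.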