Making \emph{ordinary least squares} linear classifiers more robust.

arXiv: Data Analysis, Statistics and Probability (2018)

Abstract
In statistics and machine learning, the sum of squared errors, commonly referred to as \emph{ordinary least squares}, is a convenient choice of cost function because of its many nice analytical properties, though it is not always the best choice. In particular, it has long been known that \emph{ordinary least squares} is not robust to outliers. Previous attempts to resolve this problem led to alternative methods that either did not fully resolve the \emph{outlier problem} or were computationally demanding. In this paper, we provide a very simple modification that makes \emph{ordinary least squares} less sensitive to outliers in data classification: \emph{scaling the augmented input vector by its length}. We give a mathematical exposition of the \emph{outlier problem} using approximations and geometrical arguments, and we present numerical results supporting the efficacy of our method.
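The abstract's core idea can be sketched in a few lines of NumPy. This is a minimal illustration of the stated technique, not the authors' code: each input is augmented with a constant bias component and the augmented vector is scaled to unit length before an ordinary-least-squares fit, which bounds how far any single (outlier) point can sit from the origin in feature space. The function names and the `{-1, +1}` label convention are illustrative assumptions.

```python
import numpy as np

def augment_and_normalize(X):
    """Append a constant 1 to each row, then scale each augmented row
    by its Euclidean length (the scaling described in the abstract)."""
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    return Xa / np.linalg.norm(Xa, axis=1, keepdims=True)

def fit_ols_classifier(X, y):
    """Ordinary-least-squares weights for labels y in {-1, +1},
    fitted on the normalized augmented inputs."""
    Xn = augment_and_normalize(X)
    w, *_ = np.linalg.lstsq(Xn, y, rcond=None)
    return w

def predict(w, X):
    """Classify by the sign of the linear score."""
    return np.sign(augment_and_normalize(X) @ w)

# Usage: a simple 1-D separable problem.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
w = fit_ols_classifier(X, y)
print(predict(w, X))  # agrees with y on this separable data
```

Because every training point lies on the unit sphere after scaling, an extreme input can no longer dominate the squared-error objective through its sheer magnitude, which is the mechanism the paper exploits.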