Abstract
The support vector machine (SVM) is sparse in the sense that its classifier is expressed as a linear combination of only a few support vectors (SVs). Whenever an outlier is included as an SV, it may have a serious impact on the estimated decision function. In this article, we propose a robust loss function that is convex. The resulting learning algorithm is more robust to outliers than the SVM, and the convexity of our loss function permits an efficient solution path algorithm. Through simulated and real data analyses, we illustrate that our method can be useful in the presence of labeling errors.
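The abstract does not give the explicit form of the proposed loss, so the sketch below is only a stand-in illustration of the general idea: fitting a linear margin-based classifier under a convex loss (here a Huberized hinge, chosen purely for concreteness) on data with a few flipped labels. None of the function names or parameter choices come from the article.

```python
import numpy as np

def huberized_hinge(m, delta=0.5):
    """Huberized hinge loss of the margin m = y * f(x).

    A stand-in convex margin-based loss; NOT the loss proposed
    in the article, which the abstract does not specify.
    """
    return np.where(
        m >= 1, 0.0,
        np.where(m >= 1 - delta,
                 (1 - m) ** 2 / (2 * delta),
                 (1 - m) - delta / 2),
    )

def huberized_hinge_grad(m, delta=0.5):
    """Derivative of the Huberized hinge with respect to the margin."""
    return np.where(m >= 1, 0.0,
                    np.where(m >= 1 - delta, -(1 - m) / delta, -1.0))

def fit_linear_classifier(X, y, lam=0.1, delta=0.5, lr=0.1, n_iter=500):
    """Gradient descent on (1/n) sum loss(y_i (w'x_i + b)) + (lam/2)||w||^2."""
    n, p = X.shape
    w, b = np.zeros(p), 0.0
    for _ in range(n_iter):
        m = y * (X @ w + b)                     # margins
        g = huberized_hinge_grad(m, delta)      # d loss / d margin
        grad_w = (X.T @ (g * y)) / n + lam * w  # chain rule + ridge penalty
        grad_b = np.mean(g * y)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy usage: two Gaussian classes with a few flipped labels (labeling errors).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50, dtype=float)
y[:5] *= -1                                     # simulate mislabeled points
w, b = fit_linear_classifier(X, y)
print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```

The gradient-descent solver is only for illustration; the article's contribution is the convex robust loss itself and an efficient solution path algorithm over its tuning parameter.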
| Original language | English |
| --- | --- |
| Pages (from-to) | 6061-6073 |
| Number of pages | 13 |
| Journal | Communications in Statistics Part B: Simulation and Computation |
| Volume | 46 |
| Issue number | 8 |
| DOIs | |
| State | Published - 14 Sep 2017 |
Keywords
- Classification
- Convexity
- Loss function
- Outlier
- Solution path algorithm