
Journal of Zhejiang University SCIENCE C

ISSN 1869-1951 (Print), 1869-196X (Online), Monthly

Integrating outlier filtering in large margin training

Abstract: Large margin classifiers such as support vector machines (SVMs) have been applied successfully to a variety of classification tasks. However, their performance can degrade significantly in the presence of outliers. In this paper, we propose a robust SVM formulation that is less sensitive to outliers. The key idea is to employ an adaptively weighted hinge loss that explicitly incorporates outlier filtering into SVM training, so that outlier filtering and classification are performed simultaneously. The resulting robust SVM formulation is non-convex. We first relax it into a semi-definite program, which admits a global solution; to improve efficiency, we then develop an iterative approach. We have performed experiments on both synthetic and real-world data. The results show that the performance of the standard SVM degrades rapidly as more outliers are included, whereas the proposed robust SVM training remains stable in the presence of outliers.

Key words: Support vector machines, Outlier filter, Semi-definite programming, Multi-stage relaxation
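The core idea in the abstract — a weighted hinge loss in which suspected outliers receive zero weight, with filtering and training performed together — can be sketched with a simple alternating scheme. This is an illustrative sketch only, not the paper's SDP relaxation or its exact iterative algorithm: the function `robust_svm`, its parameters, and the subgradient inner solver are all assumptions made for the example.

```python
import numpy as np

def robust_svm(X, y, C=1.0, n_outliers=1, n_rounds=5, lr=0.01, epochs=300):
    """Alternate between (1) fitting a linear SVM under a per-sample
    weighted hinge loss and (2) zeroing the weights of the points with
    the largest hinge loss, i.e., treating them as outliers."""
    n, d = X.shape
    eta = np.ones(n)          # per-sample weights: 1 = inlier, 0 = filtered out
    w, b = np.zeros(d), 0.0
    for _ in range(n_rounds):
        # Step 1: minimize 0.5*||w||^2 + C * sum_i eta_i * hinge_i
        # by subgradient descent over the currently weighted samples.
        w, b = np.zeros(d), 0.0
        for _ in range(epochs):
            margins = y * (X @ w + b)
            active = (margins < 1.0) & (eta > 0.0)
            grad_w = w - C * (eta[active] * y[active]) @ X[active]
            grad_b = -C * np.sum(eta[active] * y[active])
            w -= lr * grad_w
            b -= lr * grad_b
        # Step 2: filter -- flag the n_outliers worst-fitting points.
        losses = np.maximum(0.0, 1.0 - y * (X @ w + b))
        eta = np.ones(n)
        eta[np.argsort(losses)[-n_outliers:]] = 0.0
    return w, b, eta

# Two well-separated clusters plus one mislabeled point acting as the outlier.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.3, (20, 2)), rng.normal(2.0, 0.3, (20, 2))])
y = np.r_[-np.ones(20), np.ones(20)]
X = np.vstack([X, [[-2.0, -2.0]]])   # sits deep inside the negative cluster...
y = np.r_[y, 1.0]                    # ...but carries a positive label
w, b, eta = robust_svm(X, y, n_outliers=1)
# eta[-1] ends up 0: the mislabeled point is filtered out of training.
```

The alternating structure mirrors the abstract's "filtering and classification simultaneously" idea in its simplest form; the paper's contribution is a principled relaxation of the joint non-convex problem rather than this greedy heuristic.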



DOI: 10.1631/jzus.C1000361
CLC number: TP301


Downloaded: 3447
Clicked: 8048
Cited: 1
On-line Access: 2024-08-27
Received: 2023-10-17
Revision Accepted: 2024-05-08
Crosschecked: 2011-04-07

Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952276; Fax: +86-571-87952331; E-mail: jzus@zju.edu.cn
Copyright © 2000~ Journal of Zhejiang University-SCIENCE