Optimizing the Classification Cost using SVMs with a Double Hinge Loss

Bibliographic Details
Published in: Informatica (Ljubljana) 2014-06, Vol.38 (2), p.125-125
Main Authors: Ahmed Amirou, Djaffar Ould-Abdeslam, Zahia Zidelmal, Mohamed Aidene, Jean Merckle
Format: Article
Language: English
Description
Summary: The objective of this study is to minimize the classification cost using a Support Vector Machine (SVM) classifier with a double hinge loss. Such binary classifiers have the option to reject observations when the cost of rejection is lower than the cost of misclassification. To train this classifier, the standard SVM optimization problem was modified by minimizing a double hinge loss, used as a surrogate convex loss function. The impact of such a classifier is illustrated through results obtained on artificial data and medical data.
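For intuition, the sketch below shows one common convex double hinge loss with rejection cost d (in the style of Bartlett and Wegkamp's generalized hinge), together with a simple reject rule on the classifier score. The exact loss form, the threshold tau, and the function names are illustrative assumptions and not necessarily the formulation used in the article.

```python
import numpy as np

def double_hinge_loss(z, d=0.2):
    """One common double hinge surrogate for classification with a reject
    option (generalized hinge in the Bartlett & Wegkamp style); not
    necessarily the exact loss used in the paper.  z = y * f(x) is the
    signed margin and d is the rejection cost (0 < d < 1/2).  The loss is
    convex and piecewise linear, with hinges at z = 0 and z = 1."""
    a = (1.0 - d) / d              # slope of the steeper hinge, a >= 1
    z = np.asarray(z, dtype=float)
    return np.maximum.reduce([np.zeros_like(z), 1.0 - z, 1.0 - a * z])

def predict_with_reject(f, tau=0.5):
    """Illustrative decision rule: return +1/-1 from the sign of the score
    f(x), or 0 (reject) when the score falls inside the band |f| <= tau."""
    f = np.asarray(f, dtype=float)
    labels = np.sign(f)
    labels[np.abs(f) <= tau] = 0   # 0 encodes "reject"
    return labels

# Tiny usage example on made-up scores
scores = np.array([-1.7, -0.3, 0.1, 0.9, 2.4])
print(double_hinge_loss(scores, d=0.2))
print(predict_with_reject(scores, tau=0.5))
```

Under this kind of loss, scores near zero are the ones a trained classifier would rather reject, since there the expected misclassification cost exceeds the fixed rejection cost d.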
ISSN: 0350-5596
1854-3871
DOI: 10.1142/S0218001412500012