A Multi-Objective Programming Approach to Compromising Classification Performance Metrics

Bibliographic Details
Main Authors: Yaman, S., Chin-Hui Lee
Format: Conference Proceeding
Language: English
Description
Summary: In this paper, we propose a multi-objective programming (MOP) approach for finding the best compromise solution among more than two competing performance criteria. Our formulation for classifier learning, which we refer to as iterative constrained optimization (ICO), involves an iterative process of optimizing individual objectives with proper constraints on the remaining competing objectives. The fundamental idea is to improve one objective while the rest are allowed to degrade. One of the main components of ICO is a supervision mechanism based on local changes in a selected utility function, used to control the changes in the individual objectives. The utility is an aggregated preference chosen to make a joint decision when evaluating the appropriateness of local changes in the competing criteria, i.e., changes from one iteration to the next. Another important component is the adjustment of the constraint bounds based on the objective-function values attained in the previous iteration, measured on a development set. Many MOP approaches developed so far are formal and extensible to a large number of competing objectives; however, their utility is typically illustrated with only a few objectives. We illustrate the utility of the proposed framework in the context of automatic language identification of 12 languages and 3 dialects, i.e., with a total of 30 objectives. In our experiments, we observed that ICO-trained classifiers yield not only reduced error rates but also a better balance among the many competing objectives.
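The abstract's ICO loop can be sketched in code: in each round, every objective is minimized in turn subject to constraints on the others, with the constraint bounds taken from the objective values at the previous iterate, and a move is accepted only if an aggregated utility improves. This is a minimal toy sketch, not the paper's implementation: the simple-sum utility, the `slack` parameter, and the quadratic toy objectives are all illustrative assumptions.

```python
# Hedged sketch of an ICO-style loop (toy objectives; the sum-utility,
# slack parameter, and function names are assumptions, not from the paper).
import numpy as np
from scipy.optimize import minimize


def ico(objectives, x0, n_iters=5, slack=0.0):
    """Iteratively optimize each objective with constraints on the rest."""
    x = np.asarray(x0, dtype=float)
    utility = lambda z: sum(f(z) for f in objectives)  # assumed aggregated preference
    best_u = utility(x)
    for _ in range(n_iters):
        for i, fi in enumerate(objectives):
            # Constraint bounds come from objective values at the current iterate.
            bounds = [f(x) + slack for f in objectives]
            cons = [{"type": "ineq",
                     "fun": (lambda z, j=j, b=bounds: b[j] - objectives[j](z))}
                    for j in range(len(objectives)) if j != i]
            res = minimize(fi, x, constraints=cons)
            # Supervision: accept the move only if the utility improves.
            if res.success and utility(res.x) < best_u:
                x, best_u = res.x, utility(res.x)
    return x


# Toy example: two competing quadratics whose unconstrained minima disagree.
f1 = lambda z: (z[0] - 1.0) ** 2 + z[1] ** 2
f2 = lambda z: z[0] ** 2 + (z[1] - 1.0) ** 2
x_star = ico([f1, f2], x0=[0.0, 0.0])
```

Because moves are only accepted when the utility improves, the returned compromise point is never worse than the starting point under the aggregated preference, while the per-objective constraints keep any single criterion from degrading past its previous-iteration bound (plus slack).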
ISSN: 1551-2541, 2378-928X
DOI: 10.1109/MLSP.2007.4414287