A Modified Approach of Hyper-parameter Optimization to Assess The Classifier Performance
Format: Conference Proceeding
Language: English
Summary: Modern algorithms are remarkably adept at identifying patterns in data that are too large or complex for humans to comprehend. It has become difficult to identify the set of hyperparameters that delivers a performance improvement for a given geometry of the data set. This has shifted the emphasis from processing the data (model improvement) to tuning the hyperparameters of the classifier. Since hyperparameters are set to default values for a generic case, they are not necessarily suited to the given classification task. The purpose of this paper is to demonstrate a strategy that avoids unnecessary tuning attempts and shows the best performance for various classifiers on various shapes of geometry. The findings of this experiment will help the user decide whether hyperparameter tuning is worth the time and computational resources.
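The trade-off the abstract describes, default hyperparameters versus a tuned configuration, can be illustrated with a minimal sketch. This is not the paper's method; it is a generic comparison using scikit-learn's `GridSearchCV` on a synthetic data set, with an illustrative parameter grid chosen arbitrarily.

```python
# Minimal sketch (not the paper's method): compare a classifier's default
# hyperparameters against a small grid search, to weigh whether tuning
# is worth the extra computation on a given data set.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for "a given geometry of the data set".
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Baseline: the classifier with its default hyperparameters.
default_clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
default_acc = default_clf.score(X_te, y_te)

# Tuned: exhaustive search over a small, illustrative grid.
grid = {"n_estimators": [50, 100, 200], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestClassifier(random_state=0), grid, cv=3)
search.fit(X_tr, y_tr)
tuned_acc = search.score(X_te, y_te)

print(f"default: {default_acc:.3f}  tuned: {tuned_acc:.3f}")
```

If the tuned accuracy barely exceeds the default, the grid search's extra fits (here 9 candidates times 3 folds) were arguably not worth the compute, which is the kind of judgment the paper aims to support.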
ISSN: 2831-5022
DOI: 10.1109/PuneCon55413.2022.10014931