
Performance Comparison of New Adjusted Min-Max with Decimal Scaling and Statistical Column Normalization Methods for Artificial Neural Network Classification

Bibliographic Details
Published in: International Journal of Mathematics and Mathematical Sciences, 2022-04, Vol. 2022, p. 1-9
Main Author: Sinsomboonthong, Saichon
Format: Article
Language: English
Description
Summary: In this research, the normalization performance of the proposed adjusted min-max methods was compared to that of the statistical column, decimal scaling, adjusted decimal scaling, and min-max methods, in terms of the accuracy and mean square error of the final classification outcomes. The evaluation employed an artificial neural network classifier on a variety of widely used datasets. Overall, min-max normalization performed best, with an average accuracy of 84.0187% and an average mean square error of 0.1097 across all six datasets. However, the proposed adjusted-2 min-max normalization achieved higher accuracy and lower mean square error than min-max normalization on each of the following datasets: white wine quality, Pima Indians diabetes, vertebral column, and Indian liver disease. For example, on the white wine quality dataset it achieved 100% accuracy and a mean square error of 0.00000282. To conclude, for classification applications on these specific datasets, the proposed adjusted-2 min-max normalization should be preferred over the other tested normalization methods.
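For reference, the textbook formulas for three of the baseline methods named in the summary can be sketched as below. This is a minimal illustration, not the paper's implementation: the paper's proposed adjusted and adjusted-2 variants are its own modifications and are not reproduced here, and treating "statistical column normalization" as the z-score transform is an assumption based on common usage.

```python
import math
import statistics

def min_max(values, new_min=0.0, new_max=1.0):
    # Min-max normalization: linearly rescale each value into [new_min, new_max].
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) * (new_max - new_min) + new_min for v in values]

def decimal_scaling(values):
    # Decimal scaling: divide by 10**j, the smallest power of ten that
    # brings every |value| strictly below 1.
    m = max(abs(v) for v in values)
    j = math.floor(math.log10(m)) + 1 if m > 0 else 0
    return [v / 10**j for v in values]

def z_score(values):
    # Statistical column (z-score) normalization: shift to zero mean and
    # scale by the population standard deviation.
    mu, sigma = statistics.fmean(values), statistics.pstdev(values)
    return [(v - mu) / sigma for v in values]
```

Each function normalizes one feature column at a time, which matches how such methods are typically applied before training a neural network classifier.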
ISSN: 0161-1712, 1687-0425
DOI: 10.1155/2022/3584406