A tied-weight autoencoder for the linear dimensionality reduction of sample data
Published in: Scientific Reports, 2024-11, Vol. 14 (1), p. 26801-18, Article 26801
Format: Article
Language: English
Summary: Dimensionality reduction is a method used in machine learning and data science to reduce the number of dimensions in a dataset. While linear methods are generally less effective at dimensionality reduction than nonlinear methods, they provide a linear relationship between the original data and its dimensionality-reduced representation, leading to better interpretability. In this research, we present a tied-weight autoencoder as a dimensionality reduction model that combines the merits of linear and nonlinear methods. Although the tied-weight autoencoder is a nonlinear dimensionality reduction model, we approximate it so that it functions as a linear model. This is achieved by removing the hidden-layer units that are largely inactivated by the input data, while preserving the model's effectiveness. We evaluate the proposed model by comparing its performance with other linear and nonlinear models on benchmark datasets. Our results show that the proposed model performs comparably to a nonlinear autoencoder of similar structure. More importantly, we show that the proposed model outperforms the linear models on various metrics, including the mean square error, data reconstruction, and the classification of low-dimensional projections of the input data. Thus, our study provides general recommendations for best practices in dimensionality reduction.
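The tied-weight architecture described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the ReLU activation, the pruning threshold, and all class and variable names are assumptions chosen for the example; the paper's linearization step is represented only by identifying hidden units that are rarely activated.

```python
import numpy as np

class TiedAutoencoder:
    """Tied-weight autoencoder: the decoder reuses the encoder's
    weight matrix transposed, so there is a single matrix W."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (n_hidden, n_in))  # shared weights
        self.b = np.zeros(n_hidden)  # encoder bias
        self.c = np.zeros(n_in)      # decoder bias

    def encode(self, X):
        # Nonlinear hidden representation (ReLU assumed for illustration).
        return np.maximum(0.0, X @ self.W.T + self.b)

    def decode(self, H):
        # Decoding with the transposed encoder weights (weight tying).
        return H @ self.W + self.c

    def reconstruct(self, X):
        return self.decode(self.encode(X))

    def inactive_units(self, X, thresh=0.01):
        """Indices of hidden units active on less than `thresh` of the
        samples -- candidates for removal when approximating the model
        as linear, in the spirit of the pruning step described above."""
        H = self.encode(X)
        active_frac = (H > 0).mean(axis=0)
        return np.where(active_frac < thresh)[0]
```

If the units flagged by `inactive_units` are removed, the surviving units are (approximately) always in their linear regime on the data, so the encoder behaves like a linear map while keeping the autoencoder's learned weights.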
ISSN: 2045-2322
DOI: 10.1038/s41598-024-77080-8