
Controlling Weight Update Probability of Sparse Features in Machine Learning

Bibliographic Details
Main Authors: Shin, Joonchoul; Kim, Wansu; Lee, Jusang; Park, Jieun; Ock, Cheol Young
Format: Conference Proceeding
Language: English
Description
Summary: In machine learning, the frequency of a feature in the training data can be used as the feature's value; in that case, sparse features are likely to cause overfitting in the weight optimization process. This is called the sparse data problem, and this paper proposes a method that reduces the probability of a weight update as the feature becomes sparser. We experimented with this method on four Natural Language Processing tasks, and the results showed that it had a positive effect on all of them. On average, the method removed 8 out of every 100 errors. It also reduced the number of weight updates, so the learning time for the Named Entity Recognition task was reduced to 81% of the original.
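The record does not include the paper's exact update rule, so the following is only a minimal sketch of the general idea, assuming a perceptron-style learner in Python: each feature's weight is updated only with a probability that grows with the feature's frequency in the training data, so sparse features are updated less often. The frequency-to-probability mapping (freq / max_freq) and all names below are illustrative assumptions, not the authors' formulation.

import random
from collections import defaultdict

def train(examples, epochs=5, seed=0):
    """examples: list of (features, label) pairs; features is a list of strings, label is +1 or -1."""
    rng = random.Random(seed)

    # Count how often each feature occurs in the training data.
    freq = defaultdict(int)
    for features, _ in examples:
        for f in features:
            freq[f] += 1
    max_freq = max(freq.values())

    weights = defaultdict(float)
    for _ in range(epochs):
        for features, label in examples:
            score = sum(weights[f] for f in features)
            if label * score <= 0:  # misclassified: perceptron-style update
                for f in features:
                    # Sparse (low-frequency) features are updated with lower
                    # probability; this mapping is an assumed example.
                    update_prob = freq[f] / max_freq
                    if rng.random() < update_prob:
                        weights[f] += label
    return dict(weights)

if __name__ == "__main__":
    data = [(["word=good", "prev=is"], +1),
            (["word=great", "prev=is"], +1),
            (["word=bad", "prev=is"], -1),
            (["word=awful", "prev=is"], -1)]
    print(train(data))

Skipping updates for rare features both limits how far their weights can drift (the overfitting concern raised in the abstract) and reduces the number of weight updates, which is consistent with the reported reduction in learning time.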
ISSN: 2694-4804
DOI: 10.1109/KSE56063.2022.9953753