Automatic Image Caption Generation Based on Some Machine Learning Algorithms
Published in: Mathematical Problems in Engineering, 2022-04, Vol. 2022, pp. 1-11
Main Authors: , , , ,
Format: Article
Language: English
Summary: This paper is dedicated to machine learning, its branches and the methods they offer for this task, and the practical implementation of a solution for automatic image description generation. Automatic image caption generation is one of the frequent goals of computer vision. To solve the task successfully, an image description generation model must handle several complex subproblems: the objects in the image must be detected and recognized, after which a logically and syntactically correct textual description is generated. For that reason, description generation is a complex problem. It is also an important challenge for machine learning algorithms, because it imitates the complex human ability to condense a large amount of salient visual information into descriptive language. The generated descriptions are compared depending on which pretrained convolutional network is used, and the BLEU metric is used to measure the quality of the image descriptions. Although the presented solution to automatic image description generation gives good results, there is still room for improvement, since some images are not adequately described.
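The abstract compares captions produced with different pretrained convolutional networks used as image encoders. As a hedged illustration only, and not the authors' implementation, the sketch below extracts image features with one pretrained CNN from torchvision; the choice of ResNet-50, the image path, and the preprocessing constants are assumptions.

```python
# Illustrative sketch: extracting image features with a pretrained CNN encoder,
# as is typical in caption-generation pipelines. Model choice (ResNet-50) and
# preprocessing values are assumptions, not taken from the paper.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Load a pretrained CNN and drop its classification head so it returns
# a feature vector instead of class scores.
cnn = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()
cnn.eval()

# Standard ImageNet preprocessing (resize, crop, normalize).
preprocess = T.Compose([
    T.Resize(256),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

image = Image.open("example.jpg").convert("RGB")    # hypothetical image path
with torch.no_grad():
    features = cnn(preprocess(image).unsqueeze(0))  # shape: (1, 2048)

# The feature vector would then be passed to a language model (for example,
# an LSTM decoder) that generates the caption word by word.
print(features.shape)
```

Swapping the encoder (e.g., for another torchvision model with its classifier removed) while keeping the decoder fixed is one way such comparisons between pretrained networks are typically set up.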
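The abstract also states that BLEU metrics are used to calculate the quality of the generated descriptions. Below is a minimal sketch of how BLEU-1 through BLEU-4 can be computed for a generated caption against reference captions using NLTK; the example sentences and the use of NLTK are assumptions for illustration, not details from the paper.

```python
# Minimal sketch of BLEU scoring for a generated caption against reference
# captions using NLTK; the example sentences are invented for illustration.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

references = [
    "a dog is running across the grass".split(),
    "a brown dog runs through a grassy field".split(),
]
candidate = "a dog runs across the grass".split()

# BLEU-1 through BLEU-4: the weights select which n-gram orders contribute.
smooth = SmoothingFunction().method1
for n, weights in enumerate([(1, 0, 0, 0),
                             (0.5, 0.5, 0, 0),
                             (1/3, 1/3, 1/3, 0),
                             (0.25, 0.25, 0.25, 0.25)], start=1):
    score = sentence_bleu(references, candidate,
                          weights=weights,
                          smoothing_function=smooth)
    print(f"BLEU-{n}: {score:.3f}")
```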
ISSN: 1024-123X, 1563-5147
DOI: 10.1155/2022/4001460