Efficient Hair Damage Detection Using SEM Images Based on Convolutional Neural Network
Published in: Applied Sciences, 2021-08, Vol. 11 (16), p. 7333
Main Authors: , ,
Format: Article
Language: English
Summary: With growing interest in hairstyles and hair color, bleaching, dyeing, straightening, and curling are widely practiced worldwide, and chemical and physical treatments of hair are increasing accordingly. These treatments damage hair, yet the degree of damage has traditionally been assessed only by eye or by touch, which can lead to serious consequences such as worsening hair damage and scalp disease. Despite the seriousness of these problems, there is little research on hair damage. As technology has advanced, interest in preventing and reversing hair damage has grown, but manual observation cannot identify damaged regions accurately or quickly. The recent rise of artificial intelligence has given researchers new methods across many application scenarios. In this work, we created a new hair damage dataset based on SEM (scanning electron microscope) images. Through physical and chemical analyses, we observed how the hair surface changes with the degree of damage, identified the relationship between the two, and used a convolutional neural network to recognize and confirm the degree of hair damage, categorizing it as weak, moderate, or high.
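The record does not describe the paper's network architecture, so the following is only a minimal sketch of the setup the abstract describes: a convolutional neural network that maps an SEM hair image to one of three damage classes (weak, moderate, high). It uses PyTorch; the layer sizes, grayscale input, and 224x224 resolution are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of a 3-class CNN classifier for SEM hair images.
# All architecture details (layer widths, grayscale input, input size)
# are assumptions for illustration; the record does not specify them.
import torch
import torch.nn as nn


class HairDamageCNN(nn.Module):
    """Classifies an SEM hair image as weak, moderate, or high damage."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # grayscale SEM input (assumed)
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),  # pool to one value per channel
            nn.Flatten(),
            nn.Linear(32, num_classes),  # logits for the three damage levels
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = HairDamageCNN()
    dummy = torch.randn(1, 1, 224, 224)  # one 224x224 grayscale image (assumed size)
    logits = model(dummy)
    print(logits.shape)  # torch.Size([1, 3])
```

In practice the three classes would be trained with a standard cross-entropy loss over the labeled SEM dataset; the argmax of the logits gives the predicted damage level.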
ISSN: 2076-3417
DOI: 10.3390/app11167333