Unseen family member classification using mixture of experts
Main Authors:
Format: Conference Proceeding
Language: English
Summary: All family members resemble one another in ways our brains readily recognize. In this paper, we develop family classification using AdaBoost, Support Vector Machine, and K-Nearest Neighbor classifiers trained on different patches of training data. In some cases, family classification involves classifying unseen data, on which the classifiers' performance drops significantly; a Mixture of Experts is therefore employed to improve performance. For a fair comparison of these approaches, three families from three different ethnic groups are used. Experimental results show an average accuracy of 76 percent, with up to a 27 percent accuracy improvement from majority voting over the mixture of experts, depending on the family data.
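The abstract describes combining AdaBoost, SVM, and K-Nearest Neighbor experts by majority voting. A minimal sketch of that ensemble pattern, using scikit-learn stand-ins: the synthetic three-class data, feature dimensions, and hyperparameters below are illustrative assumptions, not the paper's actual face-patch features or settings.

```python
# Hedged sketch: majority-voting ensemble of three expert classifiers,
# mirroring the AdaBoost / SVM / KNN combination named in the abstract.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Stand-in for patch features of 3 families (3 classes); not the paper's data.
X, y = make_classification(n_samples=300, n_features=20, n_informative=10,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

experts = [
    ("ada", AdaBoostClassifier(random_state=0)),
    ("svm", SVC(random_state=0)),  # hard voting needs only predict()
    ("knn", KNeighborsClassifier(n_neighbors=5)),
]
ensemble = VotingClassifier(estimators=experts, voting="hard")
ensemble.fit(X_train, y_train)
print(f"majority-vote accuracy: {ensemble.score(X_test, y_test):.2f}")
```

With `voting="hard"`, each expert casts one vote per test sample and the most frequent class label wins, which is the simplest form of the majority-voting combination the abstract reports.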
ISSN: 2156-2318, 2158-2297
DOI: 10.1109/ICIEA.2010.5516872