Dirt detection on brown eggs by means of color computer vision
Published in: Poultry Science, 2005-10, Vol. 84 (10), p. 1653-1659
Main Authors: , , , , , , ,
Format: Article
Language: English
Subjects:
Summary: In the last 20 yr, different methods for detecting defects in eggs have been developed. Until now, no satisfying technique existed to sort and quantify dirt on eggshells. The work presented here focuses on the design of an off-line computer vision system to differentiate and quantify the presence of different dirt stains on brown eggs: dark (feces), white (uric acid), blood, and yolk stains. A system that provides uniform light exposure around the egg was designed. Under this uniform light, pictures of dirty and clean eggs were taken, stored, and analyzed. The classification was based on a few standard logical operators, allowing for quick implementation in an on-line set-up. In an experiment, 100 clean and 100 dirty eggs were used to validate the classification algorithm. The designed vision system showed an accuracy of 99% for the detection of dirt stains. Two percent of the clean eggs had a light-colored eggshell and were consequently mistaken for showing large white stains. The accuracy of differentiation of the different kinds of dirt stains was 91%. Of the eggs with dark stains, 10.81% were mistaken for having bloodstains, and 33.33% of eggs with bloodstains were mistaken for having dark stains. The developed system is possibly a first step toward an on-line dirt evaluation technique for brown eggs.
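The summary describes a color-based classification built from a few standard logical operators applied to images taken under uniform lighting, but it does not give the color space, thresholds, or operators actually used. The sketch below is only an illustration of how such per-class color thresholding and masking could look: the HSV ranges, the egg-segmentation threshold, the dirty-egg cutoff, and the file name are all assumptions for demonstration, not values from the article.

```python
# Hypothetical sketch of a color-threshold dirt-stain classifier for brown eggs.
# All ranges below are illustrative assumptions; the paper's calibrated values
# under its uniform-lighting rig are not given in the abstract.
import cv2
import numpy as np

# Assumed HSV ranges for the four stain classes named in the summary.
STAIN_RANGES = {
    "dark (feces)":      ((0,   0,   0),  (180, 255,  60)),  # low value = dark pixels
    "white (uric acid)": ((0,   0, 200),  (180,  60, 255)),  # low saturation, high value
    "blood":             ((0, 120,  70),  (10, 255, 255)),   # red hue (hue wrap-around ignored)
    "yolk":              ((20, 120, 120), (35, 255, 255)),   # yellow-orange hue
}

MIN_STAIN_FRACTION = 0.005  # assumed minimum stained area to call an egg "dirty"


def classify_egg(image_bgr: np.ndarray) -> dict:
    """Return the stained-area fraction per stain class for one egg image."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)

    # Crude egg segmentation: treat the bright foreground as the egg region.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, egg_mask = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    egg_area = max(cv2.countNonZero(egg_mask), 1)

    fractions = {}
    for label, (lo, hi) in STAIN_RANGES.items():
        stain_mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        # Logical AND restricts candidate stain pixels to the egg region,
        # in the spirit of the "standard logical operators" the summary mentions.
        stain_on_egg = cv2.bitwise_and(stain_mask, egg_mask)
        fractions[label] = cv2.countNonZero(stain_on_egg) / egg_area
    return fractions


if __name__ == "__main__":
    img = cv2.imread("egg.png")  # hypothetical input file
    fractions = classify_egg(img)
    dirty = {k: v for k, v in fractions.items() if v >= MIN_STAIN_FRACTION}
    print("dirty" if dirty else "clean", dirty)
```

With simple ranges like these, a very light-colored shell can fall inside the assumed white-stain range, which mirrors the 2% of clean eggs the summary reports as misclassified for large white stains.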
ISSN: 0032-5791, 1525-3171
DOI: 10.1093/ps/84.10.1653