Image analysis method to evaluate beak and head motion of broiler chickens during feeding
Published in: | Computers and Electronics in Agriculture, 2015-06, Vol. 114, p. 88-95 |
---|---|
Main Authors: | , , , , |
Format: | Article |
Language: | English |
Summary: | •Precise knowledge of jaw movement is essential for determining the optimum feed particle size.•Detecting broiler feeding behaviour may help identify the right feed particle size at all ages.•Using the proper feed pellet size and composition will help minimize feed wastage.
While feeding, broiler chickens may exhibit different biomechanical movements in relation to the physical properties of feed, such as size, shape and hardness. Furthermore, the chickens’ anatomical features at various ages, genders and breeds, in conjunction with feed type and feeder design parameters, may also influence biomechanical movement. To determine the significance of these parameters during feeding, kinematic measurements related to the biomechanical motions are required. However, extracting this information manually from video is tedious and error-prone. The aim of this study was to develop a machine vision technique that visually identifies the relevant biomechanical variables of broiler feeding behaviour from high-speed video footage. A total of 369 frames from three 5-day-old broiler chicks were manually measured and compared with the automatic measurements. For each bird, six mandibulations (i.e. cycles of opening and closing the beak) were manually selected, comprising two different sequences of three consecutive mandibulations starting right after a feed grasp. The kinematic variables considered were: (i) head displacement (eye centre position; x- and y-axis); (ii) beak opening speed (mm ms−1); (iii) beak closing speed (mm ms−1); (iv) beak opening acceleration (mm ms−2); and (v) beak closing acceleration (mm ms−2). Results indicated that the highest error for eye position detection was 1.05 mm for the x-axis and 0.67 mm for the y-axis. The difference between manual and automatic (algorithm output) measurements of beak gape was 0.22 ± 0.009 mm, with a maximum difference of 0.76 mm. Regression analysis indicated that the two measures are highly correlated (R² = 99.2%). Statistical tests suggested that the most probable causes of error are the speed and acceleration of the beak motion (i.e. image blur) and the presence of feed particles during the first and second mandibulations right after feed grasping (i.e. beak tips occluded by feed particles). The presented method automatically calculated the position of the eye centre (x- and y-axis) and of both the upper and lower beak tips. (See the sketch after this record for one way such kinematic variables can be derived from the tracked points.) |
ISSN: | 0168-1699 1872-7107 |
DOI: | 10.1016/j.compag.2015.03.017 |
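
The abstract lists beak gape, opening/closing speed (mm ms−1) and opening/closing acceleration (mm ms−2) as the kinematic variables computed from the detected beak tips. The sketch below shows one plausible way to derive these quantities from per-frame beak-tip coordinates; it is not the authors' implementation. The function name, the 500 fps frame rate and the assumption that coordinates are already calibrated to millimetres are illustrative choices only.

```python
# Hypothetical sketch (not the paper's code): derive beak gape, speed (mm/ms)
# and acceleration (mm/ms^2) from per-frame upper/lower beak-tip positions
# that are assumed to be calibrated to millimetres. 500 fps is an assumed rate.
import numpy as np

FPS = 500.0                  # assumed high-speed camera frame rate
DT_MS = 1000.0 / FPS         # time step between frames, in milliseconds


def beak_kinematics(upper_tips_mm, lower_tips_mm):
    """upper_tips_mm, lower_tips_mm: (N, 2) arrays of (x, y) positions in mm."""
    upper = np.asarray(upper_tips_mm, dtype=float)
    lower = np.asarray(lower_tips_mm, dtype=float)

    # Beak gape: Euclidean distance between the two beak tips in each frame.
    gape_mm = np.linalg.norm(upper - lower, axis=1)

    # Finite differences give gape speed (mm/ms) and acceleration (mm/ms^2).
    speed = np.gradient(gape_mm, DT_MS)
    accel = np.gradient(speed, DT_MS)

    # Positive speed corresponds to the beak opening, negative to closing.
    return {
        "gape_mm": gape_mm,
        "max_opening_speed": speed.max(),
        "max_closing_speed": -speed.min(),
        "max_opening_accel": accel.max(),
        "max_closing_accel": -accel.min(),
    }
```

Splitting the signal by the sign of the speed separates opening from closing phases, which mirrors variables (ii)-(v) in the abstract; head displacement would be obtained analogously from the per-frame eye-centre coordinates.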