
Determining salmon provenance with automated otolith reading

Bibliographic Details
Published in: Fisheries Research 2022-06, Vol. 250, p. 106295, Article 106295
Main Authors: Kemp, Chandler E., Doherty, Susan K.
Format: Article
Language: English
Description
Summary: We present a computer vision method for identifying and reading hatchery marks in salmon otoliths. Synthetic otolith marks are used at hundreds of hatcheries throughout the Pacific Rim to record the release location of salmon. Each year, human readers examine hundreds of thousands of otolith samples to identify the marks in captured salmon. The data guide hatchery investments and inform dynamic management practices that maximize allowable catch while preserving wild-hatched populations. However, the method is limited by the time required to process otoliths, by the inability to distinguish between wild and unmarked hatchery fish, and, in some cases, by the subjective decisions of human readers. Automated otolith reading using computer vision has the potential to address all three of these limitations. We tested the classification accuracy of transfer learning using previously published deep neural networks pretrained on the ImageNet database and compared it to the classification accuracy achieved using shallow networks developed specifically for otolith reading. The shallow networks achieved better classification accuracy with the available training and test sets. In particular, we report a novel otolith classification algorithm that uses two neural networks trained with an adversarial algorithm to achieve 93% classification accuracy across four hatchery marks and unmarked otoliths. The algorithm relies exclusively on hemi-section images of the otolith: no additional biological data are needed. Our work demonstrates a novel technique with modest training requirements that achieves unprecedented accuracy. The method can be easily adopted in existing otolith labs, scaled to accommodate additional marks, and does not require tracking additional information about the fish from which the otolith was retrieved. Future work should determine the value of expanding the training set and applying the algorithm to a more diverse set of otolith marks.
ISSN: 0165-7836, 1872-6763
DOI: 10.1016/j.fishres.2022.106295
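
Note: The abstract does not describe the architecture of the shallow networks, the framework used, or the adversarial training procedure in detail. The sketch below is purely illustrative: a minimal shallow convolutional classifier for hemi-section otolith images with five output classes (four hatchery marks plus unmarked), written in PyTorch. The layer sizes, input resolution, and image format are assumptions for illustration and are not taken from Kemp and Doherty (2022); the two-network adversarial training reported in the paper is not shown.

# Illustrative sketch only: a shallow CNN classifier for grayscale
# hemi-section otolith images, distinguishing four hatchery marks plus
# an unmarked class. Architecture and hyperparameters are assumptions,
# NOT the authors' implementation.
import torch
import torch.nn as nn

NUM_CLASSES = 5  # four hatchery marks + unmarked (per the abstract)

class ShallowOtolithNet(nn.Module):
    """A small CNN: two convolutional blocks followed by one dense layer."""

    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2),  # single-channel input assumed
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(64),  # infers the flattened feature size at first call
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    # Dummy batch: 8 single-channel 128x128 images (resolution is an assumption).
    model = ShallowOtolithNet()
    logits = model(torch.randn(8, 1, 128, 128))
    print(logits.shape)  # torch.Size([8, 5])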