
Cobots at the Museum: The Role of Robotics in Digitisation

Bibliographic Details
Published in: Biodiversity Information Science and Standards, 2024-10, Vol. 8
Main Authors: Scott, Ben; Salili-James, Arianna; Zhang, Naifeng; Poon, Sanson; Smith, Vincent
Format: Article
Language: English
Summary: In the last decade, the Natural History Museum, UK (NHM) has been at the forefront of digitising natural history collections, with almost six million of its 80 million specimens digitised. This momentous undertaking has led to numerous innovations in optimising digitisation workflows. One avenue currently being explored is the use of collaborative robots (cobots). Since acquiring a Techman TM5 900 robotic arm in 2023 (Scott et al. 2023), we have been experimenting with its capabilities.

Experiments began with simple pick-and-place tasks using artificial specimens. We then focused on two use cases based on the digitisation of shark teeth and pinned insects. Both shark teeth and pinned insects are abundant at the NHM, making their manual digitisation a tedious task. Currently, we have trained the cobot to pick up a specimen, move it elsewhere to photograph and scan it, then return it to its original position or place it somewhere new. Thus far, this has all been coordinate-based.

Focusing on pinned insect specimens, we have now begun training deep learning models to perform segmentation, classification, and tracking tasks on images and manually taken videos. Segmentation and classification tasks range from distinguishing specimens from one another within drawers to classifying different pins, labels, and insects. Meanwhile, object-tracking methods are used to follow labels in videos taken around a specimen. By tracking different labels simultaneously across multiple frames, we can combine the views of each label to obtain a complete picture of it (for example, using tools described in Salili-James et al. 2022). Thus far, our machine learning pipelines have proved successful, achieving F1 scores of 96–98% for classifying and segmenting insects and for locating pin heads in dorsal views.
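For context, the F1 score reported above is the harmonic mean of precision and recall. A minimal, illustrative sketch of how it is computed from true-positive, false-positive, and false-negative counts (not the evaluation code used at the NHM):

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 score: harmonic mean of precision and recall.

    tp: correct detections (e.g. insects correctly segmented)
    fp: spurious detections
    fn: missed specimens
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical example: 96 correct detections, 2 spurious, 4 missed
# gives precision 96/98 and recall 96/100, i.e. an F1 of roughly 0.97,
# in the range reported for the insect segmentation models.
score = f1_score(96, 2, 4)
```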
Soon, we will be establishing workflows that integrate computer vision (CV) and machine learning (ML) techniques directly with the robotic arm, with pipelines that could be applied to different datasets and that can significantly enhance efficiency. Broadly, these pipelines can be split into four sections:

Specimen Identification: CV/ML to locate individual specimens or certain parts of specimens, e.g., pin heads within pinned insects.

Handling: With custom grippers, the cobot can delicately pick up, move, and place specimens to and from photography stations for high-quality scanning.

Imaging & Scanning: The cobot will sc
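A minimal sketch of how such a coordinate-based cycle could be orchestrated. All names here (`Pose`, `locate_specimen`, `run_digitisation_cycle`) and the coordinates are hypothetical stand-ins, not the TM5's actual API or the NHM's pipeline:

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """A target position for the arm, in millimetres in its base frame (illustrative)."""
    x: float
    y: float
    z: float


def locate_specimen() -> Pose:
    """Stand-in for the CV/ML identification step (e.g. a detected pin head)."""
    return Pose(120.0, 45.0, 10.0)  # fixed dummy detection for the sketch


def run_digitisation_cycle(home: Pose, station: Pose, log: list[str]) -> None:
    """Run one identify -> handle -> image -> return cycle, recording each stage."""
    target = locate_specimen()                                        # 1. specimen identification
    log.append(f"pick at ({target.x}, {target.y}, {target.z})")       # 2. handling (gripper pick)
    log.append(f"image at station ({station.x}, {station.y}, {station.z})")  # 3. imaging & scanning
    log.append(f"return to ({home.x}, {home.y}, {home.z})")           # 4. replace specimen
```

In a real deployment each `log.append` would instead issue a motion or camera command, but the stage ordering is the point of the sketch.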
ISSN: 2535-0897
DOI: 10.3897/biss.8.138846