Articulated Multi-Instrument 2-D Pose Estimation Using Fully Convolutional Networks

Bibliographic Details
Published in: IEEE Transactions on Medical Imaging, 2018-05, Vol. 37 (5), pp. 1276-1287
Main Authors: Du, Xiaofei, Kurmann, Thomas, Chang, Ping-Lin, Allan, Maximilian, Ourselin, Sebastien, Sznitman, Raphael, Kelly, John D., Stoyanov, Danail
Format: Article
Language:English
Description
Summary: Instrument detection, pose estimation, and tracking in surgical videos are important vision components for computer-assisted interventions. While significant advances have been made in recent years, articulation detection is still a major challenge. In this paper, we propose a deep neural network for articulated multi-instrument 2-D pose estimation, which is trained on detailed annotations of endoscopic and microscopic data sets. Our model is formed by a fully convolutional detection-regression network. Joints and associations between joint pairs in our instrument model are located by the detection subnetwork and are subsequently refined through a regression subnetwork. Based on the output from the model, the poses of the instruments are inferred using maximum bipartite graph matching. Our estimation framework is powered by deep learning techniques without any direct kinematic information from a robot. Our framework is tested on single-instrument RMIT data, and also on multi-instrument EndoVis and in vivo data, with promising results. In addition, the data set annotations are publicly released along with our code and model.
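The final association step described above (inferring instrument poses via maximum bipartite graph matching over detected joints) can be illustrated with a minimal sketch. The score matrix and the brute-force matcher here are hypothetical stand-ins, not the paper's implementation, which matches on the network's learned association outputs:

```python
from itertools import permutations

def match_joints(scores):
    """Max-weight bipartite matching by brute force.

    scores[i][j] is a hypothetical association confidence between
    detected joint candidate i and instrument hypothesis j.
    Returns the list of (candidate, instrument) pairs maximizing
    the total score. Suitable only for the small candidate sets
    typical of a single frame.
    """
    n_rows = len(scores)
    n_cols = len(scores[0])
    best_total = float("-inf")
    best_pairs = []
    # Try every assignment of candidates to distinct instruments.
    for perm in permutations(range(n_cols), min(n_rows, n_cols)):
        pairs = list(zip(range(n_rows), perm))
        total = sum(scores[i][j] for i, j in pairs)
        if total > best_total:
            best_total, best_pairs = total, pairs
    return best_pairs

# Two joint candidates, two instruments: candidate 0 clearly belongs
# to instrument 0, candidate 1 to instrument 1.
print(match_joints([[0.9, 0.1], [0.2, 0.8]]))  # → [(0, 0), (1, 1)]
```

For larger candidate sets, a polynomial-time solver (e.g. the Hungarian algorithm) would replace the brute-force loop; the matching objective stays the same.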
ISSN: 0278-0062, 1558-254X
DOI: 10.1109/TMI.2017.2787672