
Guiding the Last Centimeter: Novel Anatomy-Aware Probe Servoing for Standardized Imaging Plane Navigation in Robotic Lung Ultrasound


Bibliographic Details
Published in: IEEE Transactions on Automation Science and Engineering, 2024-09, p.1-12
Main Authors: Ma, Xihan, Zeng, Mingjie, Hill, Jeffrey C., Hoffmann, Beatrice, Zhang, Ziming, Zhang, Haichong K.
Format: Article
Language: English
container_end_page 12
container_start_page 1
container_title IEEE transactions on automation science and engineering
creator Ma, Xihan
Zeng, Mingjie
Hill, Jeffrey C.
Hoffmann, Beatrice
Zhang, Ziming
Zhang, Haichong K.
description Navigating the ultrasound (US) probe to the standardized imaging plane (SIP) for image acquisition is a critical but operator-dependent task in conventional freehand diagnostic US. Robotic US systems (RUSS) offer the potential to enhance imaging consistency by leveraging real-time US image feedback to optimize the probe pose, thereby reducing reliance on operator expertise. However, determining the proper approach to extracting generalizable features from the US images for probe pose adjustment remains challenging. In this work, we propose a SIP navigation framework for RUSS, exemplified in the context of robotic lung ultrasound (LUS). This framework facilitates automatic probe adjustment when in proximity to the SIP. This is achieved by explicitly extracting multiple anatomical features present in real-time LUS images and performing non-patient-specific template matching to generate probe motion towards the SIP using image-based visual servoing (IBVS). The framework is further integrated with the active-sensing end-effector (A-SEE), a customized robot end-effector that leverages patient external body geometry to maintain optimal probe alignment with the contact surface, thus preserving US signal quality throughout the navigation. The proposed approach ensures procedural interpretability and inter-patient adaptability. Validation is conducted through anatomy-mimicking phantom and in-vivo evaluations involving five human subjects. The results show the framework's high navigation precision, with the probe correctly located at the SIP in all cases and positioning errors under 2 mm in translation and under 2 degrees in rotation. These results demonstrate the navigation process's capability to accommodate anatomical variations among patients.
Note to Practitioners: Compared with traditional freehand ultrasound (US) imaging, robotic ultrasound systems (RUSS) have the potential to largely standardize US diagnostic outcomes, which currently vary with operator expertise, provided that an inter-patient-consistent, automatic standardized imaging plane (SIP) navigation process is available. This paper presents a SIP navigation framework for lung US (LUS) examination, which recognizes anatomical landmarks in the US images and fine-tunes the pose of the US probe so that the landmarks are positioned in accordance with a non-patient-specific template image. A special end-effector, the active-sensing end-effector (A-SEE), maintains the probe at an optimal orientation with respect to the body, allowing consistent-quality US images to be acquired throughout the navigation. Unlike previous works, our approach can navigate to complicated SIPs containing multiple anatomies with interpretable robot arm motion. We verified the framework's ability to navigate the probe to the SIP with millimeter-level accuracy in phantom and human experiment settings. While preliminary results demonstrate the framework's efficacy in guiding the robotic LUS procedure, the system's performance on other examinations (e.g., liver and thyroid US) involving soft tissues requires further validation. In the future, the framework can be applied to various US examinations by implementing examination-specific anatomical feature detection modules.
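The servoing step described in the abstract (driving extracted image features toward a non-patient-specific template via IBVS) follows the classic image-based visual servoing control law v = -λ L⁺ (s - s*). The sketch below is a minimal, illustrative version of that law, not the paper's implementation; the point features, depths, and gain are hypothetical stand-ins for the anatomical landmarks the framework detects.

```python
import numpy as np

def ibvs_velocity(features, template, interaction_matrix, gain=0.5):
    """Classic IBVS law: command a probe twist that drives the
    image-feature error toward zero.

    v = -gain * pinv(L) @ (s - s_star), where s holds the current
    image-feature coordinates, s_star the template (desired) ones,
    and L the interaction (image Jacobian) matrix.
    """
    error = features - template                        # (2N,) feature error
    return -gain * np.linalg.pinv(interaction_matrix) @ error  # (6,) twist

def point_interaction_matrix(x, y, Z):
    """Interaction-matrix rows for one normalized image point at depth Z
    (standard point-feature Jacobian from the IBVS literature)."""
    return np.array([
        [-1 / Z, 0.0, x / Z, x * y, -(1 + x**2), y],
        [0.0, -1 / Z, y / Z, 1 + y**2, -x * y, -x],
    ])

# Toy example: two point features to be aligned with template positions.
s = np.array([0.10, 0.05, -0.08, 0.02])     # current feature coords (hypothetical)
s_star = np.array([0.0, 0.0, -0.10, 0.0])   # template feature coords (hypothetical)
L = np.vstack([point_interaction_matrix(0.10, 0.05, 0.5),
               point_interaction_matrix(-0.08, 0.02, 0.5)])
twist = ibvs_velocity(s, s_star, L)          # 6-DoF probe velocity command
```

In a running loop this twist would be sent to the robot controller at each image frame until the feature error falls below a threshold; when the features already match the template, the commanded velocity is zero.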
doi_str_mv 10.1109/TASE.2024.3448241
format article
publisher IEEE
startdate 20240911
coden ITASC7
ieee_id 10670334
orcidid 0000-0003-1007-0910
0000-0002-1314-8456
0000-0002-5363-4539
fulltext fulltext
identifier ISSN: 1545-5955
ispartof IEEE transactions on automation science and engineering, 2024-09, p.1-12
issn 1545-5955
1558-3783
language eng
recordid cdi_crossref_primary_10_1109_TASE_2024_3448241
source IEEE Xplore (Online service)
subjects Anatomy
Feature extraction
image-based visual servoing
Lung
Lung ultrasound
medical robots
Navigation
Probes
robotic ultrasound
Robots
Ultrasonic imaging
ultrasound segmentation
title Guiding the Last Centimeter: Novel Anatomy-Aware Probe Servoing for Standardized Imaging Plane Navigation in Robotic Lung Ultrasound
url http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-29T16%3A55%3A49IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-crossref_ieee_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Guiding%20the%20Last%20Centimeter:%20Novel%20Anatomy-Aware%20Probe%20Servoing%20for%20Standardized%20Imaging%20Plane%20Navigation%20in%20Robotic%20Lung%20Ultrasound&rft.jtitle=IEEE%20transactions%20on%20automation%20science%20and%20engineering&rft.au=Ma,%20Xihan&rft.date=2024-09-11&rft.spage=1&rft.epage=12&rft.pages=1-12&rft.issn=1545-5955&rft.eissn=1558-3783&rft.coden=ITASC7&rft_id=info:doi/10.1109/TASE.2024.3448241&rft_dat=%3Ccrossref_ieee_%3E10_1109_TASE_2024_3448241%3C/crossref_ieee_%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c191t-c557ed1b39d72125d3bdeb391f8bd83624b5c6a9ef637d81f06c978231a2111f3%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=10670334&rfr_iscdi=true