Unsupervised Deep Learning-Driven Stabilization of Smartphone-Based Quantitative Pupillometry for Mobile Emergency Medicine
Main Authors: | John, Ivo; Yari, Zipei; Bogucki, Aleksander; Swiatek, Michal; Chrost, Hugo; Wlodarski, Michal; Chrapkiewicz, Radoslaw; Li, Jizhou |
---|---|
Format: | Conference Proceeding |
Language: | English |
Subjects: | Dynamics; Emergency medicine; Helicopters; implicit neural representation; Medical devices; Quantitative pupillometry; smartphone-based software as a medical device (SaMD); Software; Stability analysis; Three-dimensional displays |
Online Access: | Request full text |
cited_by | |
---|---|
cites | |
container_end_page | 5 |
container_issue | |
container_start_page | 1 |
container_title | |
container_volume | |
creator | John, Ivo; Yari, Zipei; Bogucki, Aleksander; Swiatek, Michal; Chrost, Hugo; Wlodarski, Michal; Chrapkiewicz, Radoslaw; Li, Jizhou |
description | Pupillometry, the assessment of pupil size and reactivity, is crucial in critical care and emergency medicine, serving as a primary method for non-invasive evaluation of neurological health after a severe acute brain injury (SABI), such as stroke or traumatic brain injury (TBI). The advent of smartphone-based quantitative pupillometry has enabled its new potential applications, for example in mobile emergency medicine in ambulances and helicopters, where traditional hardware-based pupillometers are impractical. However, these environments can be highly dynamic and pose challenges to the 3D stability of recordings acquired using a handheld device, implemented as software as a medical device (SaMD). The lack of 3D stability in mobile settings can lead to motion artifacts, significantly distorting measurements. This paper introduces a robust method that effectively stabilizes the pupillometry video input acquired under unstable conditions. Our two-stage approach first utilizes deep feature matching to mitigate the effects of motion coarsely. Subsequently, an implicit neural representation is employed for fine displacement estimation between frames, resulting in significantly stabilized output. We demonstrate enhanced sensitivity and noise reduction in the measured pupil dynamics. The effectiveness of the proposed unsupervised method is validated in challenging conditions, with substantial lateral and axial motions of the smartphone camera, emulating dynamic conditions experienced by emergency medicine teams in ambulances and helicopters. |
doi_str_mv | 10.1109/ISBI56570.2024.10635305 |
format | conference_proceeding |
fulltext | fulltext_linktorsrc |
identifier | EISSN: 1945-8452 |
ispartof | 2024 IEEE International Symposium on Biomedical Imaging (ISBI), 2024, p.1-5 |
issn | 1945-8452 |
language | eng |
recordid | cdi_ieee_primary_10635305 |
source | IEEE Xplore All Conference Series |
subjects | Dynamics; Emergency medicine; Helicopters; implicit neural representation; Medical devices; Quantitative pupillometry; smartphone-based software as a medical device (SaMD); Software; Stability analysis; Three-dimensional displays |
title | Unsupervised Deep Learning-Driven Stabilization of Smartphone-Based Quantitative Pupillometry for Mobile Emergency Medicine |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-21T06%3A04%3A38IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-ieee_CHZPO&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=proceeding&rft.atitle=Unsupervised%20Deep%20Learning-Driven%20Stabilization%20of%20Smartphone-Based%20Quantitative%20Pupillometry%20for%20Mobile%20Emergency%20Medicine&rft.btitle=2024%20IEEE%20International%20Symposium%20on%20Biomedical%20Imaging%20(ISBI)&rft.au=John,%20Ivo&rft.date=2024-05-27&rft.spage=1&rft.epage=5&rft.pages=1-5&rft.eissn=1945-8452&rft_id=info:doi/10.1109/ISBI56570.2024.10635305&rft.eisbn=9798350313338&rft_dat=%3Cieee_CHZPO%3E10635305%3C/ieee_CHZPO%3E%3Cgrp_id%3Ecdi_FETCH-ieee_primary_106353053%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=10635305&rfr_iscdi=true |
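The description field above outlines a two-stage pipeline: coarse motion compensation via deep feature matching, followed by an implicit neural representation fitted between frames for fine displacement estimation. The sketch below only illustrates that general idea and is not the authors' implementation: ORB feature matching stands in for the paper's deep feature matcher, the small coordinate MLP is just one possible implicit representation of the residual displacement field, and every function name, network size, and hyperparameter here is an assumption.

```python
# Minimal two-stage stabilization sketch (NOT the paper's code): coarse alignment
# via classical ORB feature matching + homography, then a small coordinate MLP
# ("implicit neural representation") fitted unsupervised to a photometric loss
# for the residual, fine displacement between a frame and a reference frame.
# Frames are assumed to be grayscale uint8 numpy arrays of identical size.
import cv2
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


def coarse_align(frame: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Stage 1: remove large camera motion by warping `frame` onto `reference`."""
    orb = cv2.ORB_create(1000)
    kp_f, des_f = orb.detectAndCompute(frame, None)
    kp_r, des_r = orb.detectAndCompute(reference, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_f, des_r)
    # Assumes enough texture (iris, eyelid, skin) for at least 4 reliable matches.
    src = np.float32([kp_f[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = reference.shape
    return cv2.warpPerspective(frame, H, (w, h))


class DisplacementINR(nn.Module):
    """Stage 2 model: maps normalized pixel coords (x, y) to a small 2D displacement."""

    def __init__(self, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2), nn.Tanh(),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        # Tanh keeps the residual displacement within +/- 5% of the image extent.
        return 0.05 * self.net(coords)


def fine_align(frame: np.ndarray, reference: np.ndarray, steps: int = 200) -> np.ndarray:
    """Fit the INR by minimizing an unsupervised photometric (L1) loss, then warp."""
    h, w = reference.shape
    ref = torch.from_numpy(reference).float()[None, None] / 255.0  # (1, 1, H, W)
    mov = torch.from_numpy(frame).float()[None, None] / 255.0
    ys, xs = torch.meshgrid(torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij")
    grid = torch.stack([xs, ys], dim=-1)  # (H, W, 2), (x, y) order as grid_sample expects
    model = DisplacementINR()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(steps):
        disp = model(grid.reshape(-1, 2)).reshape(h, w, 2)
        warped = F.grid_sample(mov, (grid + disp)[None], align_corners=True)
        loss = F.l1_loss(warped, ref)
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        disp = model(grid.reshape(-1, 2)).reshape(h, w, 2)
        out = F.grid_sample(mov, (grid + disp)[None], align_corners=True)
    return (out[0, 0].clamp(0, 1).numpy() * 255).astype(np.uint8)


def stabilize(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Stabilize a pupillometry clip against its first frame (coarse, then fine)."""
    reference = frames[0]
    return [fine_align(coarse_align(f, reference), reference) for f in frames]
```

In this sketch the first frame serves as the reference; in practice one might choose a sharp, well-exposed frame instead, and the pupil-diameter trace would then be measured on the stabilized frames rather than the raw handheld recording.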