A Neuronavigation System Using a Mobile Augmented Reality Solution

Bibliographic Details
Published in: World Neurosurgery 2022-11, Vol. 167, p. e1261-e1267
Main Authors: de Almeida, Antonio Guilherme C., Fernandes de Oliveira Santos, Bruno, Oliveira, Joselina L.M.
Format: Article
Language: English
Description
Summary: Image-guided surgery has shown great utility in neurosurgery, especially in allowing for more accurate surgical planning and navigation. The current gold standard for image-guided neurosurgery is neuronavigation, which provides millimetric accuracy for these tasks. However, these systems often require a complicated setup and are costly, hindering their adoption in low- and middle-income countries. The aim of this study was to develop and evaluate the performance of a mobile-based augmented reality neuronavigation solution under different conditions in a preclinical environment. The application was developed in the Swift programming language and was tested on a replica of a human scalp under variable lighting conditions, with different numbers of registration points, and with different target point positions. For each condition, reference points were input into the application and the target points were computed over 10 iterations. The mean registration error and the mean target error were used to assess the performance of the application. In the best-case scenario, the proposed solution had a mean target error of 2.6 ± 1.6 mm. The approach thus provides a viable, low-cost, easy-to-use, portable method for locating points on the scalp surface with an accuracy of 2.6 ± 1.6 mm under the best-case conditions.
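
The article does not publish its source code. As a rough, assumption-based sketch of how the two reported metrics (mean registration error over the reference/fiducial points and mean target error over repeated iterations) might be computed in Swift with the simd framework, one could write something like the following; the function names, the rigid-transform representation, and the data layout are illustrative and are not taken from the paper.

import simd

// Hypothetical sketch only; not the authors' implementation.

// Applies a rigid (rotation + translation) transform to a 3-D point.
func applyTransform(_ T: simd_float4x4, to p: SIMD3<Float>) -> SIMD3<Float> {
    let q = T * SIMD4<Float>(p.x, p.y, p.z, 1)
    return SIMD3<Float>(q.x, q.y, q.z)
}

// Mean registration error: average distance between the transformed reference
// (fiducial) points and the corresponding points measured on the scalp replica.
func meanRegistrationError(reference: [SIMD3<Float>],
                           measured: [SIMD3<Float>],
                           transform: simd_float4x4) -> Float {
    precondition(!reference.isEmpty && reference.count == measured.count)
    let total = zip(reference, measured)
        .map { simd_length(applyTransform(transform, to: $0.0) - $0.1) }
        .reduce(0, +)
    return total / Float(reference.count)
}

// Mean target error: average distance between each predicted target position
// (one per iteration; the study used 10 per condition) and the true target.
func meanTargetError(predicted: [SIMD3<Float>], trueTarget: SIMD3<Float>) -> Float {
    precondition(!predicted.isEmpty)
    let total = predicted.map { simd_length($0 - trueTarget) }.reduce(0, +)
    return total / Float(predicted.count)
}

Under these assumptions, calling meanTargetError with the 10 predicted target positions obtained for one lighting/registration-point condition and the known target location would yield a per-condition figure of the kind reported in the study.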
ISSN: 1878-8750
1878-8769
DOI: 10.1016/j.wneu.2022.09.014