A Hybrid Gaze Distance Estimation via Cross-Reference of Vergence and Depth

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 182618-182626
Main Authors: Cho, Dae-Yong; Kang, Min-Koo
Format: Article
Language:English
Description
Summary: To deliver an optimal Mixed Reality (MR) experience, wherein virtual elements and real-world objects are seamlessly merged, it is vital to ensure a consistent vergence-accommodation distance. This necessitates technology that precisely estimates the user's gaze distance. Presently, various MR devices employ small eye-tracking cameras to capture both eyes and infer the gaze distance from vergence angle data. However, this technique is highly sensitive to several human errors, such as strabismus, blinking, and eye fatigue from prolonged use. To address these issues, this paper introduces a hybrid algorithm for estimating gaze distance. The proposed approach concurrently utilizes an eye camera and a depth camera to conduct parallel estimations: one based on the conventional vergence angle and the other on gaze-mapped depth information. The confidence of each method is then assessed and cross-referenced, and an adaptive weighted average is computed to derive a more precise and stable gaze distance estimate. In the experiment, three challenging test scenarios designed to induce human and environmental errors were administered to 12 subjects under uniform conditions to evaluate the accuracy and stability of the proposed method. The experimental results were validated through both qualitative and quantitative analysis. The findings showed that the proposed method significantly outperformed current methods, with a visual angle error of 0.132 degrees under ideal conditions. Furthermore, it consistently maintained robustness against human and environmental errors, achieving an error range of 0.14 to 0.21 degrees even in demanding environments.
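
The abstract describes two parallel gaze-distance estimates, one from the binocular vergence angle and one from the depth map at the gaze point, fused by a confidence-weighted adaptive average. The following is a minimal illustrative sketch of that fusion idea only; the paper does not disclose its confidence model or implementation, so the function names, the simple confidence-ratio weighting, and the textbook IPD/vergence geometry below are assumptions rather than the authors' method.

```python
import numpy as np

def vergence_distance(ipd_m, vergence_angle_rad, eps=1e-6):
    """Textbook vergence geometry (assumed, not from the paper):
    distance at which the two visual axes, separated by the
    inter-pupillary distance (IPD), intersect."""
    return ipd_m / (2.0 * np.tan(vergence_angle_rad / 2.0) + eps)

def fuse_gaze_distance(d_vergence, c_vergence, d_depth, c_depth, eps=1e-6):
    """Adaptive weighted average of the two estimates.

    d_vergence : distance (m) inferred from the vergence angle
    c_vergence : confidence in [0, 1] for the vergence estimate
    d_depth    : distance (m) read from the gaze-mapped depth pixel
    c_depth    : confidence in [0, 1] for the depth estimate
    """
    w_v = c_vergence / (c_vergence + c_depth + eps)  # normalized weight
    w_d = 1.0 - w_v
    return w_v * d_vergence + w_d * d_depth

if __name__ == "__main__":
    # Example: a blink degrades the eye-tracking estimate, so its
    # confidence drops and the depth-based estimate dominates.
    d_v = vergence_distance(ipd_m=0.063,
                            vergence_angle_rad=np.radians(3.0))
    d_d = 1.20  # metres, from the depth camera at the gaze point
    print(fuse_gaze_distance(d_v, c_vergence=0.2,
                             d_d := d_d, c_depth=0.9))
```

Normalizing the weights so that whichever cue is currently reliable dominates is one plausible way to obtain the robustness to blinking, strabismus, and depth dropouts that the abstract reports.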
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3510357