Audio-visual attention: Eye-tracking dataset and analysis toolbox

Bibliographic Details
Main Authors: Marighetto, Pierre, Coutrot, Antoine, Riche, Nicolas, Guyader, Nathalie, Mancas, Matei, Gosselin, Bernard, Laganiere, Robert
Format: Conference Proceeding
Language: English
Online Access: Request full text
Description
Summary: Although many visual attention models have been proposed, very few saliency models have investigated the impact of audio information. To develop audio-visual attention models, researchers need a ground truth of eye movements recorded while observers explore complex natural scenes under different audio conditions. They also need tools to compare eye movements and gaze patterns between these audio conditions. This paper answers both needs by proposing a new eye-tracking dataset and an associated analysis ToolBox that contains common metrics to analyze eye movements. The eye-tracking dataset contains the eye positions gathered during four eye-tracking experiments. A total of 176 observers were recorded while exploring 148 videos (mean duration = 22 s) split between different audio conditions (with or without sound) and visual categories (moving objects, landscapes, and faces). The ToolBox allows users to visualize the temporal evolution of different metrics computed from the recorded eye positions. Both the dataset and the ToolBox are freely available to help design and assess visual saliency models for audiovisual dynamic stimuli.
ISSN:2381-8549
DOI:10.1109/ICIP.2017.8296592