
A common representation of time across visual and auditory modalities


Bibliographic Details
Published in: Neuropsychologia 2018-10, Vol. 119, p. 223-232
Main Authors: Barne, Louise C., Sato, João R., de Camargo, Raphael Y., Claessens, Peter M.E., Caetano, Marcelo S., Cravo, André M.
Format: Article
Language:English
Description
Summary: Humans' and non-human animals' ability to process time on the scale of milliseconds and seconds is essential for adaptive behaviour. A central question in how brains keep track of time is how specific temporal information is to each sensory modality. In the present study, we show that the encoding of temporal intervals in the auditory and visual modalities is qualitatively similar. Human participants were instructed to reproduce intervals in the range from 750 ms to 1500 ms marked by auditory or visual stimuli. Our behavioural results suggest that, although participants were more accurate in reproducing intervals marked by auditory stimuli, there was a strong correlation in performance between modalities. Using multivariate pattern analysis (MVPA) of scalp EEG, we show that activity during late periods of the intervals was similar within and between modalities. Critically, we show that a multivariate pattern classifier was able to accurately predict the elapsed interval even when trained on intervals marked by stimuli of a different sensory modality. Taken together, our results suggest that, while there are differences in the processing of intervals marked by auditory and visual stimuli, they also share a common neural representation.
•Using EEG we investigated the encoding of temporal intervals in audition and vision.
•EEG activity in late periods was similar within and between modalities.
•An MVPA classifier was able to predict time within and between modalities.
ISSN:0028-3932
1873-3514
DOI:10.1016/j.neuropsychologia.2018.08.014
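The cross-modal decoding described in the abstract — training a pattern classifier on trials from one sensory modality and testing it on the other — can be illustrated with a minimal sketch. This is not the authors' pipeline: the data below are synthetic (a shared channel pattern that scales with the elapsed interval, plus noise), and all names and parameters are assumptions for illustration only.

```python
# Hypothetical sketch of cross-modal MVPA decoding (synthetic data,
# NOT the authors' analysis): train on "auditory" trials, test on
# "visual" trials that share the same underlying temporal code.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels = 200, 32
# Interval labels (ms) in the range used in the study.
intervals = rng.choice([750, 1000, 1250, 1500], size=n_trials)

# Assumed shared "temporal code": one channel pattern whose amplitude
# scales with the elapsed interval, common to both modalities.
shared_pattern = rng.normal(size=n_channels)

def simulate_modality():
    # Modality-specific offset + trial noise on top of the shared code.
    offset = rng.normal(scale=0.01, size=n_channels)
    signal = (intervals[:, None] / 1500.0) * shared_pattern + offset
    return signal + rng.normal(scale=0.05, size=(n_trials, n_channels))

X_aud = simulate_modality()  # "auditory" epochs (trials x channels)
X_vis = simulate_modality()  # "visual" epochs

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_aud, intervals)          # train on one modality
acc = clf.score(X_vis, intervals)  # generalise to the other
print(f"cross-modal decoding accuracy: {acc:.2f} (chance = 0.25)")
```

Because the simulated modalities share the interval-dependent pattern, the classifier generalises across them — the qualitative point the abstract makes with EEG activity in late periods of the interval.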