Tracking sentence comprehension: Test-retest reliability in people with aphasia and unimpaired adults

Bibliographic Details
Published in: Journal of Neurolinguistics, 2016-11, Vol. 40, pp. 98-111
Main Authors: Mack, Jennifer E.; Wei, Andrew Zu-Sern; Gutierrez, Stephanie; Thompson, Cynthia K.
Format: Article
Language: English
Description
Summary: Visual-world eyetracking is increasingly used to investigate online language processing in normal and language-impaired listeners. Tracking changes in eye movements over time may also be useful for indexing language recovery in those with language impairments. Therefore, it is critical to determine the test-retest reliability of results obtained using this method. Unimpaired young adults and people with aphasia took part in two eyetracking sessions spaced about one week apart. In each session, participants completed a sentence-picture matching task in which they listened to active and passive sentences (e.g., The [N1+Aux woman was] [V visiting/visited] [NP/PP2 (by) the man]) and selected between two pictures with reversed thematic roles. We used intraclass correlations (ICCs) to examine the test-retest reliability of response measures (accuracy, reaction time (RT)) and online eye movements (i.e., the likelihood of fixating the target picture in each region of the sentence) in each participant group. In the unimpaired adults, accuracy was at ceiling (thus ICCs were not computed), with moderate ICCs for RT (i.e., 0.4–0.58) for passive sentences and low (…
ISSN: 0911-6044, 1873-8052
DOI: 10.1016/j.jneuroling.2016.06.001
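Note: The abstract above reports intraclass correlations (ICCs) as its test-retest reliability index for measures collected in two sessions. The article's own analysis code is not part of this record; the following is a minimal illustrative sketch (not the authors' implementation) of one common choice for this design, the Shrout & Fleiss ICC(2,1) (two-way random effects, absolute agreement, single measure), applied to a hypothetical subjects-by-sessions matrix of reaction times. All names and data values here are invented for illustration.

import numpy as np

def icc2_1(ratings):
    """Shrout & Fleiss ICC(2,1): two-way random effects,
    absolute agreement, single measure.
    `ratings` is an (n_subjects, k_sessions) array."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-session means
    # Mean squares from the two-way ANOVA decomposition
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    ss_total = np.sum((ratings - grand) ** 2)
    ss_error = ss_total - ms_rows * (n - 1) - ms_cols * (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    # ICC(2,1) per Shrout & Fleiss (1979)
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical example: RTs (ms) for 5 participants, sessions 1 and 2
rts = np.array([[812, 850],
                [1034, 990],
                [765, 802],
                [1210, 1185],
                [923, 960]])
print(round(icc2_1(rts), 2))

On common benchmarks, values in the 0.4-0.58 range the abstract reports for RT would be read as fair-to-moderate reliability, consistent with the abstract's own characterization.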