
On the temporal dynamics of language-mediated vision and vision-mediated language

Recent converging evidence suggests that language and vision interact immediately in non-trivial ways, although the exact nature of this interaction is still unclear. Not only does linguistic information influence visual perception in real-time, but visual information also influences language comprehension in real-time. For example, in visual search tasks, incremental spoken delivery of the target features (e.g., “Is there a red vertical?”) can increase the efficiency of conjunction search because only one feature is heard at a time. Moreover, in spoken word recognition tasks, the visual presence of an object whose name is similar to the word being spoken (e.g., a candle present when instructed to “pick up the candy”) can alter the process of comprehension. Dense sampling methods, such as eye-tracking and reach-tracking, richly illustrate the nature of this interaction, providing a semi-continuous measure of the temporal dynamics of individual behavioral responses. We review a variety of studies that demonstrate how these methods are particularly promising in further elucidating the dynamic competition that takes place between underlying linguistic and visual representations in multimodal contexts, and we conclude with a discussion of the consequences that these findings have for theories of embodied cognition.

Bibliographic Details
Published in: Acta psychologica, 2011-06, Vol. 137 (2), p. 181-189
Main Authors: Anderson, Sarah E., Chiu, Eric, Huette, Stephanie, Spivey, Michael J.
Format: Article
Language:English
Subjects:
doi_str_mv 10.1016/j.actpsy.2010.09.008
format article
publisher Netherlands: Elsevier B.V.
pmid 20961519
coden APSOAZ
rights Copyright © 2010 Elsevier B.V. All rights reserved.
issn 0001-6918
eissn 1873-6297
source ScienceDirect; ScienceDirect Freedom Collection; Linguistics and Language Behavior Abstracts (LLBA)
subjects Comprehension - physiology
Eye Movements - physiology
Eye-tracking
Humans
Language
Psycholinguistics
Speech Perception - physiology
Visual perception
Visual Perception - physiology