Rapid Invariant Encoding of Scene Layout in Human OPA
Published in: Neuron (Cambridge, Mass.), 2019-07, Vol. 103 (1), p. 161-171.e3
Main Authors: Henriksson et al.
Format: Article
Language: English
Summary: Successful visual navigation requires a sense of the geometry of the local environment. How do our brains extract this information from retinal images? Here we visually presented scenes with all possible combinations of five scene-bounding elements (left, right, and back walls; ceiling; floor) to human subjects during functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG). The fMRI response patterns in the scene-responsive occipital place area (OPA) reflected scene layout with invariance to changes in surface texture. This result contrasted sharply with the primary visual cortex (V1), which reflected low-level image features of the stimuli, and the parahippocampal place area (PPA), which showed better texture than layout decoding. MEG indicated that the texture-invariant scene layout representation is computed from visual input within ∼100 ms, suggesting a rapid computational mechanism. Taken together, these results suggest that the cortical representation underlying our instant sense of the environmental geometry is located in the OPA.
Highlights:
• A complete set of scene layouts is constructed in three surface textures (96 stimuli)
• Layout discrimination in OPA generalizes across surface textures
• PPA shows better texture than layout decoding
• MEG reveals rapid emergence of the layout encoding
In Brief: Environmental boundaries, such as walls, define the geometry of the local environment. How do our brains extract this information from the visual input? Using brain imaging, Henriksson et al. show that scene layout is encoded by the scene-responsive OPA.
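The cross-texture generalization analysis summarized above can be sketched in a few lines. The sketch below is illustrative only, not the authors' code: the voxel response patterns are random placeholders and the classifier choice (a linear SVM via scikit-learn) is an assumption. It simply shows how all combinations of the five scene-bounding elements give 2^5 = 32 layouts, how three surface textures yield the 96 stimuli mentioned in the highlights, and how layout decoding can be tested on a held-out texture to probe texture invariance.

```python
# Illustrative sketch of leave-one-texture-out layout decoding (not the authors' code).
# Voxel patterns are random placeholders; above-chance accuracy on real fMRI data
# would indicate a texture-invariant representation of scene layout.
import itertools
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# 2^5 = 32 layouts: every combination of the five scene-bounding elements.
elements = ["left_wall", "right_wall", "back_wall", "ceiling", "floor"]
layouts = list(itertools.product([0, 1], repeat=len(elements)))  # 32 layouts
n_textures = 3                                                   # 32 x 3 = 96 stimuli

# Placeholder "response patterns": one voxel-response vector per stimulus.
n_voxels = 200
patterns = rng.standard_normal((n_textures, len(layouts), n_voxels))
layout_labels = np.arange(len(layouts))

# Train a layout classifier on two textures, test on the held-out texture.
accuracies = []
for test_tex in range(n_textures):
    train_tex = [t for t in range(n_textures) if t != test_tex]
    X_train = patterns[train_tex].reshape(-1, n_voxels)
    y_train = np.tile(layout_labels, len(train_tex))
    clf = LinearSVC().fit(X_train, y_train)
    accuracies.append(clf.score(patterns[test_tex], layout_labels))

print(f"mean cross-texture accuracy: {np.mean(accuracies):.3f} "
      f"(chance = {1 / len(layouts):.3f})")
```

With the random placeholder data the accuracy sits at chance (1/32); the point of the sketch is only the structure of the train-on-two-textures, test-on-the-third scheme.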
ISSN: 0896-6273, 1097-4199
DOI: 10.1016/j.neuron.2019.04.014