
Generative Adversarial Networks Conditioned on Brain Activity Reconstruct Seen Images

Bibliographic Details
Published in: 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), 2018-10, Vol. 2018, p. 1054-1061
Main Authors: St-Yves, Ghislain, Naselaris, Thomas
Format: Article
Language:English
Description
Summary:We consider the inference problem of reconstructing a visual stimulus from brain activity measurements (e.g. fMRI) that encode this stimulus. Recovering a complete image is complicated by the fact that neural representations are noisy, high-dimensional, and contain incomplete information about image details. Thus, reconstructions of complex images from brain activity require a strong prior. Here we propose to train generative adversarial networks (GANs) to learn a generative model of images that is conditioned on measurements of brain activity. We consider two challenges of this approach: First, given that GANs require far more data to train than is typically collected in an fMRI experiment, how do we obtain enough samples to train a GAN that is conditioned on brain activity? Second, how do we ensure that our generated samples are robust against the noise present in fMRI data? Our strategy for surmounting both of these problems centers on the creation of surrogate brain activity samples generated by an encoding model. We find that the generative model thus trained generalizes to real fMRI data measured during perception of images and is able to reconstruct the basic outline of the stimuli.
ISSN:1062-922X
2577-1655
DOI:10.1109/SMC.2018.00187
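
The abstract describes the approach only at a high level. The following is a minimal, illustrative sketch (PyTorch) of the general idea: a GAN whose generator and discriminator are both conditioned on a brain-activity vector, trained on surrogate voxel responses produced by an encoding model. The stand-in linear encoding model, layer sizes, noise levels, and training loop here are assumptions for illustration only and do not reproduce the authors' architecture or data.

```python
# Minimal sketch: a conditional GAN trained on surrogate brain-activity samples.
# All dimensions and the linear "encoding model" are illustrative assumptions.
import torch
import torch.nn as nn

IMG_DIM, VOXELS, NOISE_DIM = 64 * 64, 200, 100

# Stand-in encoding model: maps an image to simulated voxel responses.
encoding_model = nn.Linear(IMG_DIM, VOXELS)

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + VOXELS, 512), nn.ReLU(),
            nn.Linear(512, IMG_DIM), nn.Tanh())
    def forward(self, z, voxels):
        # Condition the generator by concatenating noise and voxel responses.
        return self.net(torch.cat([z, voxels], dim=1))

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(IMG_DIM + VOXELS, 512), nn.LeakyReLU(0.2),
            nn.Linear(512, 1))
    def forward(self, img, voxels):
        # Condition the discriminator on the same voxel responses.
        return self.net(torch.cat([img, voxels], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real_imgs = torch.rand(32, IMG_DIM) * 2 - 1           # placeholder image batch in [-1, 1]
    with torch.no_grad():                                  # surrogate fMRI responses + noise
        voxels = encoding_model(real_imgs) + 0.1 * torch.randn(32, VOXELS)

    # Discriminator step: real vs. generated images, both paired with voxels.
    z = torch.randn(32, NOISE_DIM)
    fake_imgs = G(z, voxels).detach()
    d_loss = bce(D(real_imgs, voxels), torch.ones(32, 1)) + \
             bce(D(fake_imgs, voxels), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: fool the discriminator given the same conditioning voxels.
    g_loss = bce(D(G(z, voxels), voxels), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

At reconstruction time, the conditioning vector would be a measured (rather than surrogate) brain-activity pattern, and images would be drawn by sampling the noise input while holding that pattern fixed.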