
Data-driven analysis of functional brain interactions during free listening to music and speech

Bibliographic Details
Published in: Brain imaging and behavior, 2015-06, Vol. 9 (2), p. 162-177
Main Authors: Fang, Jun; Hu, Xintao; Han, Junwei; Jiang, Xi; Zhu, Dajiang; Guo, Lei; Liu, Tianming
Format: Article
Language: English
Description
Natural stimulus functional magnetic resonance imaging (N-fMRI), such as fMRI acquired while participants watch video streams or listen to audio streams, has been increasingly used in recent years to investigate functional mechanisms of the human brain. One of the fundamental challenges in functional brain mapping based on N-fMRI is to model the brain’s functional responses to continuous, naturalistic and dynamic natural stimuli. To address this challenge, in this paper we present a data-driven approach to exploring functional interactions in the human brain during free listening to music and speech streams. Specifically, we model the brain responses measured by N-fMRI as functional interactions on large-scale brain networks with intrinsically established structural correspondence, and perform music and speech classification tasks to guide the systematic identification of consistent and discriminative functional interactions when multiple subjects listen to music and speech of multiple categories. The underlying premise is that the functional interactions derived from the N-fMRI data of multiple subjects should exhibit both consistency and discriminability. Our experimental results show that a variety of brain systems, including attention, memory, auditory/language, emotion and action networks, are among the most relevant brain systems involved in differentiating classical music, pop music and speech. Our study provides an alternative approach to investigating the human brain’s mechanisms for comprehending complex natural music and speech.
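The abstract's premise — that a useful functional-interaction feature should be consistent across subjects within a stimulus category and discriminative between categories — can be illustrated with a minimal feature-scoring sketch. This is not the authors' actual method; the function name and the Fisher-style ratio are illustrative assumptions only:

```python
import numpy as np

def score_interactions(X, y):
    """Toy scoring of functional-interaction features.

    X : array of shape (n_subjects, n_features), one row per subject,
        one column per functional-interaction feature.
    y : array of shape (n_subjects,), the stimulus category each
        subject's features were derived from (e.g. classical/pop/speech).

    A feature scores high when it is *consistent* (low variance within
    each category) and *discriminative* (category means far apart).
    """
    classes = np.unique(y)
    # Consistency: within-class variance, averaged over classes (lower is better)
    within = np.mean([X[y == c].var(axis=0) for c in classes], axis=0)
    # Discriminability: spread of the per-class means (higher is better)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    between = means.var(axis=0)
    # Fisher-style ratio: high = consistent AND discriminative
    return between / (within + 1e-12)
```

In a pipeline of the kind the abstract describes, such scores could rank features before training a classifier, so that only interactions exhibiting both properties drive the music/speech differentiation.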
DOI: 10.1007/s11682-014-9293-0
Publisher: Springer US, New York
PMID: 24526569
ISSN: 1931-7557
EISSN: 1931-7565
Source: Springer Nature
Subjects:
Acoustic Stimulation - methods
Acoustics
Auditory Perception - physiology
Biomedical and Life Sciences
Biomedicine
Brain - physiology
Brain Mapping - methods
Brain research
Humans
Listening
Magnetic resonance imaging
Magnetic Resonance Imaging - methods
Medical imaging
Memory
Models, Neurological
Multimedia
Music
Neuroimaging
Neuropsychology
Neuroradiology
Neurosciences
Original Research
Psychiatry
Semantics
Speech