GameVibe: a multimodal affective game corpus

As online video and streaming platforms continue to grow, affective computing research has undergone a shift towards more complex studies involving multiple modalities. However, there is still a lack of readily available datasets with high-quality audiovisual stimuli. In this paper, we present GameVibe, a novel affect corpus that consists of multimodal audiovisual stimuli, including in-game behavioural observations and third-person affect traces for viewer engagement. The corpus comprises videos from a diverse set of publicly available gameplay sessions across 30 games, with particular attention paid to ensuring high-quality stimuli and good audiovisual and gameplay diversity. Furthermore, we present an analysis of the reliability of the annotators in terms of inter-annotator agreement.
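
The record does not specify which agreement statistic the authors report, so the following is only an illustrative sketch: one common reliability measure for continuous affect traces such as engagement is Krippendorff's alpha with an interval metric, computed here from its definition over hypothetical annotator-by-time-window arrays (the array names, shapes, and noise model are assumptions for illustration, not part of the dataset).

import numpy as np

def _sum_sq_diffs(values: np.ndarray) -> float:
    # Sum of squared differences over all ordered pairs (i != j),
    # in closed form: 2 * (n * sum(x^2) - sum(x)^2).
    n = values.size
    return float(2.0 * (n * np.sum(values ** 2) - np.sum(values) ** 2))

def krippendorff_alpha_interval(ratings: np.ndarray) -> float:
    # ratings: (n_annotators, n_units) array with no missing values,
    # e.g. per-time-window engagement values from each annotator.
    m, n_units = ratings.shape
    # Observed disagreement: squared differences within each unit.
    d_o = sum(_sum_sq_diffs(ratings[:, u]) for u in range(n_units))
    d_o /= (m - 1) * m * n_units
    # Expected disagreement: squared differences over all pairable values.
    flat = ratings.ravel()
    d_e = _sum_sq_diffs(flat) / (flat.size * (flat.size - 1))
    return 1.0 - d_o / d_e

# Hypothetical usage: 5 annotators, 120 one-second windows of engagement in [0, 1].
rng = np.random.default_rng(0)
signal = rng.uniform(0.0, 1.0, size=120)                # shared underlying engagement
traces = signal + rng.normal(0.0, 0.1, size=(5, 120))   # per-annotator noisy traces
print(f"alpha = {krippendorff_alpha_interval(traces):.3f}")

Values near 1 indicate that annotators move their engagement traces consistently, while values near 0 indicate chance-level agreement. The GameVibe paper may well use a different measure tailored to ordinal or unbounded traces, so treat this purely as a generic illustration.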

Bibliographic Details
Published in: Scientific Data, 2024-11-29, Vol. 11 (1), Article 1306 (10 pages)
Main Authors: Barthet, Matthew; Kaselimi, Maria; Pinitas, Kosmas; Makantasis, Konstantinos; Liapis, Antonios; Yannakakis, Georgios N.
Format: Article (Data Descriptor)
Language: English
Publisher: Nature Publishing Group UK, London
DOI: 10.1038/s41597-024-04022-4
ISSN/EISSN: 2052-4463
PMID: 39613775
Subjects: Affect; Annotations; Data collection; Datasets; Design; Games; Genre; Humans; Quality control; Sensory integration; Streaming media; User behavior; Video Games
Online Access: Publicly Available Content (ProQuest); PubMed Central; Springer Nature - nature.com Journals - Fully Open Access