
Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables

The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness.


Bibliographic Details
Published in:Scientific data 2022-04, Vol.9 (1), p.158-158, Article 158
Main Authors: Saganowski, Stanisław, Komoszyńska, Joanna, Behnke, Maciej, Perz, Bartosz, Kunc, Dominika, Klich, Bartłomiej, Kaczmarek, Łukasz D., Kazienko, Przemysław
Format: Article
Language:English
Description: The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables were used to record physiological data: EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x), in parallel with upper-body videos. After each film clip, participants completed two types of self-reports: (1) ratings of the nine discrete emotions and (2) ratings of three affective dimensions: valence, arousal, and motivation. The obtained data facilitate various ER approaches, e.g., multimodal ER, EEG- vs. cardiovascular-based ER, and transitions from discrete to dimensional representations. The technical validation indicated that watching the film clips elicited the targeted emotions and supported the signals' high quality.
Measurement(s): cardiac output measurement • Electroencephalography • Galvanic Skin Response • Temperature • acceleration • facial expressions
Technology Type(s): photoplethysmogram • electroencephalogram (5 electrodes) • electrodermal activity measurement • Sensor • Accelerometer • Video Recording
Sample Characteristic - Organism: Homo sapiens
Sample Characteristic - Environment: laboratory environment
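As a rough illustration of the per-clip study design described above, the sketch below slices a wearable signal stream into the segment recorded during one film clip and bundles the two self-report types. The sampling rate, clip timestamps, and dictionary structure are illustrative assumptions, not the dataset's actual schema; only the emotion and dimension labels come from the record.

```python
# Sketch: per-film-clip segmentation and self-report bundling, mirroring the
# study design above. Sampling rate and clip times are placeholder values.

def clip_segment(samples, fs_hz, clip_start_s, clip_end_s):
    """Return the samples recorded while one film clip was playing."""
    start = int(clip_start_s * fs_hz)
    end = int(clip_end_s * fs_hz)
    return samples[start:end]

# Labels taken from the description; the surrounding structure is assumed.
DISCRETE_EMOTIONS = [
    "amusement", "awe", "enthusiasm", "liking", "surprise",
    "anger", "disgust", "fear", "sadness",
]
DIMENSIONS = ["valence", "arousal", "motivation"]

def make_self_report(discrete_ratings, dimension_ratings):
    """Bundle both self-report types collected after each film clip."""
    assert set(discrete_ratings) == set(DISCRETE_EMOTIONS)
    assert set(dimension_ratings) == set(DIMENSIONS)
    return {"discrete": discrete_ratings, "dimensions": dimension_ratings}
```

For example, a 64 Hz BVP stream sliced for a clip playing from second 2 to second 4 yields 128 samples.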
DOI: 10.1038/s41597-022-01262-0
PMID: 35393434
Rights: The Author(s) 2022. Published under a CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).
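Machine-readable citation metadata for this record can be requested through DOI content negotiation, a documented doi.org feature. The sketch below only builds the request with the standard library; actually sending it requires network access.

```python
# Sketch: asking doi.org for CSL JSON citation metadata via content
# negotiation. The request is constructed but not sent here.
from urllib.request import Request

DOI = "10.1038/s41597-022-01262-0"

def csl_json_request(doi):
    """Build a request whose Accept header selects CSL JSON metadata."""
    return Request(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
    )

req = csl_json_request(DOI)
```

Sending the same request with `Accept: application/x-bibtex` would return a BibTeX entry instead.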
ISSN: 2052-4463
EISSN: 2052-4463
Source: Publicly Available Content Database; PubMed Central; Springer Nature - nature.com Journals - Fully Open Access
Subjects:
639/705/117
706/689/477
Anger
Arousal
Data Descriptor
EEG
Emotional behavior
Emotions
Emotions - physiology
Facial Expression
Galvanic skin response
Humanities and Social Sciences
Humans
Motivation
multidisciplinary
Pattern recognition
Physiology
Sadness - psychology
Science
Science (multidisciplinary)
Self Report
url http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-27T17%3A30%3A44IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_doaj_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Emognition%20dataset:%20emotion%20recognition%20with%20self-reports,%20facial%20expressions,%20and%20physiology%20using%20wearables&rft.jtitle=Scientific%20data&rft.au=Saganowski,%20Stanis%C5%82aw&rft.date=2022-04-07&rft.volume=9&rft.issue=1&rft.spage=158&rft.epage=158&rft.pages=158-158&rft.artnum=158&rft.issn=2052-4463&rft.eissn=2052-4463&rft_id=info:doi/10.1038/s41597-022-01262-0&rft_dat=%3Cproquest_doaj_%3E2648897310%3C/proquest_doaj_%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c540t-f36deaf115008631b6efdf54a9945bd06d09f8d3f1a857fb6e3817806d0fe3ac3%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2647960937&rft_id=info:pmid/35393434&rfr_iscdi=true
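The access link above is an OpenURL (Z39.88-2004) that packs the citation into query parameters. A minimal sketch of decoding such parameters with Python's standard library follows; the sample query string is a shortened stand-in for the full resolver URL, and the resolver hostname is hypothetical.

```python
# Sketch: decoding OpenURL (Z39.88-2004) key/value pairs like those in the
# access link above. The sample URL is a shortened, hypothetical stand-in.
from urllib.parse import parse_qs, urlparse

sample = (
    "http://resolver.example/openurl?url_ver=Z39.88-2004"
    "&rft.jtitle=Scientific%20data&rft.date=2022-04-07"
    "&rft.volume=9&rft.issue=1&rft.spage=158"
    "&rft_id=info:doi/10.1038/s41597-022-01262-0"
)

def openurl_fields(url):
    """Return a flat dict of the first value for each OpenURL key."""
    query = urlparse(url).query
    return {key: values[0] for key, values in parse_qs(query).items()}

fields = openurl_fields(sample)
```

`parse_qs` also undoes percent-encoding, so `rft.jtitle` comes back as the plain journal title "Scientific data".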