Detecting Buildings and Nonbuildings from Satellite Images Using U-Net
Published in: | Computational intelligence and neuroscience 2022-05, Vol.2022, p.4831223-13 |
---|---|
Main Authors: | Alsabhan, Waleed; Alotaiby, Turky; Dudin, Basil |
Format: | Article |
Language: | English |
container_end_page | 13 |
---|---|
container_issue | |
container_start_page | 4831223 |
container_title | Computational intelligence and neuroscience |
container_volume | 2022 |
creator | Alsabhan, Waleed; Alotaiby, Turky; Dudin, Basil |
description | Automatic building detection from high-resolution satellite images has many applications. Understanding socioeconomic development and keeping track of population migrations are essential for effective civic planning, and such systems can also help update maps after natural disasters or in geographic regions undergoing rapid population expansion. A variety of image processing techniques have been applied to this goal, but they are often inaccurate or slow. Here, a convolutional neural network (CNN) for extracting buildings from satellite images is built on U-Net, an architecture first developed for medical image segmentation. One advantage of U-Net is that it reaches high accuracy from a limited amount of training material with minimal effort and training time, which suits the small open dataset of variable-shape RGB images used here. The encoder portion of U-Net was replaced with pretrained backbones to test the feasibility of transfer learning; VGGNet and ResNet were both used for this purpose, and the results were compared with a bespoke U-Net designed from the ground up. The VGGNet backbone proved the best feature extractor, with an accuracy of 84.9%. Compared with the current best models tackling a similar problem on a larger dataset, the present results are superior. (An illustrative sketch of the encoder-swap setup follows the record fields below.) |
doi_str_mv | 10.1155/2022/4831223 |
format | article |
fullrecord | <record><control><sourceid>gale_pubme</sourceid><recordid>TN_cdi_pubmedcentral_primary_oai_pubmedcentral_nih_gov_9098279</recordid><sourceformat>XML</sourceformat><sourcesystem>PC</sourcesystem><galeid>A703852586</galeid><sourcerecordid>A703852586</sourcerecordid><originalsourceid>FETCH-LOGICAL-c476t-e4fc8f03b0fd9497b442a8a819386f98a9da1288d30932c99b9094a7b217cb593</originalsourceid><addsrcrecordid>eNp9kclP3DAUhy1ExVZunFEkLkglxftyQWIvEqKHds6W4ziDUWJDnID63-NohmE59OQn-_P3_PwDYA_BnwgxdowhxsdUEoQxWQNbiEtRMizI-qrmbBNsp_QAIRMM4g2wSRgTSEC5Ba4u3ODs4MO8OBt9W-ciFSbUxV0M1Wqj6WNX_DGDa1s_uOKmM3OXilmars3KOzd8B98a0ya3u1x3wOzq8u_5r_L29_XN-eltaangQ-loY2UDSQWbWlElKkqxkUYiRSRvlDSqNghLWROoCLZKVQoqakSFkbAVU2QHnCy8j2PVudq6MPSm1Y-970z_T0fj9eeT4O_1PD7r7JFYTILDpaCPT6NLg-58snkuE1wck8acMwQpZTCjB1_Qhzj2IY83UZTnz-fsnZqb1mkfmpj72kmqTwUkkmEmeaaOFpTtY0q9a1ZPRlBPMeopRr2MMeP7H8dcwW-5ZeDHArj3oTYv_v-6V8XEokg</addsrcrecordid><sourcetype>Open Access Repository</sourcetype><iscdi>true</iscdi><recordtype>article</recordtype><pqid>2664615565</pqid></control><display><type>article</type><title>Detecting Buildings and Nonbuildings from Satellite Images Using U-Net</title><source>Publicly Available Content Database (Proquest) (PQ_SDU_P3)</source><source>Wiley Online Library Open Access</source><creator>Alsabhan, Waleed ; Alotaiby, Turky ; Dudin, Basil</creator><contributor>Bhardwaj, Arpit ; Arpit Bhardwaj</contributor><creatorcontrib>Alsabhan, Waleed ; Alotaiby, Turky ; Dudin, Basil ; Bhardwaj, Arpit ; Arpit Bhardwaj</creatorcontrib><description>Automatic building detection from high-resolution satellite imaging images has many applications. Understanding socioeconomic development and keeping track of population migrations are essential for effective civic planning. These civil feature systems may also help update maps after natural disasters or in geographic regions undergoing dramatic population expansion. To accomplish the desired goal, a variety of image processing techniques were employed. They are often inaccurate or take a long time to process. Convolutional neural networks (CNNs) are being designed to extract buildings from satellite images, based on the U-Net, which was first developed to segment medical images. The minimal number of images from the open dataset, in RGB format with variable shapes, reveals one of the advantages of the U-Net; that is, it develops excellent accuracy from a limited amount of training material with minimal effort and training time. The encoder portion of U-Net was altered to test the feasibility of using a transfer learning facility. VGGNet and ResNet were both used for the same purpose. The findings of these models were also compared to our own bespoke U-Net, which was designed from the ground up. With an accuracy of 84.9%, the VGGNet backbone was shown to be the best feature extractor. 
Compared to the current best models for tackling a similar problem with a larger dataset, the present results are considered superior.</description><identifier>ISSN: 1687-5265</identifier><identifier>ISSN: 1687-5273</identifier><identifier>EISSN: 1687-5273</identifier><identifier>DOI: 10.1155/2022/4831223</identifier><identifier>PMID: 35571708</identifier><language>eng</language><publisher>United States: Hindawi</publisher><subject>Accuracy ; Artificial neural networks ; Automation ; Buildings ; Coders ; Datasets ; Deep learning ; Feature extraction ; Geospatial data ; Image processing ; Image Processing, Computer-Assisted ; Image resolution ; Medical imaging ; Medical imaging equipment ; Natural disasters ; Neural networks ; Neural Networks, Computer ; Random variables ; Remote sensing ; Rural areas ; Satellite imagery ; Satellite imaging ; Satellite tracking ; Semantics ; System effectiveness ; Technology application ; Training ; Transfer learning ; Unmanned aerial vehicles</subject><ispartof>Computational intelligence and neuroscience, 2022-05, Vol.2022, p.4831223-13</ispartof><rights>Copyright © 2022 Waleed Alsabhan et al.</rights><rights>COPYRIGHT 2022 John Wiley & Sons, Inc.</rights><rights>Copyright © 2022 Waleed Alsabhan et al. This is an open access article distributed under the Creative Commons Attribution License (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. https://creativecommons.org/licenses/by/4.0</rights><rights>Copyright © 2022 Waleed Alsabhan et al. 2022</rights><lds50>peer_reviewed</lds50><oa>free_for_read</oa><woscitedreferencessubscribed>false</woscitedreferencessubscribed><citedby>FETCH-LOGICAL-c476t-e4fc8f03b0fd9497b442a8a819386f98a9da1288d30932c99b9094a7b217cb593</citedby><cites>FETCH-LOGICAL-c476t-e4fc8f03b0fd9497b442a8a819386f98a9da1288d30932c99b9094a7b217cb593</cites><orcidid>0000-0002-6921-8812 ; 0000-0002-0924-1746</orcidid></display><links><openurl>$$Topenurl_article</openurl><openurlfulltext>$$Topenurlfull_article</openurlfulltext><thumbnail>$$Tsyndetics_thumb_exl</thumbnail><linktopdf>$$Uhttps://www.proquest.com/docview/2664615565/fulltextPDF?pq-origsite=primo$$EPDF$$P50$$Gproquest$$Hfree_for_read</linktopdf><linktohtml>$$Uhttps://www.proquest.com/docview/2664615565?pq-origsite=primo$$EHTML$$P50$$Gproquest$$Hfree_for_read</linktohtml><link.rule.ids>230,314,780,784,885,25752,27923,27924,37011,37012,44589,74997</link.rule.ids><backlink>$$Uhttps://www.ncbi.nlm.nih.gov/pubmed/35571708$$D View this record in MEDLINE/PubMed$$Hfree_for_read</backlink></links><search><contributor>Bhardwaj, Arpit</contributor><contributor>Arpit Bhardwaj</contributor><creatorcontrib>Alsabhan, Waleed</creatorcontrib><creatorcontrib>Alotaiby, Turky</creatorcontrib><creatorcontrib>Dudin, Basil</creatorcontrib><title>Detecting Buildings and Nonbuildings from Satellite Images Using U-Net</title><title>Computational intelligence and neuroscience</title><addtitle>Comput Intell Neurosci</addtitle><description>Automatic building detection from high-resolution satellite imaging images has many applications. Understanding socioeconomic development and keeping track of population migrations are essential for effective civic planning. These civil feature systems may also help update maps after natural disasters or in geographic regions undergoing dramatic population expansion. 
To accomplish the desired goal, a variety of image processing techniques were employed. They are often inaccurate or take a long time to process. Convolutional neural networks (CNNs) are being designed to extract buildings from satellite images, based on the U-Net, which was first developed to segment medical images. The minimal number of images from the open dataset, in RGB format with variable shapes, reveals one of the advantages of the U-Net; that is, it develops excellent accuracy from a limited amount of training material with minimal effort and training time. The encoder portion of U-Net was altered to test the feasibility of using a transfer learning facility. VGGNet and ResNet were both used for the same purpose. The findings of these models were also compared to our own bespoke U-Net, which was designed from the ground up. With an accuracy of 84.9%, the VGGNet backbone was shown to be the best feature extractor. Compared to the current best models for tackling a similar problem with a larger dataset, the present results are considered superior.</description><subject>Accuracy</subject><subject>Artificial neural networks</subject><subject>Automation</subject><subject>Buildings</subject><subject>Coders</subject><subject>Datasets</subject><subject>Deep learning</subject><subject>Feature extraction</subject><subject>Geospatial data</subject><subject>Image processing</subject><subject>Image Processing, Computer-Assisted</subject><subject>Image resolution</subject><subject>Medical imaging</subject><subject>Medical imaging equipment</subject><subject>Natural disasters</subject><subject>Neural networks</subject><subject>Neural Networks, Computer</subject><subject>Random variables</subject><subject>Remote sensing</subject><subject>Rural areas</subject><subject>Satellite imagery</subject><subject>Satellite imaging</subject><subject>Satellite tracking</subject><subject>Semantics</subject><subject>System effectiveness</subject><subject>Technology application</subject><subject>Training</subject><subject>Transfer learning</subject><subject>Unmanned aerial vehicles</subject><issn>1687-5265</issn><issn>1687-5273</issn><issn>1687-5273</issn><fulltext>true</fulltext><rsrctype>article</rsrctype><creationdate>2022</creationdate><recordtype>article</recordtype><sourceid>PIMPY</sourceid><recordid>eNp9kclP3DAUhy1ExVZunFEkLkglxftyQWIvEqKHds6W4ziDUWJDnID63-NohmE59OQn-_P3_PwDYA_BnwgxdowhxsdUEoQxWQNbiEtRMizI-qrmbBNsp_QAIRMM4g2wSRgTSEC5Ba4u3ODs4MO8OBt9W-ciFSbUxV0M1Wqj6WNX_DGDa1s_uOKmM3OXilmars3KOzd8B98a0ya3u1x3wOzq8u_5r_L29_XN-eltaangQ-loY2UDSQWbWlElKkqxkUYiRSRvlDSqNghLWROoCLZKVQoqakSFkbAVU2QHnCy8j2PVudq6MPSm1Y-970z_T0fj9eeT4O_1PD7r7JFYTILDpaCPT6NLg-58snkuE1wck8acMwQpZTCjB1_Qhzj2IY83UZTnz-fsnZqb1mkfmpj72kmqTwUkkmEmeaaOFpTtY0q9a1ZPRlBPMeopRr2MMeP7H8dcwW-5ZeDHArj3oTYv_v-6V8XEokg</recordid><startdate>20220505</startdate><enddate>20220505</enddate><creator>Alsabhan, Waleed</creator><creator>Alotaiby, Turky</creator><creator>Dudin, Basil</creator><general>Hindawi</general><general>John Wiley & Sons, Inc</general><general>Hindawi 
Limited</general><scope>RHU</scope><scope>RHW</scope><scope>RHX</scope><scope>CGR</scope><scope>CUY</scope><scope>CVF</scope><scope>ECM</scope><scope>EIF</scope><scope>NPM</scope><scope>AAYXX</scope><scope>CITATION</scope><scope>3V.</scope><scope>7QF</scope><scope>7QQ</scope><scope>7SC</scope><scope>7SE</scope><scope>7SP</scope><scope>7SR</scope><scope>7TA</scope><scope>7TB</scope><scope>7TK</scope><scope>7U5</scope><scope>7X7</scope><scope>7XB</scope><scope>8AL</scope><scope>8BQ</scope><scope>8FD</scope><scope>8FE</scope><scope>8FG</scope><scope>8FH</scope><scope>8FI</scope><scope>8FJ</scope><scope>8FK</scope><scope>ABJCF</scope><scope>ABUWG</scope><scope>AFKRA</scope><scope>ARAPS</scope><scope>AZQEC</scope><scope>BBNVY</scope><scope>BENPR</scope><scope>BGLVJ</scope><scope>BHPHI</scope><scope>CCPQU</scope><scope>CWDGH</scope><scope>DWQXO</scope><scope>F28</scope><scope>FR3</scope><scope>FYUFA</scope><scope>GHDGH</scope><scope>GNUQQ</scope><scope>H8D</scope><scope>H8G</scope><scope>HCIFZ</scope><scope>JG9</scope><scope>JQ2</scope><scope>K7-</scope><scope>K9.</scope><scope>KR7</scope><scope>L6V</scope><scope>L7M</scope><scope>LK8</scope><scope>L~C</scope><scope>L~D</scope><scope>M0N</scope><scope>M0S</scope><scope>M1P</scope><scope>M7P</scope><scope>M7S</scope><scope>P5Z</scope><scope>P62</scope><scope>PIMPY</scope><scope>PQEST</scope><scope>PQQKQ</scope><scope>PQUKI</scope><scope>PRINS</scope><scope>PSYQQ</scope><scope>PTHSS</scope><scope>Q9U</scope><scope>7X8</scope><scope>5PM</scope><orcidid>https://orcid.org/0000-0002-6921-8812</orcidid><orcidid>https://orcid.org/0000-0002-0924-1746</orcidid></search><sort><creationdate>20220505</creationdate><title>Detecting Buildings and Nonbuildings from Satellite Images Using U-Net</title><author>Alsabhan, Waleed ; Alotaiby, Turky ; Dudin, Basil</author></sort><facets><frbrtype>5</frbrtype><frbrgroupid>cdi_FETCH-LOGICAL-c476t-e4fc8f03b0fd9497b442a8a819386f98a9da1288d30932c99b9094a7b217cb593</frbrgroupid><rsrctype>articles</rsrctype><prefilter>articles</prefilter><language>eng</language><creationdate>2022</creationdate><topic>Accuracy</topic><topic>Artificial neural networks</topic><topic>Automation</topic><topic>Buildings</topic><topic>Coders</topic><topic>Datasets</topic><topic>Deep learning</topic><topic>Feature extraction</topic><topic>Geospatial data</topic><topic>Image processing</topic><topic>Image Processing, Computer-Assisted</topic><topic>Image resolution</topic><topic>Medical imaging</topic><topic>Medical imaging equipment</topic><topic>Natural disasters</topic><topic>Neural networks</topic><topic>Neural Networks, Computer</topic><topic>Random variables</topic><topic>Remote sensing</topic><topic>Rural areas</topic><topic>Satellite imagery</topic><topic>Satellite imaging</topic><topic>Satellite tracking</topic><topic>Semantics</topic><topic>System effectiveness</topic><topic>Technology application</topic><topic>Training</topic><topic>Transfer learning</topic><topic>Unmanned aerial vehicles</topic><toplevel>peer_reviewed</toplevel><toplevel>online_resources</toplevel><creatorcontrib>Alsabhan, Waleed</creatorcontrib><creatorcontrib>Alotaiby, Turky</creatorcontrib><creatorcontrib>Dudin, Basil</creatorcontrib><collection>Hindawi Publishing Complete</collection><collection>Hindawi Publishing Subscription Journals</collection><collection>Hindawi Publishing Open Access Journals</collection><collection>Medline</collection><collection>MEDLINE</collection><collection>MEDLINE 
(Ovid)</collection><collection>MEDLINE</collection><collection>MEDLINE</collection><collection>PubMed</collection><collection>CrossRef</collection><collection>ProQuest Central (Corporate)</collection><collection>Aluminium Industry Abstracts</collection><collection>Ceramic Abstracts</collection><collection>Computer and Information Systems Abstracts</collection><collection>Corrosion Abstracts</collection><collection>Electronics & Communications Abstracts</collection><collection>Engineered Materials Abstracts</collection><collection>Materials Business File</collection><collection>Mechanical & Transportation Engineering Abstracts</collection><collection>Neurosciences Abstracts</collection><collection>Solid State and Superconductivity Abstracts</collection><collection>Health & Medical Collection</collection><collection>ProQuest Central (purchase pre-March 2016)</collection><collection>Computing Database (Alumni Edition)</collection><collection>METADEX</collection><collection>Technology Research Database</collection><collection>ProQuest SciTech Collection</collection><collection>ProQuest Technology Collection</collection><collection>ProQuest Natural Science Collection</collection><collection>Hospital Premium Collection</collection><collection>Hospital Premium Collection (Alumni Edition)</collection><collection>ProQuest Central (Alumni) (purchase pre-March 2016)</collection><collection>Materials Science & Engineering Collection</collection><collection>ProQuest Central (Alumni)</collection><collection>ProQuest Central</collection><collection>Advanced Technologies & Aerospace Collection</collection><collection>ProQuest Central Essentials</collection><collection>Biological Science Collection</collection><collection>ProQuest Databases</collection><collection>Technology Collection</collection><collection>ProQuest Natural Science Collection</collection><collection>ProQuest One Community College</collection><collection>Middle East & Africa Database</collection><collection>ProQuest Central</collection><collection>ANTE: Abstracts in New Technology & Engineering</collection><collection>Engineering Research Database</collection><collection>Health Research Premium Collection</collection><collection>Health Research Premium Collection (Alumni)</collection><collection>ProQuest Central Student</collection><collection>Aerospace Database</collection><collection>Copper Technical Reference Library</collection><collection>SciTech Premium Collection</collection><collection>Materials Research Database</collection><collection>ProQuest Computer Science Collection</collection><collection>Computer Science Database</collection><collection>ProQuest Health & Medical Complete (Alumni)</collection><collection>Civil Engineering Abstracts</collection><collection>ProQuest Engineering Collection</collection><collection>Advanced Technologies Database with Aerospace</collection><collection>ProQuest Biological Science Collection</collection><collection>Computer and Information Systems Abstracts Academic</collection><collection>Computer and Information Systems Abstracts Professional</collection><collection>Computing Database</collection><collection>Health & Medical Collection (Alumni Edition)</collection><collection>Medical Database</collection><collection>Biological Science Database</collection><collection>Engineering Database</collection><collection>Advanced Technologies & Aerospace Database</collection><collection>ProQuest Advanced Technologies & Aerospace Collection</collection><collection>Publicly Available Content Database 
(Proquest) (PQ_SDU_P3)</collection><collection>ProQuest One Academic Eastern Edition (DO NOT USE)</collection><collection>ProQuest One Academic</collection><collection>ProQuest One Academic UKI Edition</collection><collection>ProQuest Central China</collection><collection>ProQuest One Psychology</collection><collection>Engineering Collection</collection><collection>ProQuest Central Basic</collection><collection>MEDLINE - Academic</collection><collection>PubMed Central (Full Participant titles)</collection><jtitle>Computational intelligence and neuroscience</jtitle></facets><delivery><delcategory>Remote Search Resource</delcategory><fulltext>fulltext</fulltext></delivery><addata><au>Alsabhan, Waleed</au><au>Alotaiby, Turky</au><au>Dudin, Basil</au><au>Bhardwaj, Arpit</au><au>Arpit Bhardwaj</au><format>journal</format><genre>article</genre><ristype>JOUR</ristype><atitle>Detecting Buildings and Nonbuildings from Satellite Images Using U-Net</atitle><jtitle>Computational intelligence and neuroscience</jtitle><addtitle>Comput Intell Neurosci</addtitle><date>2022-05-05</date><risdate>2022</risdate><volume>2022</volume><spage>4831223</spage><epage>13</epage><pages>4831223-13</pages><issn>1687-5265</issn><issn>1687-5273</issn><eissn>1687-5273</eissn><abstract>Automatic building detection from high-resolution satellite imaging images has many applications. Understanding socioeconomic development and keeping track of population migrations are essential for effective civic planning. These civil feature systems may also help update maps after natural disasters or in geographic regions undergoing dramatic population expansion. To accomplish the desired goal, a variety of image processing techniques were employed. They are often inaccurate or take a long time to process. Convolutional neural networks (CNNs) are being designed to extract buildings from satellite images, based on the U-Net, which was first developed to segment medical images. The minimal number of images from the open dataset, in RGB format with variable shapes, reveals one of the advantages of the U-Net; that is, it develops excellent accuracy from a limited amount of training material with minimal effort and training time. The encoder portion of U-Net was altered to test the feasibility of using a transfer learning facility. VGGNet and ResNet were both used for the same purpose. The findings of these models were also compared to our own bespoke U-Net, which was designed from the ground up. With an accuracy of 84.9%, the VGGNet backbone was shown to be the best feature extractor. Compared to the current best models for tackling a similar problem with a larger dataset, the present results are considered superior.</abstract><cop>United States</cop><pub>Hindawi</pub><pmid>35571708</pmid><doi>10.1155/2022/4831223</doi><tpages>13</tpages><orcidid>https://orcid.org/0000-0002-6921-8812</orcidid><orcidid>https://orcid.org/0000-0002-0924-1746</orcidid><oa>free_for_read</oa></addata></record> |
identifier | ISSN: 1687-5265 |
ispartof | Computational intelligence and neuroscience, 2022-05, Vol.2022, p.4831223-13 |
issn | 1687-5265 1687-5273 |
language | eng |
source | Publicly Available Content Database (Proquest) (PQ_SDU_P3); Wiley Online Library Open Access |
subjects | Accuracy; Artificial neural networks; Automation; Buildings; Coders; Datasets; Deep learning; Feature extraction; Geospatial data; Image processing; Image Processing, Computer-Assisted; Image resolution; Medical imaging; Medical imaging equipment; Natural disasters; Neural networks; Neural Networks, Computer; Random variables; Remote sensing; Rural areas; Satellite imagery; Satellite imaging; Satellite tracking; Semantics; System effectiveness; Technology application; Training; Transfer learning; Unmanned aerial vehicles |
title | Detecting Buildings and Nonbuildings from Satellite Images Using U-Net |
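The description above outlines the approach: a U-Net whose encoder is swapped for a pretrained backbone (VGGNet or ResNet) and trained to segment buildings from RGB satellite tiles. The following is a minimal, hypothetical sketch of such a setup in TensorFlow/Keras, not the authors' published code; the input size, frozen encoder, skip-connection layers, and decoder filter counts are illustrative assumptions.

```python
# Illustrative sketch only: a U-Net-style binary segmentation model whose
# encoder is an ImageNet-pretrained VGG16, mirroring the "VGGNet backbone"
# transfer-learning idea described in the abstract.
import tensorflow as tf
from tensorflow.keras import layers, Model

def vgg16_unet(input_shape=(256, 256, 3)):
    # Pretrained VGG16 encoder, frozen so only the decoder trains at first.
    vgg = tf.keras.applications.VGG16(
        include_top=False, weights="imagenet", input_shape=input_shape
    )
    vgg.trainable = False

    # Skip connections from the last convolution of each VGG block (before pooling).
    skip_names = ["block1_conv2", "block2_conv2", "block3_conv3", "block4_conv3"]
    skips = [vgg.get_layer(name).output for name in skip_names]
    x = vgg.get_layer("block5_conv3").output  # 16x16x512 bottleneck for a 256x256 input

    # U-Net-style decoder: upsample, concatenate the matching skip, then two 3x3 convolutions.
    for skip, filters in zip(reversed(skips), [512, 256, 128, 64]):
        x = layers.Conv2DTranspose(filters, 2, strides=2, padding="same")(x)
        x = layers.Concatenate()([x, skip])
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)

    # Single-channel sigmoid mask: per-pixel probability of "building".
    mask = layers.Conv2D(1, 1, activation="sigmoid")(x)
    return Model(inputs=vgg.input, outputs=mask, name="vgg16_unet")

model = vgg16_unet()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

In practice such a model would be fitted on image/mask pairs, with the VGG layers optionally unfrozen later for fine-tuning, and metrics such as IoU tracked alongside pixel accuracy.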