
A New Scene Sensing Model Based on Multi-Source Data from Smartphones

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2024-10, Vol.24 (20), p.6669
Main Authors: Ding, Zhenke, Deng, Zhongliang, Hu, Enwen, Liu, Bingxun, Zhang, Zhichao, Ma, Mingyang
Format: Article
Language:English
Description: Smartphones with integrated sensors play an important role in people's lives, and in advanced multi-sensor fusion navigation systems the use of individual sensor information is crucial. Because environments differ, the sensors carry different weights, which in turn affects the method and results of multi-source fusion positioning. Using multi-source data from smartphone sensors, this study examines five types of information (Global Navigation Satellite System (GNSS), inertial measurement units (IMUs), cellular networks, optical sensors, and Wi-Fi), characterizes the temporal, spatial, and statistical features of the data, and constructs a multi-scale, multi-window, context-connected scene sensing model that accurately detects the environmental scene in indoor, semi-indoor, outdoor, and semi-outdoor spaces. Detecting the environmental scene in this way provides a positioning basis for multi-sensor fusion localization in a multi-sensor navigation system. The model comprises four main parts: multi-sensor data mining, a multi-scale convolutional neural network (CNN), a bidirectional long short-term memory (BiLSTM) network that incorporates contextual information, and a meta-heuristic optimization algorithm.
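The pipeline described in the abstract (multi-scale convolutions over windows of multi-source sensor features, pooled and classified into the four scene classes) can be sketched minimally as below. This is an illustrative sketch only, not the authors' implementation: the weights are untrained random placeholders, the BiLSTM context stage and the meta-heuristic optimizer are omitted, and the function names (`conv1d_valid`, `classify_scene`), kernel scales, and the 5-source window layout are all assumptions for demonstration.

```python
import numpy as np

# Four target scene classes named in the abstract.
SCENES = ["indoor", "semi-indoor", "outdoor", "semi-outdoor"]

def conv1d_valid(x, kernel):
    """Valid 1-D convolution over time: x is (T, F), kernel is (k, F)."""
    k = kernel.shape[0]
    return np.array([np.sum(x[t:t + k] * kernel)
                     for t in range(x.shape[0] - k + 1)])

def multi_scale_features(window, rng, scales=(3, 5, 9)):
    """One convolution per temporal scale, max-pooled to a scalar each.
    This mirrors the 'multi-scale, multi-window' idea in spirit only."""
    feats = []
    for k in scales:
        kernel = rng.standard_normal((k, window.shape[1]))  # placeholder weights
        feats.append(conv1d_valid(window, kernel).max())
    return np.array(feats)

def classify_scene(window, rng):
    """Softmax over a linear readout of the pooled multi-scale features."""
    f = multi_scale_features(window, rng)
    W = rng.standard_normal((len(SCENES), f.size))  # untrained placeholder
    logits = W @ f
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return SCENES[int(p.argmax())], p

# Example: a 40-timestep window with 5 feature channels, one per
# information source (GNSS, IMU, cellular, optical, Wi-Fi).
rng = np.random.default_rng(0)
window = rng.standard_normal((40, 5))
label, probs = classify_scene(window, rng)
```

In a real system the random kernels and readout would be replaced by trained CNN weights, the pooled features would feed a BiLSTM over consecutive windows to exploit temporal context, and a meta-heuristic search would tune the hyperparameters.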
DOI: 10.3390/s24206669
Published: Switzerland: MDPI AG, 2024-10-16
EISSN: 1424-8220
PMID: 39460149
ORCID: https://orcid.org/0009-0009-0540-5561
Rights: © 2024 by the authors. Licensee MDPI, Basel, Switzerland. Open access under the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
ISSN: 1424-8220
Source: PubMed (Medline); Publicly Available Content Database (ProQuest)
Subjects:
Accuracy
Algorithms
Altitude
Artificial satellites
Classification
CNN
Data mining
Datasets
Electronics in navigation
GNSS
Machine learning
Mathematical optimization
multi-source sensor
Neural networks
Satellites
scene classification
Sensors
Smart phones
Smartphones
Telecommunication systems
Wi-Fi