
Surface and underwater human pose recognition based on temporal 3D point cloud deep learning

Bibliographic Details
Published in: Scientific Reports, 2024-01, Vol. 14 (1), p. 55, Article 55
Main Authors: Wang, Haijian, Wu, Zhenyu, Zhao, Xuemei
Format: Article
Language:English
Description
Summary: Airborne surface and underwater human pose recognition is crucial for various safety and surveillance applications, including the detection of individuals in distress or drowning situations. However, airborne optical cameras struggle to image the surface and underwater simultaneously because of limitations imposed by visible-light wavelengths. To address this problem, this study proposes the use of light detection and ranging (LiDAR) to simultaneously detect humans on the surface and underwater, whereby human poses are recognized using a neural network designed for irregular data. First, a temporal point-cloud dataset was constructed for surface and underwater human pose recognition to enhance the recognition of comparable movements. Subsequently, radius outlier removal (ROR) and statistical outlier removal (SOR) were employed to alleviate the impact of noise and outliers in the constructed dataset. Finally, different combinations of secondary sampling methods and sample sizes were tested to improve recognition accuracy using PointNet++. The experimental results show that the highest recognition accuracy reached 97.5012%, demonstrating the effectiveness of the proposed human pose detection and recognition method.
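The ROR and SOR filters mentioned in the abstract are standard point-cloud denoising steps. The paper does not publish its filter parameters, so the following is only a minimal NumPy sketch of the two techniques under assumed radius, neighbor-count, and threshold values (libraries such as Open3D provide equivalent built-in filters):

```python
import numpy as np

def radius_outlier_removal(points, radius=0.1, min_neighbors=3):
    """ROR: keep points with at least `min_neighbors` other points within `radius`.

    `points` is an (N, 3) array. Pairwise distances are computed densely,
    which is fine for small clouds; a KD-tree scales better for large ones.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbor_counts = (d < radius).sum(axis=1) - 1  # exclude the point itself
    return points[neighbor_counts >= min_neighbors]

def statistical_outlier_removal(points, k=8, std_ratio=2.0):
    """SOR: drop points whose mean distance to their k nearest neighbors exceeds
    the global mean of that statistic by more than `std_ratio` standard deviations.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    knn = np.sort(d, axis=1)[:, 1:k + 1]  # column 0 is the self-distance (0)
    mean_d = knn.mean(axis=1)
    threshold = mean_d.mean() + std_ratio * mean_d.std()
    return points[mean_d <= threshold]
```

For example, a tight cluster of 50 points plus one point far away passes through either filter with the isolated point removed; parameter values (radius, k, std_ratio) would need tuning to the LiDAR noise characteristics described in the paper.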
ISSN: 2045-2322
DOI: 10.1038/s41598-023-50658-4