
Individual Tree Segmentation Quality Evaluation Using Deep Learning Models LiDAR Based

Bibliographic Details
Published in: Optical Memory & Neural Networks, 2023-12, Vol. 32 (Suppl 2), p. S270–S276
Main Authors: Grishin, I. A., Krutov, T. Y., Kanev, A. I., Terekhov, V. I.
Format: Article
Language:English
Description
Summary: The study of forest structure makes it possible to solve many important problems of forest inventory, and LiDAR scanning is currently one of the most widely used methods for obtaining information about a forest area. Calculating the structural parameters of plantations requires reliable segmentation of the initial data, yet segmentation quality can be difficult to assess over large forest areas. For this purpose, a system for evaluating the correctness and quality of segmentation was developed in this work using deep learning models. Segmentation was carried out on a forest plot with high planting density by clustering the point cloud layer by layer with the DBSCAN method, after preliminary detection of planting coordinates and partitioning of the plot with a Voronoi diagram. A correctness model was trained and tested on the extracted individual-tree point clouds using the PointNet++ and CurveNet neural networks, achieving accuracies of 89% and 88%, respectively. The resulting models are proposed for assessing the quality of clustering methods and for improving the quality of LiDAR segmentation of individual forest-plantation point clouds by detecting frequently occurring segmentation defects.
ISSN: 1060-992X; 1934-7898
DOI: 10.3103/S1060992X23060061
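
The summary above describes a layer-wise clustering pipeline: the cloud is sliced by height, each slice is clustered with DBSCAN, and clusters are attributed to previously detected planting coordinates, which effectively partitions the plot as a Voronoi diagram would. Below is a minimal Python sketch of that idea; the function name, layer height, DBSCAN parameters, and nearest-seed assignment are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN


def segment_trees(points: np.ndarray,
                  stem_xy: np.ndarray,
                  layer_height: float = 2.0,
                  eps: float = 0.5,
                  min_samples: int = 10) -> np.ndarray:
    """Assign each LiDAR point a tree id by slicing the cloud into height
    layers, clustering every layer with DBSCAN, and attributing each cluster
    to the nearest detected planting (stem) coordinate. The nearest-seed
    lookup approximates partitioning the plot with a Voronoi diagram.

    points  : (N, 3) array of x, y, z coordinates
    stem_xy : (M, 2) array of detected planting coordinates
    returns : (N,) array of tree ids, -1 for unassigned / noise points
    """
    labels = np.full(len(points), -1, dtype=int)
    seed_tree = cKDTree(stem_xy)  # Voronoi-style nearest-seed lookup

    z = points[:, 2]
    for z0 in np.arange(z.min(), z.max(), layer_height):
        layer_idx = np.where((z >= z0) & (z < z0 + layer_height))[0]
        if layer_idx.size < min_samples:
            continue

        # Cluster the horizontal footprint of the current height layer.
        layer_xy = points[layer_idx, :2]
        clustering = DBSCAN(eps=eps, min_samples=min_samples).fit(layer_xy)

        for cluster_id in set(clustering.labels_) - {-1}:
            member_idx = layer_idx[clustering.labels_ == cluster_id]
            centroid = points[member_idx, :2].mean(axis=0)
            # Assign the whole cluster to the tree whose stem is closest.
            _, tree_id = seed_tree.query(centroid)
            labels[member_idx] = tree_id

    return labels

The per-tree point clouds extracted this way are what the correctness model (PointNet++ or CurveNet in the paper) would then classify as correctly or incorrectly segmented; that classification step is not sketched here.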