
Advancing remote sensing: a unified deep learning approach with pretrained and custom architectures for high-precision classification

Bibliographic Details
Published in: Physica Scripta, 2024-11, Vol. 99 (11), p. 116012
Main Authors: N, Salma; G R, Madhuri; Jagadale, Basavaraj
Format: Article
Language: English
Description
Summary: Advancement of remote sensing is vital for accurate land cover mapping and ecological surveillance. This research proposes a deep learning framework that integrates a carefully designed custom network architecture with the effectiveness of pre-trained models, namely GoogLeNet, VGG16, and InceptionV3. Our approach captures the complex features of RGB satellite images across various land cover categories using the diverse EuroSAT dataset. By combining feature extraction from these pre-trained models with a customized deep learning network, our system achieves a test accuracy of 99.40%. Class-wise accuracies are high, ranging from 96.00% to 100.00%, while F1-score, precision, and recall all converge at 99.40%. These results demonstrate the approach's potential to significantly improve satellite image analysis and confirm its strong performance, suggesting it could reshape how satellite imagery is analysed. They also open the door to further improvements in the accuracy of remote sensing systems, offering a useful perspective for both researchers and practitioners.
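The abstract does not spell out implementation details, so the following is only a minimal Python/Keras sketch of the general pattern it describes: a frozen pre-trained backbone (InceptionV3 here; GoogLeNet or VGG16 would be analogous) used as a feature extractor, topped with a custom classification head and trained on EuroSAT RGB. The data pipeline, layer sizes, and training settings are assumptions for illustration, not the authors' published configuration.

```python
# Hedged sketch: pretrained backbone + custom head for EuroSAT RGB classification.
# Assumes TensorFlow/Keras and tensorflow_datasets are installed; hyperparameters
# below are illustrative, not the paper's reported settings.
import tensorflow as tf
import tensorflow_datasets as tfds

NUM_CLASSES = 10          # EuroSAT RGB has 10 land-cover classes
IMG_SIZE = (224, 224)     # upsample 64x64 EuroSAT tiles for the ImageNet backbone

def preprocess(example):
    image = tf.image.resize(example["image"], IMG_SIZE)
    image = tf.keras.applications.inception_v3.preprocess_input(image)
    return image, example["label"]

# EuroSAT RGB via TensorFlow Datasets; an 80/20 split of the single 'train' split
train_ds = (tfds.load("eurosat/rgb", split="train[:80%]")
            .map(preprocess).shuffle(1024).batch(32).prefetch(tf.data.AUTOTUNE))
val_ds = (tfds.load("eurosat/rgb", split="train[80%:]")
          .map(preprocess).batch(32).prefetch(tf.data.AUTOTUNE))

# Frozen pretrained feature extractor
backbone = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet",
    input_shape=IMG_SIZE + (3,), pooling="avg")
backbone.trainable = False

# Custom classification head on top of the extracted features
model = tf.keras.Sequential([
    backbone,
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)
```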
ISSN: 0031-8949
1402-4896
DOI: 10.1088/1402-4896/ad8491