Using Convolutional Neural Networks to Build a Lightweight Flood Height Prediction Model with Grad-Cam for the Selection of Key Grid Cells in Radar Echo Maps
Published in: Water (Basel), 2022-01, Vol. 14(2), p. 155
Main Authors: , , , ,
Format: Article
Language: English
Summary: Recent climate change has brought extremely heavy rains and wide-scale flooding to many areas around the globe. However, previous flood prediction methods usually require a great deal of computation to obtain results, imposing a heavy burden on the unit cost of prediction. This paper proposes the use of a deep learning model (DLM) to overcome these problems. We alleviated the high computational overhead of this approach by developing a novel framework for the construction of lightweight DLMs. The proposed scheme involves training a convolutional neural network (CNN) on radar echo maps in conjunction with historical flood records at target sites, then using Grad-CAM to extract key grid cells from these maps (representing the regions with the greatest impact on flooding) for use as inputs to another, lighter DLM. Finally, we used real radar echo maps and flood height records from five locations to verify the validity of the proposed method. The experimental results show that the proposed lightweight model achieves similar or even better prediction accuracy at all locations while requiring only about 5–15% of the computation time and about 30–35% of the memory of the full CNN.
ISSN: 2073-4441
DOI: 10.3390/w14020155
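
The summary describes a two-stage pipeline: a full CNN is trained on radar echo maps against historical flood records, Grad-CAM highlights the grid cells most relevant to the flood-height output, and only those cells feed a lightweight second model. The sketch below illustrates the Grad-CAM key-cell selection step only. It is a minimal PyTorch rendition of the standard Grad-CAM procedure, not the authors' released code; the class, layer, and `top_k_cells` names are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GradCAM:
    """Minimal Grad-CAM over a chosen convolutional layer (standard recipe,
    adapted here for a scalar regression output such as flood height)."""

    def __init__(self, model: nn.Module, target_layer: nn.Module):
        self.model = model
        self.activations = None  # feature maps captured on the forward pass
        self.gradients = None    # gradients of the output w.r.t. those maps
        target_layer.register_forward_hook(self._save_activation)
        target_layer.register_full_backward_hook(self._save_gradient)

    def _save_activation(self, module, inputs, output):
        self.activations = output.detach()

    def _save_gradient(self, module, grad_input, grad_output):
        self.gradients = grad_output[0].detach()

    def heatmap(self, x: torch.Tensor) -> torch.Tensor:
        """Return a normalized importance map the size of the input grid."""
        self.model.zero_grad()
        pred = self.model(x)    # predicted flood height, shape (N, 1)
        pred.sum().backward()   # populates self.gradients via the hook
        # Channel weights: global-average-pool the gradients spatially.
        weights = self.gradients.mean(dim=(2, 3), keepdim=True)
        cam = F.relu((weights * self.activations).sum(dim=1, keepdim=True))
        cam = F.interpolate(cam, size=x.shape[2:], mode="bilinear",
                            align_corners=False)
        return cam / (cam.amax() + 1e-8)

def top_k_cells(cam: torch.Tensor, k: int = 64) -> torch.Tensor:
    """Flat indices of the k radar-echo grid cells with the highest score;
    in the paper's framework, cells like these (rather than the full map)
    become the inputs of the lightweight model."""
    return torch.topk(cam.flatten(), k).indices
```

Feeding only the selected cells to the second model, instead of the full radar echo map, is what the summary credits for the reported reductions in computation time and memory.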