Deep learning‐based method for solving seepage equation under unsteady boundary
Published in: International Journal for Numerical Methods in Fluids, 2024-01, Vol. 96(1), pp. 87-101
Format: Article
Language: English
Summary: Deep learning‐based methods for solving partial differential equations have become a research hotspot. This approach builds on previous work applying deep learning to partial differential equations, which avoids the need for meshing and linearization. However, deep learning‐based methods have difficulty solving complex turbulent systems effectively without labeled data, and issues such as failure to converge and unstable solutions are frequently encountered. To address these issues, this paper presents an approximation‐correction model for solving the seepage equation with unsteady boundaries. The model consists of two neural networks. The first network acts as an asymptotic block, estimating the progression of the solution based on its asymptotic form. The second network fine‐tunes the errors left by the asymptotic block. The solution to the unsteady boundary problem is obtained by superimposing these progressive blocks. Numerical experiments consider both a constant‐flow scenario and a three‐stage flow scenario in reservoir exploitation. The results demonstrate the method's effectiveness when compared against reference numerical solutions, and the error analysis shows that the method achieves higher solution accuracy than other baseline methods.
We use a deep learning‐based method to solve the underground seepage problem without any labeled data. A novel approximation‐correction model is proposed, combining neural networks with the asymptotic solution of the partial differential equation to construct an asymptotic block. By superimposing asymptotic blocks, the model can solve problems with unsteady boundary conditions, which greatly enhances solution accuracy.
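The superposition idea in the abstract can be sketched as follows. This is an illustrative mock-up, not the authors' implementation: the asymptotic form, the stage parameters `(t0, q)`, the MLP sizes, and the names `asymptotic_block` and `ApproxCorrectModel` are all assumptions for exposition. Each boundary stage contributes one block, consisting of a hypothetical closed-form asymptotic term plus an (untrained) correction network, and the blocks are summed to form the pressure field:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_init(sizes):
    # Small random weights for a toy correction MLP (untrained, illustrative).
    return [(rng.normal(0.0, 0.1, (a, b)), np.zeros(b))
            for a, b in zip(sizes[:-1], sizes[1:])]

def mlp_forward(params, x):
    # Plain tanh MLP forward pass.
    h = x
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)
    W, b = params[-1]
    return h @ W + b

def asymptotic_block(x, t, t0, q):
    # Hypothetical asymptotic form: a pressure perturbation spreading
    # diffusively from the well after a rate change of size q at time t0.
    # The paper's actual asymptotic solution is not given in the abstract.
    tau = np.maximum(t - t0, 0.0)
    return -q * np.exp(-x**2 / (4.0 * (tau + 1e-8))) * (tau > 0)

class ApproxCorrectModel:
    """One (asymptotic + correction) block per boundary stage; blocks sum."""

    def __init__(self, stages):
        # stages: list of (stage start time t0, rate change q),
        # e.g. a three-stage flow schedule in reservoir exploitation.
        self.stages = stages
        self.nets = [mlp_init([2, 16, 16, 1]) for _ in stages]

    def pressure(self, x, t):
        p = np.zeros_like(x)
        for (t0, q), net in zip(self.stages, self.nets):
            xt = np.stack([x, t - t0], axis=-1)
            correction = mlp_forward(net, xt)[..., 0]
            # Superimpose this stage's progressive block.
            p = p + asymptotic_block(x, t, t0, q) + correction
        return p

# Evaluate a three-stage schedule on a 1D spatial grid at a fixed time.
model = ApproxCorrectModel([(0.0, 1.0), (0.5, -0.3), (1.0, 0.6)])
x = np.linspace(0.0, 1.0, 50)
t = np.full_like(x, 1.2)
p = model.pressure(x, t)
```

In the actual method the correction networks would be trained with a physics-informed residual loss on the seepage equation rather than left at random initialization; the sketch only shows how the block structure composes.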
ISSN: 0271-2091, 1097-0363
DOI: 10.1002/fld.5238