Abstract
Physics-informed neural networks (PINNs) are an emerging technology in the scientific computing domain. In contrast to purely data-driven methods, PINNs have been shown to approximate and generalize well across a wide range of partial differential equations (PDEs) by embedding the underlying physical laws that govern the PDE. PINNs, however, can struggle to model hyperbolic conservation laws that develop shocks, a classic example being the Buckley–Leverett problem for fluid flow in porous media. In this work, we explore specialized neural network architectures for modeling the Buckley–Leverett shock front. We present extensions of the standard multilayer perceptron (MLP) that are inspired by the attention mechanism. The attention-based model was compared to the multilayer perceptron model, and the results show that the attention-based architecture is more robust in solving the hyperbolic Buckley–Leverett problem, more data-efficient, and more accurate. Moreover, by utilizing distance functions, we can obtain truly data-free solutions to the Buckley–Leverett problem. In this approach, the initial and boundary conditions (I/BCs) are imposed in a hard manner, as opposed to a soft manner where labeled data are provided on the I/BCs. This allows us to use a substantially smaller neural network to approximate the solution to the PDE.
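To make the two ingredients described in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' code): a physics-informed residual loss for the Buckley–Leverett equation S_t + f(S)_x = 0 with an illustrative fractional-flow function, and hard imposition of the initial condition via a distance-like factor in t, so no labeled data are required there. The mobility ratio M = 2, the network size, the collocation sampling, and the omission of the inlet boundary condition (which would be enforced analogously with a distance function in x) are all assumptions made for illustration only.

```python
# Hypothetical sketch of a PINN for the Buckley-Leverett equation
#   S_t + f(S)_x = 0,   f(S) = S^2 / (S^2 + (1 - S)^2 / M),
# with the initial condition S(x, 0) = 0 imposed in a hard manner.
import torch
import torch.nn as nn

M = 2.0  # assumed end-point mobility ratio in the fractional-flow function

def fractional_flow(s):
    return s**2 / (s**2 + (1.0 - s)**2 / M)

# Small MLP surrogate; the attention-inspired extensions discussed in the
# paper would replace this plain feed-forward network.
net = nn.Sequential(
    nn.Linear(2, 20), nn.Tanh(),
    nn.Linear(20, 20), nn.Tanh(),
    nn.Linear(20, 1),
)

def saturation(x, t):
    # Hard constraint: the network term is multiplied by the "distance" t to the
    # initial surface, so S(x, 0) = 0 holds exactly without any labeled data.
    s0 = torch.zeros_like(x)
    return s0 + t * net(torch.cat([x, t], dim=1))

def pde_residual(x, t):
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    s = saturation(x, t)
    f = fractional_flow(s)
    s_t = torch.autograd.grad(s, t, torch.ones_like(s), create_graph=True)[0]
    f_x = torch.autograd.grad(f, x, torch.ones_like(f), create_graph=True)[0]
    return s_t + f_x  # conservative form of the Buckley-Leverett equation

# Train on random collocation points in (x, t) in [0, 1] x [0, 1].
# NOTE: the inlet condition S(0, t) = 1 must also be enforced (e.g., via an
# analogous distance function in x); it is omitted here for brevity, and
# without it the trivial solution S = 0 satisfies this residual loss.
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    x = torch.rand(1024, 1)
    t = torch.rand(1024, 1)
    loss = pde_residual(x, t).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the I/BCs are built into the ansatz rather than penalized in the loss, the optimizer only has to minimize the PDE residual, which is one reason a smaller network can suffice, as the abstract notes.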
| Original language | British English |
|---|---|
| Article number | 7864 |
| Journal | Energies |
| Volume | 15 |
| Issue number | 21 |
| DOIs | |
| State | Published - Nov 2022 |
Keywords
- fluid flow in porous media
- machine-learning
- PINNs
- transformers