Abstract
We developed a GPU-accelerated 2D physically based distributed rainfall runoff model for a PC environment. The governing equations were derived from the diffusive wave model for surface flow and the Horton infiltration model for rainfall loss. A numerical method for the diffusive wave equations was implemented based on a Godunov-type finite volume scheme, and the flux at each computational cell interface was reconstructed using the piecewise linear monotonic upwind scheme for conservation laws (MUSCL) with a van Leer total variation diminishing (TVD) slope limiter. Parallelization was implemented in CUDA Fortran on an NVIDIA GeForce GTX 1060 GPU. The proposed model was tested and verified against several 1D and 2D rainfall runoff processes over various topographies containing depressions. Simulated hydrographs, water depths, and velocities were compared to analytical solutions, dynamic wave modeling results, and measurement data. The diffusive wave model reproduced the runoff processes of impermeable basins with results close to the analytical solutions and to the numerical results of a dynamic wave model. For idealized permeable basins containing depressions such as furrows and ponds, physically reasonable rainfall runoff processes were obtained. Tests on a real basin with complex terrain showed reasonable agreement with the measured data. Parallel performance scaled efficiently as the number of grid cells increased, achieving a maximum speedup of approximately 150 times over a serial CPU version run on an Intel i7 4.7-GHz processor.
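For reference, the surface flow and infiltration models named above take the following standard textbook forms; the notation here is generic, and the paper's exact formulation, coefficients, and source terms may differ. The 2D diffusive wave approximation combines mass conservation with a momentum balance between the water-surface slope and the friction slope (written here with Manning's relation), and the Horton model gives an infiltration capacity decaying from an initial to a final rate:

$$\frac{\partial h}{\partial t} + \nabla\cdot\mathbf{q} = r - f, \qquad
\mathbf{q} = -\frac{h^{5/3}}{n}\,\frac{\nabla(h+z)}{\sqrt{\lvert\nabla(h+z)\rvert}}, \qquad
f(t) = f_c + (f_0 - f_c)\,e^{-kt},$$

where $h$ is the water depth, $z$ the bed elevation, $\mathbf{q}$ the unit-width discharge, $r$ the rainfall intensity, $n$ Manning's roughness coefficient, $f_0$ and $f_c$ the initial and final infiltration capacities, and $k$ the Horton decay constant. The van Leer limiter used in MUSCL-type reconstructions is commonly written as $\phi(r) = (r + \lvert r\rvert)/(1 + \lvert r\rvert)$, where $r$ is the ratio of consecutive solution gradients.

The sketch below illustrates, in CUDA C rather than the paper's CUDA Fortran, the one-thread-per-cell parallelization pattern such a model typically uses: each thread applies an explicit finite-volume update of the water depth from precomputed face fluxes, rainfall, and the Horton infiltration capacity. All names, the array layout, and the update form are illustrative assumptions, not the paper's implementation.

```cuda
// Hypothetical one-thread-per-cell depth update for an explicit finite-volume
// rainfall runoff model (illustrative sketch, not the paper's code).
#include <cmath>
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void update_depth(float *h, const float *flux_x, const float *flux_y,
                             float rain, float f_horton, float dt, float dx,
                             int nx, int ny)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // column index
    int j = blockIdx.y * blockDim.y + threadIdx.y;   // row index
    if (i >= nx || j >= ny) return;

    int c = j * nx + i;
    // Divergence of the unit-width discharge from east/west and north/south
    // face fluxes, stored on (nx+1) x ny and nx x (ny+1) face grids.
    float dqx = (flux_x[j * (nx + 1) + i + 1] - flux_x[j * (nx + 1) + i]) / dx;
    float dqy = (flux_y[(j + 1) * nx + i] - flux_y[j * nx + i]) / dx;
    // Source term: rainfall intensity minus Horton infiltration capacity [m/s].
    float source = rain - f_horton;
    // Explicit update; clamp so a cell never dries below zero depth.
    h[c] = fmaxf(h[c] + dt * (source - dqx - dqy), 0.0f);
}

int main()
{
    const int nx = 64, ny = 64;
    const float dx = 1.0f, dt = 0.1f;
    const float rain = 1.0e-5f;               // rainfall intensity [m/s]
    const float f0 = 8.0e-6f, fc = 2.0e-6f;   // Horton initial/final capacities [m/s]
    const float k = 1.0e-3f;                  // Horton decay constant [1/s]

    float *d_h, *d_fx, *d_fy;
    cudaMalloc(&d_h,  nx * ny * sizeof(float));
    cudaMalloc(&d_fx, (nx + 1) * ny * sizeof(float));
    cudaMalloc(&d_fy, nx * (ny + 1) * sizeof(float));
    cudaMemset(d_h,  0, nx * ny * sizeof(float));
    cudaMemset(d_fx, 0, (nx + 1) * ny * sizeof(float));   // zero fluxes: flat, ponding basin
    cudaMemset(d_fy, 0, nx * (ny + 1) * sizeof(float));

    dim3 block(16, 16);
    dim3 grid((nx + block.x - 1) / block.x, (ny + block.y - 1) / block.y);

    for (int n = 0; n < 1000; ++n) {
        float f_horton = fc + (f0 - fc) * expf(-k * (n * dt));   // Horton capacity at time t
        update_depth<<<grid, block>>>(d_h, d_fx, d_fy, rain, f_horton, dt, dx, nx, ny);
    }

    std::vector<float> h(nx * ny);
    cudaMemcpy(h.data(), d_h, nx * ny * sizeof(float), cudaMemcpyDeviceToHost);
    printf("depth at the centre cell after 100 s: %g m\n", h[(ny / 2) * nx + nx / 2]);

    cudaFree(d_h); cudaFree(d_fx); cudaFree(d_fy);
    return 0;
}
```

In a full model, the face fluxes would themselves be computed by a separate kernel applying the MUSCL reconstruction and limiter; keeping the reconstruction, flux, and update stages as distinct kernels is a common way to expose per-cell and per-face parallelism on the GPU.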
| Original language | English |
| --- | --- |
| Article number | 1447 |
| Journal | Water (Switzerland) |
| Volume | 11 |
| Issue number | 7 |
| DOIs | |
| State | Published - 1 Jul 2019 |
Keywords
- CUDA
- Diffusive wave model
- GPU
- Rainfall-runoff