Stingray-HPC: A Scalable Parallel Seismic Raytracing System
Format: Conference Proceeding
Language: English
Summary: The Stingray raytracer was developed for marine seismology to compute minimum travel time from all sources in an earth model to determine the 3D geophysical structure below the ocean floor. The original sequential implementation of Stingray used Dijkstra's single-source, shortest-path (SSSP) algorithm. A data parallel version of Stingray was developed based on the Bellman-Ford-Moore iterative SSSP algorithm. Single-node experiments demonstrated performance improvements from parallelization with multicore (using OpenMP) and manycore processors (using CUDA). Calculating seismic ray paths for larger earth models requires distributed, multi-node algorithms utilizing domain decomposition methods. Preliminary 2D decomposition strategies show promising scaling results. However, a general 3D decomposition methodology is needed to handle any seismic raytracing problem on any HPC computing platform. In this paper, we present Stingray-HPC, a framework for scalable seismic raytracing which can automatically decompose a 3D earth model across nodes in a distributed environment, allocate ghost cell regions for iterative updates, coordinate ghost cell communications, and test for global convergence. Stingray-HPC is implemented with MPI and either OpenMP or CUDA for node-level calculations. Our results validate Stingray-HPC's ability to handle large models (over a billion points) and to solve these models efficiently at scale up to 512 GPU nodes.
ISSN: 2377-5750
DOI: 10.1109/PDP2018.2018.00035
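
To make the approach in the summary concrete, below is a minimal sketch (not the authors' code) of the distributed iterative scheme it describes: a Bellman-Ford-Moore-style relaxation over a decomposed grid with ghost cell exchange and a global convergence test, written with MPI as the abstract indicates. It simplifies to a 1D slab decomposition with a uniform unit slowness; the names `N_GLOBAL` and `SLOWNESS` are hypothetical toy parameters, whereas Stingray-HPC itself decomposes a 3D earth model and uses OpenMP or CUDA for the node-level relaxation loop.

```c
/*
 * Minimal sketch, assuming a 1D slab decomposition and uniform slowness.
 * Illustrates the structure described in the abstract: ghost cell regions,
 * ghost cell communication, iterative relaxation, global convergence test.
 * Build with: mpicc sketch.c -lm
 */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <float.h>

#define N_GLOBAL 1024   /* total grid points (toy 1D "earth model") */
#define SLOWNESS 1.0    /* uniform slowness: travel time per cell step */

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    if (N_GLOBAL % size != 0) MPI_Abort(MPI_COMM_WORLD, 1); /* keep sketch simple */

    int n_local = N_GLOBAL / size;
    /* local slab plus one ghost cell on each side (indices 0 and n_local+1) */
    double *t = malloc((n_local + 2) * sizeof *t);
    for (int i = 0; i < n_local + 2; i++) t[i] = DBL_MAX / 2;  /* "infinity" */
    if (rank == 0) t[1] = 0.0;  /* single source at global point 0 */

    int left  = (rank > 0)        ? rank - 1 : MPI_PROC_NULL;
    int right = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    int global_changed = 1;
    while (global_changed) {
        /* ghost cell communication: edge values go to the neighbors' ghosts */
        MPI_Sendrecv(&t[1],           1, MPI_DOUBLE, left,  0,
                     &t[n_local + 1], 1, MPI_DOUBLE, right, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&t[n_local],     1, MPI_DOUBLE, right, 1,
                     &t[0],           1, MPI_DOUBLE, left,  1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* node-level relaxation (the part Stingray-HPC maps to OpenMP or
         * CUDA): each point takes the cheaper of its current time and a
         * neighbor's time plus the traversal cost */
        int changed = 0;
        for (int i = 1; i <= n_local; i++) {
            double cand = fmin(t[i - 1], t[i + 1]) + SLOWNESS;
            if (cand < t[i]) { t[i] = cand; changed = 1; }
        }
        /* global convergence test: iterate until no rank relaxed anything */
        MPI_Allreduce(&changed, &global_changed, 1, MPI_INT, MPI_LOR,
                      MPI_COMM_WORLD);
    }

    if (rank == size - 1)
        printf("travel time at farthest point: %g\n", t[n_local]);
    free(t);
    MPI_Finalize();
    return 0;
}
```

The sketch also shows why the abstract contrasts Bellman-Ford-Moore with Dijkstra: label-correcting relaxations are monotone, so each rank can iterate against possibly stale ghost values and still converge to the same fixed point, whereas Dijkstra's priority-queue ordering serializes the computation.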