CyberShake Study 16.8


CyberShake Study 16.8 is a computational study to calculate two CyberShake hazard models - one with a 1D velocity model and one with a 3D model - at 1 Hz in a new region, CyberShake Central California. We will use the GPU implementation of AWP-ODC-SGT, the Graves & Pitarka (2014) rupture variations with 200 m spacing and uniform hypocenters, and the UCERF2 ERF. Both the SGT and post-processing calculations will run on NCSA Blue Waters and OLCF Titan.

Status

Currently we are in the planning stages and hope to begin the study by the end of August 2016.

Science Goals

The science goals for Study 16.8 are:

  • Expand CyberShake to include Central California sites.
  • Create CyberShake models using both a Central California 1D velocity model and a 3D model (CCA-06).
  • Calculate hazard curves for PG&E pumping stations.

Technical Goals

The technical goals for Study 16.8 are:

  • Run end-to-end CyberShake workflows on Titan, including post-processing.
  • Show that the database migration improved database performance.

Sites

We will run a total of 428 sites as part of Study 16.8. A KML file of these sites, along with the Central and Southern California boxes, is available here (with names) or here (without names).

We created a Central California CyberShake box, defined here.

We have identified 397 sites which fall within this box and outside of the CyberShake Southern California box. They consist of:

  • 302 sites on a 10 km grid
  • 56 CISN broadband or PG&E stations, decimated so that stations are at least 5 km apart and no closer than 2 km to any other site (a sketch of this decimation appears after the site lists below).
  • 28 cities used by the USGS in locating earthquakes
  • 4 PG&E pumping stations
  • 6 historic Spanish missions
  • Diablo Canyon, already a CyberShake site (DBCN)

In addition, we will include 31 sites which overlap with the Southern California box, enabling direct comparison of results.

We will prioritize the pumping stations and the overlapping sites.
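The station decimation described above can be sanity-checked with a simple greedy pass. The sketch below is illustrative only, not the production site-selection method; the haversine distance and the keep-or-drop order are our assumptions.

  import math

  def distance_km(a, b):
      # Great-circle (haversine) distance between (lon, lat) pairs, in km.
      lon1, lat1, lon2, lat2 = map(math.radians, (a[0], a[1], b[0], b[1]))
      h = (math.sin((lat2 - lat1) / 2) ** 2
           + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
      return 2 * 6371.0 * math.asin(math.sqrt(h))

  def decimate(stations, other_sites, station_gap=5.0, site_gap=2.0):
      # Greedily keep a station only if it is at least 5 km from every
      # station already kept and at least 2 km from every other site.
      kept = []
      for s in stations:
          if (all(distance_km(s, k) >= station_gap for k in kept) and
              all(distance_km(s, o) >= site_gap for o in other_sites)):
              kept.append(s)
      return kept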

Velocity Models

We are planning to use 2 velocity models in Study 16.8:

  1. CCA-06, a 3D model created via tomographic inversion by En-Jui Lee. This model has a minimum Vs of 900 m/s and no geotechnical layer (GTL). Our order of preference when tiling the volume will be (see the sketch below):
    1. CCA-06
    2. CVM-S4.26
    3. SCEC background 1D model
  2. CCA-1D, a 1D model created by averaging CCA-06 throughout the Central California region.

We will run the 1D and 3D models concurrently.
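As context for the order of preference above, here is a minimal sketch of per-point model fallback. In production the tiling is handled by UCVM, so the model names and the has_coverage callback here are hypothetical.

  # Hypothetical per-point fallback following the preference order above.
  PREFERENCE = ["cca06", "cvms426", "scec_1d"]

  def pick_model(point, has_coverage):
      # has_coverage(model, point) -> True if the model defines material
      # properties at this (lon, lat, depth) point; assumed interface.
      for model in PREFERENCE:
          if has_coverage(model, point):
              return model
      return PREFERENCE[-1]  # the background 1D model covers everywhere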

Verification

As part of our verification work, we plan to first do runs using both the 1D and 3D models, with and without the Northern SAF events, for the following 3 sites in the overlapping region:

  • s001
  • OSI
  • s169

Once we are comfortable with those results, we will do runs with the 1D and 3D models for the following sites:

  • Bakersfield (-119.018711,35.373292), Wald Vs30 = 206
  • Santa Barbara (-119.698189,34.420831), Wald Vs30 = 332
  • Parkfield (-120.432800,35.899700), Wald Vs30 = 438

Performance Enhancements (over Study 15.4)

Codes

Computational and Data Estimates

Computational Time

Since we are using a minimum Vs of 900 m/s, we will use a grid spacing of 175 m and dt=0.00875 s in the SGT simulation (and dt=0.0875 s in the seismogram synthesis).

For computing these estimates, we are using a volume of 420 km x 1160 km x 50 km, or 2400 x 6630 x 286 grid points. This is about 4.5 billion grid points, approximately half the size of the Study 15.4 typical volume.
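These values follow from the usual CyberShake discretization rules. The worked check below assumes 5 grid points per minimum wavelength and dt = (grid spacing in km)/20; both ratios are consistent with the numbers above but are our assumptions, not statements from this page.

  # Grid spacing from the points-per-wavelength rule (assumed to be 5):
  vs_min, f_max = 900.0, 1.0          # m/s, Hz
  spacing = vs_min / (f_max * 5)      # = 180 m; rounded down to 175 m
  dx = 175.0

  # Timesteps: dt = dx(km)/20 for the SGTs, decimated 10x for synthesis.
  dt_sgt = (dx / 1000.0) / 20.0       # = 0.00875 s
  dt_seis = 10 * dt_sgt               # = 0.0875 s

  # Volume: 420 km x 1160 km x 50 km at 175 m spacing.
  nx, ny, nz = 2400, 6630, 286        # ~420e3/175, ~1160e3/175, ~50e3/175
  total_points = nx * ny * nz         # = 4,550,832,000, about 4.5 billion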

We estimate that we will run 75% of the sites from each model on Blue Waters and 25% on Titan, both because sites cost less on Blue Waters (on Titan we are charged for the GPUs even when we don't use them) and because we have more time available on Blue Waters. However, we will distribute work dynamically at runtime, so the final split may differ.

Titan

SGTs (GPU): 750 node-hrs per site x 252 sites = 189,000 node-hours.

Post-processing (CPU): 1400 node-hrs per site x 252 sites = 352,800 node-hours.

Total: 20.3M SUs ((189,000 + 352,800) x 30 SUs/node-hr + 25% margin)

We have 36M SUs available on Titan.

Blue Waters

Pre-processing (CPU): 100 node-hrs/site x 754 sites = 75,400 node-hours.

SGTs (GPU): 750 node-hrs per site x 754 sites = 565,500 node-hours.

Post-processing (CPU): 700 node-hrs per site x 754 sites = 527,800 node-hours.

Total: 1.46M node-hrs ((75,400 + 565,500 + 527,800) + 25% margin)

We have 3.29M node-hrs available on Blue Waters.
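The budget arithmetic above can be reproduced directly from the per-site figures (all numbers taken from this page):

  MARGIN = 1.25              # 25% margin
  SU_PER_NODE_HR = 30        # Titan charging rate

  # Titan: 252 sites, GPU SGTs + CPU post-processing.
  titan_node_hrs = (750 + 1400) * 252                    # = 541,800
  titan_sus = titan_node_hrs * SU_PER_NODE_HR * MARGIN   # ~20.3M of 36M SUs

  # Blue Waters: 754 sites, pre-processing + SGTs + post-processing.
  bw_node_hrs = (100 + 750 + 700) * 754                  # = 1,168,700
  bw_total = bw_node_hrs * MARGIN                        # ~1.46M of 3.29M node-hrs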

Storage Requirements

We plan to calculate geometric mean, RotD values, and duration metrics for all seismograms. We will use Pegasus's cleanup capabilities to avoid exceeding quotas.
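For reference, the two horizontal-component measures named above reduce to short computations once the oscillator responses are in hand. This sketch is illustrative, not the CyberShake post-processing code; it assumes the two horizontal oscillator-response time series have already been computed.

  import numpy as np

  def geometric_mean(sa_x, sa_y):
      # Geometric mean of the two horizontal spectral accelerations.
      return np.sqrt(sa_x * sa_y)

  def rotd(resp_x, resp_y, percentile=50):
      # RotD: rotate the horizontal oscillator responses through 180
      # degrees, take the peak response per angle, then the given
      # percentile over all angles (50 -> RotD50, 100 -> RotD100).
      angles = np.radians(np.arange(180))
      peaks = [np.max(np.abs(resp_x * np.cos(a) + resp_y * np.sin(a)))
               for a in angles]
      return np.percentile(peaks, percentile)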

Titan

Purged space to store intermediate data products: (900 GB SGTs + 60 GB mesh + 900 GB reformatted SGTs)/site x 252 sites = 458 TB

Purged space to store output data: (15 GB seismograms + 0.2 GB PSA + 0.2 GB RotD + 0.2 GB duration) x 252 sites = 3.8 TB

Blue Waters

Purged space to store intermediate data products: (900 GB SGTs + 60 GB mesh + 900 GB reformatted SGTs)/site x 754 sites = 1370 TB

Purged space to store output data: (15 GB seismograms + 0.2 GB PSA + 0.2 GB RotD + 0.2 GB duration) x 754 sites = 11.5 TB
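The purged-space figures for both machines come from the same per-site sizes (values from this page; TB conversion uses 1024 GB/TB):

  intermediate_gb = 900 + 60 + 900     # SGTs + mesh + reformatted SGTs, per site
  output_gb = 15 + 0.2 + 0.2 + 0.2     # seismograms + PSA + RotD + duration

  for machine, sites in (("Titan", 252), ("Blue Waters", 754)):
      print(machine,
            round(intermediate_gb * sites / 1024),    # 458 TB, 1370 TB
            round(output_gb * sites / 1024, 1))       # 3.8 TB, 11.5 TB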

SCEC

Archival disk usage: 15 TB seismograms + 0.1 TB PSA files + 0.1 TB RotD files + 0.1 TB duration files on scec-02 (109 TB free), plus 24 GB of curves, disaggregations, reports, etc. on scec-00 (109 TB free).

Database usage: (5 rows PSA + 7 rows RotD + 9 rows duration)/rupture variation x 500K rupture variations/site x 1006 sites = 10.5 billion rows x 125 bytes/row = 1.3 TB (3.9 TB free on moment.usc.edu disk)
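The row estimate works out as follows (values from this page; 1 TB taken as 10^12 bytes):

  rows_per_rv = 5 + 7 + 9                  # PSA + RotD + duration rows
  rows = rows_per_rv * 500_000 * 1006      # = 10,563,000,000 (~10.5 billion)
  size_tb = rows * 125 / 1e12              # ~1.3 TB at 125 bytes/row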

Temporary disk usage: 1 TB workflow logs. scec-02 has 109 TB free.

Production Checklist

  • Check CISN stations, with preference for broadband and PG&E stations.
  • Add DBCN to station list.
  • Complete the database migration outlined in 2016_CyberShake_database_migration.
  • Install 1D model in UCVM 15.10.0 on Blue Waters and Titan.
  • Decide on 3D velocity model to use.
  • Upgrade Condor on shock to v8.4.8.
  • Get the Pegasus Dashboard up and running.
  • Generate test hazard curves for 4 sites, including an overlapping site between Blue Waters and Titan.
  • Confirm test results with science group.
  • Determine the CyberShake volume for corner points in the Central California region, and decide whether we need to modify the 200 km cutoff.
  • Modify the submit job on shock to distribute end-to-end workflows between Blue Waters and Titan.
  • Add new velocity models into CyberShake database.
  • Create an XML file describing the study for the web monitoring tool.