CyberShake Study 15.4



CyberShake Study 15.4 is a computational study to calculate one physics-based probabilistic seismic hazard model for Southern California at 1 Hz, using CVM-S4.26, the GPU implementation of AWP-ODC-SGT, the Graves and Pitarka (2015) rupture variations with uniform hypocenters, and the UCERF2 ERF. The SGT calculations will be split between NCSA Blue Waters and OLCF Titan, and the post-processing will be done entirely on Blue Waters. The goal is to calculate the standard Southern California site list (286 sites) used in previous CyberShake studies, so we can produce comparison curves and maps, and data products for the UGMS Committee.

Preparation for production runs

  1. Check list of Mayssa's concerns
  2. Update DAX to support separate MD5 sums
  3. Add MD5 sum job to TC
  4. Evaluate topology-aware scheduling
  5. Get DirectSynth working at full run scale, verify results
  6. Modify workflow to have md5sums be in parallel
  7. Test of 1 Hz simulation with 2 Hz source - 2/27
  8. Add a third pilot job type to Titan pilots - 2/27
  9. Run test of full 1 Hz SGT workflow on Blue Waters - 3/4
  10. Add cleanup to workflow and test - 3/4
  11. Test interface between Titan workflows and Blue Waters workflows - 3/4
  12. Add capability to have files on Blue Waters correctly striped - 3/6
  13. Add restart capability to DirectSynth - 3/6
  14. File ticket for extended walltime for small jobs on Titan - 3/6
  15. Add DirectSynth to workflow tools - 3/6
  16. Implement and test parallel version of reformat_awp - 3/6
  17. Simulate curves for 3 sites with final configuration; compare curves and seismograms - 3/11
  18. File ticket for 90-day purged space at Blue Waters - 3/13
  19. File ticket for reservation at Blue Waters, along with justification - 3/13
  20. Follow up on high priority jobs at Titan - 3/13
  21. Create study description file for Run Manager - 3/13
  22. Set up usage monitoring on Blue Waters and Titan
  23. Add ability to determine if SGTs are being run on Blue Waters or Titan
  24. Modify auto-submit system to distinguish between full runs and PP runs
  25. Science readiness review - 3/18
  26. Technical readiness review - 3/18

Computational Status

We are hoping to begin this study in March 2015.

Data Products

Goals

Science Goals

  1. Calculate a 1 Hz hazard map of Southern California.
  2. Produce a contour map at 1 Hz for the UGMS committee.
  3. Compare the hazard maps at 0.5 Hz and 1 Hz.

Technical Goals

  1. Show that Titan can be integrated into our CyberShake workflows.
  2. Demonstrate scalability for 1 Hz calculations.
  3. Show that we can split the SGT calculations across sites.

Verification

DirectSynth

A comparison of 1 Hz results computed with SeisPSA against results computed with DirectSynth for WNGC. SeisPSA results are plotted in magenta, DirectSynth results in black; the curves are so close that the magenta is difficult to make out.

2s: WNGC SeisPSA v DirectSynth 2s.png
3s: WNGC SeisPSA v DirectSynth 3s.png
5s: WNGC SeisPSA v DirectSynth 5s.png
10s: WNGC SeisPSA v DirectSynth 10s.png

2 Hz source

Before beginning Study 15.4, we wanted to investigate our source filtering parameters, to see whether it was possible to improve the accuracy of hazard curves at frequencies closer to the CyberShake study frequency.

In describing our results, we will refer to the "simulation frequency" and the "source frequency". The simulation frequency refers to the choice of mesh spacing and dt. The source frequency is the frequency at which the impulse used in the SGT simulation was low-pass filtered (using a 4th-order Butterworth filter).
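
As a concrete illustration, here is a minimal Python sketch of this kind of source filtering, using scipy; the function and parameter names are illustrative, not the actual CyberShake source-preparation code.

  import numpy as np
  from scipy.signal import butter, lfilter

  def filtered_impulse(source_freq_hz, dt, nt):
      """Delta-function source, low-pass filtered at source_freq_hz."""
      impulse = np.zeros(nt)
      impulse[0] = 1.0 / dt  # unit-area impulse on this time grid
      # 4th-order Butterworth low-pass at the chosen source frequency
      b, a = butter(4, source_freq_hz, btype='low', fs=1.0 / dt)
      return lfilter(b, a, impulse)  # causal (single-pass) filtering

  # e.g., a 1.0 Hz source filter for a run with dt = 0.01 s:
  src = filtered_impulse(1.0, dt=0.01, nt=4000)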

All of these calculations were done for WNGC, ERF 36, with uniform ruptures and the AWP-ODC-GPU SGT code.

First, we performed a run with a 0.5 Hz simulation frequency and a 1.0 Hz source frequency, and compared it to the runs we had been doing in the past, which are 0.5 Hz simulation / 0.5 Hz source. The 1.0 Hz source frequency has an impact on the hazard curves, even at 3 and 5 seconds.

File:0.5Hz source comparison 3s.png
File:0.5Hz source comparison 5s.png
File:0.5Hz source comparison 10s.png

From spectral plots of the seismograms with the largest 3-second PSA values, we can see that the PseudoAA response is affected even at periods much longer than that of the filter frequency:

File:0.5Hz source comparison respect.png

Next, we repeated the same experiment for a 1.0 Hz simulation frequency with 1.0 Hz and 2.0 Hz source frequencies. Similarly, using the higher source frequency has an impact both on the hazard curves and on the PseudoAA response:

File:1Hz source comparison 2s.png
File:1Hz source comparison 3s.png
File:1Hz source comparison 5s.png
File:1Hz source comparison 10s.png

File:1.0Hz source comparison respect.png

It seems that by increasing the frequency of the source filtering, we are able to get PSA values which are more accurate at frequencies closer to the simulation frequency.

This does have small impacts on the seismograms; for example, here are plots of two of the largest seismograms for WNGC with a 1 Hz and a 2 Hz source filter. The seismograms generated with the 2 Hz source filter have sharper peaks, a result of their higher-frequency content, but that content should not be trusted, as the mesh spacing and dt of the simulation do not support accuracy above 1 Hz:

File:1.0Hz seismogram comparison.png

So for applications of the seismograms which are not restricted to a frequency band, the seismograms should be low-pass filtered at the simulation frequency before use.
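
A minimal sketch of such filtering, assuming a zero-phase 4th-order Butterworth low-pass at the simulation frequency (the study's exact post-processing filter is not specified here):

  import numpy as np
  from scipy.signal import butter, filtfilt

  def bandlimit(seismogram, dt, sim_freq_hz):
      """Low-pass a synthesized seismogram at the simulation frequency."""
      b, a = butter(4, sim_freq_hz, btype='low', fs=1.0 / dt)
      # zero-phase (forward-backward) filtering preserves peak timing
      return filtfilt(b, a, seismogram)

  # e.g., restrict a 2 Hz-source seismogram to the trustworthy <1 Hz band:
  seis = np.random.randn(8000)  # stand-in for a synthesized seismogram
  seis_1hz = bandlimit(seis, dt=0.01, sim_freq_hz=1.0)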

Blue Waters vs Titan for SGT calculation

Sites

We are proposing to run 286 sites around Southern California. Those sites include 46 points of interest, 27 precarious rock sites, 23 broadband station locations, 43 20 km gridded sites, and 147 10 km gridded sites. All of them fall within the Southern California box except for Diablo Canyon and Pioneer Town. You can get a CSV file listing the sites here. A KML file listing the sites is available here.

Fig 1: Sites selected for Study 15.4. Purple are gridded sites, red are precarious rocks, orange are SCSN stations, and yellow are sites of interest.

Performance Enhancements (over Study 14.2)

Responses to Study 14.2 Lessons Learned

  • AWP_ODC_GPU code, under certain situations, produced incorrect filenames.

This was fixed during the Study 14.2 run.

  • Incorrect dependency in DAX generator - NanCheckY was a child of AWP_SGTx.

This was fixed during the Study 14.2 run.

  • Try out Pegasus cleanup - accidentally blew away running directory using find, and later accidentally deleted about 400 sets of SGTs.

We have added cleanup to the SGT workflow, since that's where most of the extra data is generated, especially with two copies of the SGTs (the ones generated by AWP-ODC-GPU, and then the reformatted ones).

  • 50 connections per IP is too many for the hpc-login2 GridFTP server; it brings the server down. Try using a dedicated server next time, with more aggregated files.

We have moved our USC gridftp transfer endpoint to hpc-scec.usc.edu, which does very little other than GridFTP transfers.

SGT codes

  • We have moved to a parallel version of reformat_awp. With this parallel version, we can reduce the runtime by 65%.

PP codes

  • We have switched from using extract_sgt for the SGT extraction and SeisPSA for the seismogram synthesis to DirectSynth, a code which reads in the SGTs across multiple cores and then uses MPI to send them directly to workers, which perform the seismogram synthesis. We anticipate this code will give us an efficiency improvement of at least 50% over the old approach, since it does not require the writing and reading of the extracted SGT files.
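
As a rough illustration of this reader/worker design, here is a hypothetical mpi4py sketch with dummy data standing in for the SGTs; the rank layout, tags, and sizes are all assumptions, not the actual DirectSynth implementation. Run with, e.g., mpiexec -n 8.

  import numpy as np
  from mpi4py import MPI

  comm = MPI.COMM_WORLD
  rank, size = comm.Get_rank(), comm.Get_size()
  N_READERS = 2                      # assumption: a few ranks hold the SGTs
  REQ, DATA, DONE = 1, 2, 3          # message tags

  if rank < N_READERS:
      # SGT handler: hold a slice of the SGTs in memory (dummy array here)
      # and serve point subsets to synthesis workers on request.
      sgt_slice = np.random.rand(1000, 6)
      finished = 0
      while finished < size - N_READERS:
          status = MPI.Status()
          msg = comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
          if status.Get_tag() == DONE:
              finished += 1
          else:
              # ship the requested SGT rows straight to the worker over MPI;
              # no extracted-SGT files are written to or read from disk
              comm.send(sgt_slice[msg], dest=status.Get_source(), tag=DATA)
  else:
      # Synthesis worker: request the SGT points needed for each rupture
      # variation, then do the (dummy) seismogram synthesis locally.
      for variation in range(10):
          reader = variation % N_READERS
          comm.send([0, 1, 2], dest=reader, tag=REQ)  # point indices needed
          points = comm.recv(source=reader, tag=DATA)
          seismogram = points.sum(axis=0)             # stand-in for synthesis
      for reader in range(N_READERS):
          comm.send(None, dest=reader, tag=DONE)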

Workflow management

  • We are using a pilot job daemon on Titan to monitor the shock queue and submit pilot jobs to Titan accordingly.
  • The MD5sums calculated on the SGTs at the start of the post-processing now run in parallel with the actual post-processing calculations. If the MD5 sum job fails, the entire workflow will be aborted, but since that is rare, the majority of the time the rest of the post-processing workflow can continue without having the MD5 sums in the critical path.
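
For illustration, a minimal Python sketch of the kind of chunked MD5 check run on the SGT files (the file path, chunk size, and variable names are illustrative):

  import hashlib

  def md5sum(path, chunk_bytes=64 * 1024 * 1024):
      """Hex MD5 digest of a possibly very large file, read in chunks."""
      digest = hashlib.md5()
      with open(path, 'rb') as f:
          for chunk in iter(lambda: f.read(chunk_bytes), b''):
              digest.update(chunk)
      return digest.hexdigest()

  # e.g., compare against the sum recorded when the SGTs were generated:
  # assert md5sum('awp_sgt_x.sgt') == recorded_md5  # hypothetical names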

Codes

Lessons Learned

Computational and Data Estimates

Computational Time

Titan

SGTs (GPU): 1800 node-hrs/site x 143 sites = 258K node-hours = 7.7M SUs

Add 25% margin: 9.6M SUs

Blue Waters

SGTs (GPU): 1300 node-hrs/site x 143 sites = 186K node-hours (3.0M SUs), XK nodes

PP: 2500 node-hrs/site x 286 sites = 715K node-hours (22.9M SUs), XE nodes

Add 25% margin: 1.1M node-hours
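
As a cross-check on the arithmetic above, a short Python sketch; the SU-per-node-hour factors (30 on Titan, 16 on XK nodes, 32 on XE nodes) are inferred from the totals above rather than quoted from the centers' allocation policies, and the text rounds intermediate values.

  SITES = 286
  SGT_SITES = 143                    # SGT work split between the two machines

  titan_nh = 1800 * SGT_SITES        # 257,400 node-hours
  titan_sus = titan_nh * 30          # ~7.7M SUs; x1.25 margin ~9.6M
  print(f"Titan: {titan_nh:,} node-hrs, {titan_sus / 1e6:.2f}M SUs, "
        f"{titan_sus * 1.25 / 1e6:.2f}M SUs with 25% margin")

  bw_sgt_nh = 1300 * SGT_SITES       # 185,900 XK node-hours (~3.0M SUs)
  bw_pp_nh = 2500 * SITES            # 715,000 XE node-hours (~22.9M SUs)
  print(f"Blue Waters: {bw_sgt_nh * 16 / 1e6:.2f}M + {bw_pp_nh * 32 / 1e6:.2f}M SUs, "
        f"{(bw_sgt_nh + bw_pp_nh) * 1.25 / 1e6:.2f}M node-hrs with margin")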

Storage Requirements

Titan

Purged space to store SGTs while generating: (1.5 TB SGTs + 120 GB mesh + 1.5 TB reformatted SGTs)/site x 143 sites = 446 TB

Blue Waters

Space to store SGTs (delayed purge): 1.5 TB/site x 286 sites = 429 TB

Purged disk usage: (1.5 TB SGTs + 120 GB mesh + 1.5 TB reformatted SGTs)/site x 143 sites + (27 GB/site seismograms + 0.2 GB/site PSA + 0.2 GB/site RotD) x 286 sites = 453 TB
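
Similarly, a quick Python cross-check of the disk figures (per-site sizes taken from the estimates above; small differences from the quoted totals are rounding):

  # per-site sizes, in TB, from the text
  sgt, mesh, reformatted = 1.5, 0.12, 1.5
  per_sgt_site = sgt + mesh + reformatted              # 3.12 TB per SGT site
  print(f"Titan purged: {per_sgt_site * 143:.0f} TB")  # ~446 TB

  pp_per_site = (27 + 0.2 + 0.2) / 1000                # seismograms+PSA+RotD, GB -> TB
  print(f"Blue Waters delayed-purge SGTs: {sgt * 286:.0f} TB")                   # ~429 TB
  print(f"Blue Waters purged: {per_sgt_site * 143 + pp_per_site * 286:.0f} TB")  # ~453 TB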

SCEC

Archival disk usage: 7.5 TB seismograms + 0.1 TB PSA files + 0.1 TB RotD files on scec-04 (has 171 TB free) & 24 GB curves, disaggregations, reports, etc. on scec-00 (171 TB free)

Database usage: (5 rows PSA + 7 rows RotD)/rupture variation x 450K rupture variations/site x 286 sites = 1.5 billion rows x 151 bytes/row = 227 GB (4.3 TB free on focal.usc.edu disk)

Temporary disk usage: 515 GB workflow logs. scec-02 has 171 TB free.

Performance Metrics

Presentations and Papers