CyberShake Computational Estimates

From SCECpedia
 

Revision as of 23:10, 3 April 2012

We will describe our current best estimates for the CyberShake computational and data requirements as we progress in our simulation planning and testing. These estimates will help us identify which aspects of the CyberShake computational system need to be optimized to work within our time and resource constraints.

All these estimates are for UCERF 3, assuming that the number of ruptures increases from 15,000 to 200,000 while the number of rupture variations per rupture remains the same.
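As a rough sanity check on that assumption, the rupture-count growth factor works out to roughly 13x (a sketch only; the 15,000 and 200,000 counts are the figures quoted above):

```python
# Assumed UCERF2 -> UCERF3 growth in rupture count (figures from this page).
ucerf2_ruptures = 15_000
ucerf3_ruptures = 200_000

growth = ucerf3_ruptures / ucerf2_ruptures
print(f"Rupture count grows by about {growth:.1f}x")  # about 13.3x
```

Since the number of rupture variations per rupture is held fixed, the total number of rupture variations per site scales by about the same factor.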

== 0.5 Hz (per site) ==

Number of rupture variations: 5.5 million

=== Deterministic ===

Number of jobs: 5.6 million

Storage: 40 GB SGTs, 125 GB seismograms

SUs: 12k SGTs + 26k post-processing = 38k

=== Broadband ===

Number of jobs: 16.6 million

Storage: 40 GB SGTs, 500 GB seismograms

SUs: 12k SGTs + 52k post-processing = 64k

== 1 Hz (per site) ==

Number of rupture variations: 5.5 million

=== Deterministic ===

Number of jobs: 5.6 million

Storage: 320 GB SGTs, 250 GB seismograms

SUs: 80k SGTs + 150k post-processing = 230k

=== Broadband ===

Number of jobs: 16.6 million

Storage: 320 GB SGTs, 2 TB seismograms

SUs: 80k SGTs + 200k post-processing = 280k
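The campaign totals in the sections below are, to within rounding, the per-site figures above multiplied by the number of sites, with 1 TB = 1024 GB for storage. A small sketch of that scaling (the function name is illustrative, not part of any CyberShake code), using the 0.5 Hz deterministic per-site figures and the 200-site Southern California count:

```python
# Scale per-site CyberShake estimates up to a multi-site campaign.
# Per-site inputs are taken from the tables above.

def campaign_totals(sites, jobs_per_site, sgt_gb, seis_gb, sus_per_site):
    """Return (total jobs, SGT TB, seismogram TB, total SUs)."""
    return (
        sites * jobs_per_site,
        sites * sgt_gb / 1024,    # GB -> TB (binary units)
        sites * seis_gb / 1024,
        sites * sus_per_site,
    )

# Southern California, 0.5 Hz deterministic: 200 sites at
# 5.6M jobs, 40 GB SGTs, 125 GB seismograms, 38k SUs per site.
jobs, sgt_tb, seis_tb, sus = campaign_totals(200, 5.6e6, 40, 125, 38_000)
print(f"{jobs:.2e} jobs, {sgt_tb:.1f} TB SGTs, "
      f"{seis_tb:.1f} TB seismograms, {sus / 1e6:.1f}M SUs")
```

This reproduces the 1.1 billion jobs, 7.8 TB of SGTs, 24.4 TB of seismograms, and 7.6 million SUs quoted for the Southern California 0.5 Hz deterministic run.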

== Southern California simulations ==

200 sites

=== 0.5 Hz, deterministic ===

Number of jobs: 1.1 billion

Storage: 7.8 TB SGTs, 24.4 TB seismograms

SUs: 7.6 million

=== 0.5 Hz, broadband ===

Number of jobs: 3.3 billion

Storage: 7.8 TB SGTs, 97 TB seismograms

SUs: 12.8 million

=== 1 Hz, deterministic ===

Number of jobs: 1.1 billion

Storage: 62.5 TB SGTs, 48.8 TB seismograms

SUs: 46 million

=== 1 Hz, broadband ===

Number of jobs: 3.3 billion

Storage: 62.5 TB SGTs, 400 TB seismograms

SUs: 56 million

== Statewide simulations ==

1400 sites

=== 0.5 Hz, deterministic ===

Number of jobs: 7.8 billion

Storage: 54.7 TB SGTs, 171 TB seismograms

SUs: 53 million

=== 0.5 Hz, broadband ===

Number of jobs: 23.2 billion

Storage: 54.7 TB SGTs, 683 TB seismograms

SUs: 89.6 million

=== 1 Hz, deterministic ===

Number of jobs: 7.8 billion

Storage: 437.5 TB SGTs, 341.8 TB seismograms

SUs: 322 million

=== 1 Hz, broadband ===

Number of jobs: 23.2 billion

Storage: 437.5 TB SGTs, 2800 TB seismograms

SUs: 392 million
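The statewide SU totals are likewise consistent with 1400 sites times the per-site SU figures (SGT generation plus post-processing); a quick check:

```python
sites = 1400  # statewide site count from this page

# Per-site SUs: SGT generation + post-processing (per-site tables above).
per_site_sus = {
    "0.5 Hz deterministic": 12_000 + 26_000,   # 38k
    "0.5 Hz broadband":     12_000 + 52_000,   # 64k
    "1 Hz deterministic":   80_000 + 150_000,  # 230k
    "1 Hz broadband":       80_000 + 200_000,  # 280k
}

for scenario, sus in per_site_sus.items():
    print(f"{scenario}: {sites * sus / 1e6:.1f}M SUs")
```

This prints 53.2M, 89.6M, 322.0M, and 392.0M SUs, matching (to rounding) the 53, 89.6, 322, and 392 million SU statewide totals above.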