

<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://strike.scec.org/scecwiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Kmilner</id>
	<title>SCECpedia - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://strike.scec.org/scecwiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Kmilner"/>
	<link rel="alternate" type="text/html" href="https://strike.scec.org/scecpedia/Special:Contributions/Kmilner"/>
	<updated>2026-04-23T19:12:29Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.34.2</generator>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26316</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26316"/>
		<updated>2021-12-08T18:37:31Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Production Checklist */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs on Summit, then post-processing them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity&lt;br /&gt;
*Compare spatial distribution of ground motions (including  directivity) to empirical and kinematic models&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page], and the catalog and input fault geometry files can be [https://zenodo.org/record/5542222 downloaded from Zenodo]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
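As a quick sketch (illustrative only, not CyberShake code), the equal-probability assumption above can be translated into annual rates using the catalog parameters listed:&lt;br /&gt;

```python
# Back-of-the-envelope rates implied by the ERF parameters above.
CATALOG_YEARS = 715_000   # catalog length used for the 1/715k probability
NUM_EVENTS = 220_927      # M6.5+ earthquakes in the catalog

per_event_rate = 1.0 / CATALOG_YEARS        # each event: 1/715k per year
total_rate = NUM_EVENTS * per_event_rate    # region-wide M6.5+ rate

print(f"per-event annual rate: {per_event_rate:.3e}")
print(f"total M6.5+ rate: {total_rate:.3f}/yr "
      f"(~{1 / total_rate:.2f}-yr recurrence)")
```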
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run a list of 335 sites, taken from the site list that was used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs, select 5km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
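The sampling rule above can be sketched as follows; `query_depths` is an illustrative helper, not part of UCVM, and the actual velocity-model query is omitted:&lt;br /&gt;

```python
def query_depths(h, n_levels):
    """Depths (m) at which to query the velocity model for each mesh level.

    Level 0 (the surface grid point) is sampled at h/4 so its properties
    represent the [0, h/2] layer; deeper levels are sampled at their
    nominal depths k*h.  For this study h = 100 m, so the surface point
    is queried at 25 m.
    """
    return [h / 4.0 if k == 0 else k * h for k in range(n_levels)]

print(query_depths(100.0, 4))  # [25.0, 100.0, 200.0, 300.0]
```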
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
=== USC Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Hazard curve comparisons for site USC, between:&lt;br /&gt;
&lt;br /&gt;
*ERF 58: 0.5 Hz RSQSim ERF used for Milner et al. (2021) calculations, May 2020&lt;br /&gt;
*ERF 61: 0.5 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
*ERF 62: 1 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
&lt;br /&gt;
The first test run of ERF 61 used the wrong (older) mesh lower depth of 40 km, which is why the top right plot differs slightly from the top left. The middle left plot agrees perfectly with the top left.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 58, Original'''&lt;br /&gt;
| '''ERF 61 w/ wrong depth'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF58.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF61_PREV.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61 w/ corrected depth'''&lt;br /&gt;
| '''ERF 62 (1 Hz) with old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 62 (1 Hz) with proposed (new) simulation parameters'''&lt;br /&gt;
| '''ERF 62 scatter comparing new parameters (y) and old parameters (x)'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ASK2014.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_62prev_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The first 1 Hz test run (bottom right above) uses the same simulation parameters as used with ERF 58 and 61, changing only the frequency. Some differences are apparent, which also persist for longer-period (5s) curves:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61, 5s Sa, 0.5 Hz, old simulation parameters'''&lt;br /&gt;
| '''ERF 62, 5s Sa, 1 Hz, old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_5s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_5s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Here are seismograms and RotD spectra for the largest-amplitude rupture in this 0.5 Hz vs. 1 Hz test:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz.png|thumb|800px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz_rotd.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 3s Amplitude Scatter plots ===&lt;br /&gt;
&lt;br /&gt;
These plots show that (left) ERF 61 exactly reproduces ERF 58, and (right) that there are indeed differences going from 0.5 Hz to 1 Hz:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Erf_58_61_USC_compare.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_58_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 1 Hz vs 0.5 Hz comparisons ===&lt;br /&gt;
&lt;br /&gt;
Here is a comparison of 1 Hz runs (y axis) and 0.5 Hz runs (x axis). The top row is a recent USC run with the RSQSim ERF. The bottom row is a previous test with UCERF2-CyberShake and the WNGC site. The columns are 3s, 5s, and 10s SA, respectively, all geometric mean.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
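A minimal sketch of the Qs update described above (illustrative, not the SGT header generation code); note that at this study's 500 m/s Vs floor the new rule happens to reproduce the old hard-coded default:&lt;br /&gt;

```python
def qs_from_vs(vs_ms):
    """New rule: Qs = 0.05 * Vs (Vs in m/s), replacing the old
    behavior of always using a default Qs of 25."""
    return 0.05 * vs_ms

print(qs_from_vs(500.0))   # 25.0 -- matches the old default at the Vs floor
print(qs_from_vs(2000.0))  # 100.0
```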
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for awhile, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 8000 timesteps (400 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: PGV, plus RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
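For the PGV case, the RotD measures listed above can be sketched as follows. This is an assumption-laden illustration, not the CyberShake implementation: `rotd_pgv` and its azimuth convention are hypothetical names, and for spectral accelerations the rotation is applied to the oscillator response at each period rather than to the raw trace.&lt;br /&gt;

```python
import numpy as np

def rotd_pgv(vel_x, vel_y):
    """Rotate the two horizontal velocity traces through azimuths
    0-179 degrees, take the peak absolute value at each azimuth, then
    report the median (RotD50), the maximum (RotD100), and the azimuth
    whose peak is closest to RotD50."""
    thetas = np.deg2rad(np.arange(180))
    peaks = np.array([np.max(np.abs(vel_x * np.cos(t) + vel_y * np.sin(t)))
                      for t in thetas])
    rotd50 = float(np.median(peaks))
    return {"RotD50": rotd50,
            "RotD100": float(np.max(peaks)),
            "az50": int(np.argmin(np.abs(peaks - rotd50)))}

# Sanity check: motion polarized along x gives RotD50/RotD100 = cos(45 deg)
t = np.linspace(0.0, 10.0, 1000)
res = rotd_pgv(np.sin(2 * np.pi * t), np.zeros_like(t))
```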
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec, and PGV.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We derived these estimates by scaling from site USC (the average site has 3.8% more events and a volume 9.7% larger).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime !! DirectSynth nodes !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 || 36 || 10.8&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 || 36 || 11.2&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
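The totals in the two tables above can be reproduced as follows. This is a back-of-the-envelope sketch; in particular, the factor of two on the SGT term (one SGT run per horizontal component) is inferred from the table arithmetic rather than stated above.&lt;br /&gt;

```python
# Reproducing the Summit node-hour totals from the per-job figures.
SITES = 335
MARGIN = 1.10  # 10% overrun margin

def sgt_node_hours(ucvm_s, ucvm_nodes, sgt_s, sgt_nodes, other_node_s):
    # SGT job counted twice per site (assumed: one run per component)
    node_sec = ucvm_s * ucvm_nodes + 2 * sgt_s * sgt_nodes + other_node_s
    return node_sec / 3600.0

usc_sgt = sgt_node_hours(372, 80, 2628, 67, 1510)    # ~106.5 node-hrs
avg_sgt = sgt_node_hours(408, 80, 2883, 67, 1550)    # ~116.8 node-hrs
sgt_total = avg_sgt * SITES * MARGIN                 # ~43k node-hrs

avg_pp = 1122 * 36 / 3600.0                          # ~11.2 node-hrs
pp_total = avg_pp * SITES * MARGIN                   # ~4.2k node-hrs
```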
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 4.5 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 4.7 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.5 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first and don't delete anything.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to at least 300 TB so we don't need to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.5 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 17 entries/event * 76786 events/site * 335 sites = 41 GB.  The drive on moment with the mysql database has 919 GB free.&lt;br /&gt;
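Reproducing that arithmetic (the 17 entries/event figure is consistent with the database products listed earlier: RotD50 and RotD100 at 6 periods each, PGV, and 4 duration values, though that breakdown is our inference):&lt;br /&gt;

```python
# PeakAmplitudes table size estimate, per the figures above.
BYTES_PER_ENTRY = 100
ENTRIES_PER_EVENT = 17
EVENTS_PER_SITE = 76_786
SITES = 335

total_bytes = BYTES_PER_ENTRY * ENTRIES_PER_EVENT * EVENTS_PER_SITE * SITES
print(f"{total_bytes / 2**30:.1f} GiB")  # ~40.7 GiB, i.e. the ~41 GB quoted
```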
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Restore improvements to codes since ERF 58, and rerun USC for ERF 62&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Fix h/4 issue and rerun USC test.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Create prioritized site list.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Hold science readiness review.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add link to fault geometry on Zenodo, either on the wiki or the fault metadata page.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add copy of science readiness review slides to wiki.&amp;lt;/s&amp;gt;  &lt;br /&gt;
*&amp;lt;s&amp;gt;Generate 0.5 Hz v 1 Hz scatterplot from UCERF2 ERF run.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Go through config updates as a pair to confirm they have been correctly applied.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using GO to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify job sizes and runtimes.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*&amp;lt;s&amp;gt;Prepare cronjob on Summit for monitoring jobs.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Call with OLCF staff&lt;br /&gt;
*Activate script for monitoring x509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use h/4 as default for surface point.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold technical readiness review.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify nt to 8000 (400 sec).&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add calculation of PGV.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test calculation and insertion of PGV.&lt;br /&gt;
*Test SGT-only and PP-only create/plan/run scripts.&lt;br /&gt;
*Fix issue with moving runs into Verified state&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;br /&gt;
&lt;br /&gt;
Science Readiness Review: [[Media:CyberShake_Study_21.12_Readiness_Review.odp | ODP]], [[Media:CyberShake_Study_21.12_Readiness_Review.pdf | PDF]]&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26315</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26315"/>
		<updated>2021-12-08T18:36:52Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Verification Tests */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs on Summit, then post-processing them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity&lt;br /&gt;
*Compare spatial distribution of ground motions (including  directivity) to empirical and kinematic models&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page], and the catalog and input fault geometry files can be [https://zenodo.org/record/5542222 downloaded from Zenodo]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run a list of 335 sites, taken from the site list that was used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs, select 5km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
=== USC Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Hazard curve comparisons for site USC, between:&lt;br /&gt;
&lt;br /&gt;
*ERF 58: 0.5 Hz RSQSim ERF used for Milner et al. (2021) calculations, May 2020&lt;br /&gt;
*ERF 61: 0.5 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
*ERF 62: 1 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
&lt;br /&gt;
The first test run of ERF 61 used the wrong (older) mesh lower depth of 40 km, which is why the top right plot differs slightly from the top left. The middle left plot agrees perfectly with the top left.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 58, Original'''&lt;br /&gt;
| '''ERF 61 w/ wrong depth'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF58.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF61_PREV.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61 w/ corrected depth'''&lt;br /&gt;
| '''ERF 62 (1 Hz) with old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 62 (1 Hz) with proposed (new) simulation parameters'''&lt;br /&gt;
| '''ERF 62 scatter comparing new parameters (y) and old parameters (x)'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ASK2014.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_62prev_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The first 1 Hz test run (bottom right above) uses the same simulation parameters as used with ERF 58 and 61, changing only the frequency. Some differences are apparent, which also persist for longer-period (5s) curves:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61, 5s Sa, 0.5 Hz, old simulation parameters'''&lt;br /&gt;
| '''ERF 62, 5s Sa, 1 Hz, old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_5s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_5s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Here are seismograms and RotD spectra for the largest-amplitude rupture in this 0.5 Hz vs. 1 Hz test:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz.png|thumb|800px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz_rotd.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 3s Amplitude Scatter plots ===&lt;br /&gt;
&lt;br /&gt;
These plots show that (left) ERF 61 exactly reproduces ERF 58, and (right) that there are indeed differences going from 0.5 Hz to 1 Hz:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Erf_58_61_USC_compare.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_58_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 1 Hz vs 0.5 Hz comparisons ===&lt;br /&gt;
&lt;br /&gt;
Here is a comparison of 1 Hz runs (y axis) and 0.5 Hz runs (x axis). The top row is a recent USC run with the RSQSim ERF. The bottom row is a previous test with UCERF2-CyberShake and the WNGC site. The columns are 3s, 5s, and 10s SA, respectively, all geometric mean.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for awhile, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 8000 timesteps (400 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: PGV, RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec, and PGV.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on site USC, scaled to the average site (which has 3.8% more events and a 9.7% larger volume).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
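A quick sanity check of that arithmetic (per-site cost and margin are the figures quoted above):&lt;br /&gt;

```python
# Sanity check of the SGT node-hour estimate (values from the tables above;
# the 10% overrun margin is the one quoted in the text).
sites = 335
avg_node_hrs = 116.8   # estimated average per-site Summit cost
margin = 1.10          # 10% overrun margin
total = sites * avg_node_hrs * margin
print(round(total / 1000.0, 1), "k node-hours")  # roughly 43k
```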
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime !! DirectSynth nodes !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 sec || 36 || 10.8 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 sec || 36 || 11.2 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 4.5 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 4.7 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.5 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first and don't delete anything.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to at least 300 TB so we don't need to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.5 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 17 entries/event * 76786 events/site * 335 sites = 41 GB.  The drive on moment with the MySQL database has 919 GB free.&lt;br /&gt;
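The arithmetic above can be sketched as follows (all inputs are the numbers from the text; GB is taken as 2**30 bytes):&lt;br /&gt;

```python
# Sketch of the PeakAmplitudes storage estimate above (numbers from the text).
bytes_per_entry = 100
entries_per_event = 17     # intensity-measure entries stored per event
events_per_site = 76786
sites = 335
total_bytes = bytes_per_entry * entries_per_event * events_per_site * sites
print(round(total_bytes / 2**30), "GB")  # roughly 41 GB
```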
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Restore improvements to codes since ERF 58, and rerun USC for ERF 62&amp;lt;/s&amp;gt;&lt;br /&gt;
*Fix h/4 issue and rerun USC test.&lt;br /&gt;
*&amp;lt;s&amp;gt;Create prioritized site list.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Hold science readiness review.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add link to fault geometry on Zenodo, either on the wiki or the fault metadata page.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add copy of science readiness review slides to wiki.&amp;lt;/s&amp;gt;  &lt;br /&gt;
*&amp;lt;s&amp;gt;Generate 0.5 Hz v 1 Hz scatterplot from UCERF2 ERF run.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Go through config updates as a pair to confirm they have been correctly applied.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using Globus Online (GO) to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify job sizes and runtimes.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*&amp;lt;s&amp;gt;Prepare cronjob on Summit for monitoring jobs.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Call with OLCF staff&lt;br /&gt;
*Activate script for monitoring x509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use h/4 as default for surface point.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold technical readiness review.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify nt to 8000 (400 sec).&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add calculation of PGV.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test calculation and insertion of PGV.&lt;br /&gt;
*Test SGT-only and PP-only create/plan/run scripts.&lt;br /&gt;
*Fix issue with moving runs into Verified state&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;br /&gt;
&lt;br /&gt;
Science Readiness Review: [[Media:CyberShake_Study_21.12_Readiness_Review.odp | ODP]], [[Media:CyberShake_Study_21.12_Readiness_Review.pdf | PDF]]&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ASK2014.png&amp;diff=26314</id>
		<title>File:USC curves 3s ASK2014.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ASK2014.png&amp;diff=26314"/>
		<updated>2021-12-08T18:34:09Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Erf_62prev_62_USC_compare.png&amp;diff=26313</id>
		<title>File:Erf 62prev 62 USC compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Erf_62prev_62_USC_compare.png&amp;diff=26313"/>
		<updated>2021-12-08T18:33:29Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26303</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26303"/>
		<updated>2021-12-07T01:39:19Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* 1 Hz vs 0.5 Hz comparisons */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs, on Summit, then post-process them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity&lt;br /&gt;
*Compare spatial distribution of ground motions (including  directivity) to empirical and kinematic models&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page], and the catalog and input fault geometry files can be [https://zenodo.org/record/5542222 downloaded from Zenodo]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run a list of 335 sites, taken from the site list that was used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs, select 5km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
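A minimal sketch of this query-depth rule (the function name is illustrative, not from the CyberShake code):&lt;br /&gt;

```python
# The surface grid point is populated from the velocity model at a depth of
# h/4, where h is the grid spacing, so the value represents the [0, h/2] layer.
def surface_query_depth(grid_spacing_m):
    """Depth in meters at which to query UCVM for the surface grid point."""
    return grid_spacing_m / 4.0

print(surface_query_depth(100.0))  # 25.0 for this study's 100 m spacing
```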
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
=== USC Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Hazard curve comparisons for site USC, between:&lt;br /&gt;
&lt;br /&gt;
*ERF 58: 0.5 Hz RSQSim ERF used for Milner et al. (2021) calculations, May 2020&lt;br /&gt;
*ERF 61: 0.5 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
*ERF 62: 1 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
&lt;br /&gt;
The first test run of ERF 61 used the wrong (older) mesh lower depth of 40 km, which is why the top right plot differs slightly from the top left. The bottom left plot agrees perfectly with the top left.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 58, Original'''&lt;br /&gt;
| '''ERF 61 w/ wrong depth'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF58.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF61_PREV.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61 w/ corrected depth'''&lt;br /&gt;
| '''ERF 62 (1 Hz) with old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The first 1 Hz test run (bottom right above) uses the same simulation parameters as ERF 58 and 61, changing only the frequency. Some differences are apparent, which also persist for longer-period (5s) curves:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61, 5s Sa, 0.5 Hz, old simulation parameters'''&lt;br /&gt;
| '''ERF 62, 5s Sa, 1 Hz, old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_5s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_5s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Here are seismograms and RotDs for the largest-amplitude rupture in this 0.5 Hz vs 1 Hz test:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz.png|thumb|800px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz_rotd.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 3s Amplitude Scatter plots ===&lt;br /&gt;
&lt;br /&gt;
These plots show that (left) ERF 61 exactly reproduces ERF 58, and (right) that there are indeed differences going from 0.5 Hz to 1 Hz:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Erf_58_61_USC_compare.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_58_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 1 Hz vs 0.5 Hz comparisons ===&lt;br /&gt;
&lt;br /&gt;
Here is a comparison of 1 Hz runs (y axis) and 0.5 Hz runs (x axis). The top row is a recent USC run with the RSQSim ERF. The bottom row is a previous test with UCERF2-CyberShake and the WNGC site. The columns are 3s, 5s, and 10s SA, respectively, all geometric mean.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
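As an illustration of the RWG-to-AWP z-coordinate fix in the list above (the function name is hypothetical, not taken from either code):&lt;br /&gt;

```python
# Both RWG and AWP use z=1 to represent the surface, so the z-coordinate
# carries over unchanged; the old (buggy) conversion added 1.
def rwg_to_awp_z(rwg_z):
    return rwg_z  # previously (incorrectly): rwg_z + 1

print(rwg_to_awp_z(1))  # surface stays at z=1
```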
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for a while, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 8000 timesteps (400 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on site USC, scaled to the average site (which has 3.8% more events and a 9.7% larger volume).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime !! DirectSynth nodes !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 sec || 36 || 10.8 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 sec || 36 || 11.2 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 4.5 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 4.7 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.5 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first and don't delete anything.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to at least 300 TB so we don't need to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.5 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 16 entries/event * 76786 events/site * 335 sites = 38 GB.  The drive on moment with the MySQL database has 919 GB free.&lt;br /&gt;
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Restore improvements to codes since ERF 58, and rerun USC for ERF 62&amp;lt;/s&amp;gt;&lt;br /&gt;
*Fix h/4 issue and rerun USC test.&lt;br /&gt;
*&amp;lt;s&amp;gt;Create prioritized site list.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Hold science readiness review.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add link to fault geometry on Zenodo, either on the wiki or the fault metadata page.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add copy of science readiness review slides to wiki.&amp;lt;/s&amp;gt; [[Media:CyberShake_Study_21.12_Readiness_Review.pdf]] [[Media:CyberShake_Study_21.12_Readiness_Review.odp]]&lt;br /&gt;
*&amp;lt;s&amp;gt;Generate 0.5 Hz v 1 Hz scatterplot from UCERF2 ERF run.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Go through config updates as a pair to confirm they have been correctly applied.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using Globus Online (GO) to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify job sizes and runtimes.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*&amp;lt;s&amp;gt;Prepare cronjob on Summit for monitoring jobs.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Call with OLCF staff&lt;br /&gt;
*Activate script for monitoring x509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use h/4 as default for surface point.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold technical readiness review.&lt;br /&gt;
*Modify nt to 8000 (400 sec).&lt;br /&gt;
*Add calculation of PGV.&lt;br /&gt;
*Test SGT-only and PP-only create/plan/run scripts.&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26302</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26302"/>
		<updated>2021-12-07T01:35:36Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Production Checklist */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs, on Summit, then post-process them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity&lt;br /&gt;
*Compare spatial distribution of ground motions (including  directivity) to empirical and kinematic models&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page], and the catalog and input fault geometry files can be [https://zenodo.org/record/5542222 downloaded from Zenodo]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run a list of 335 sites, taken from the site list that was used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs, select 5km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
=== USC Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Hazard curve comparisons for site USC, between:&lt;br /&gt;
&lt;br /&gt;
*ERF 58: 0.5 Hz RSQSim ERF used for Milner et al. (2021) calculations, May 2020&lt;br /&gt;
*ERF 61: 0.5 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
*ERF 62: 1 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
&lt;br /&gt;
The first test run of ERF 61 used the wrong (older) mesh lower depth of 40 km, which is why the top right plot differs slightly from the top left. The bottom left plot agrees perfectly with the top left.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 58, Original'''&lt;br /&gt;
| '''ERF 61 w/ wrong depth'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF58.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF61_PREV.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61 w/ corrected depth'''&lt;br /&gt;
| '''ERF 62 (1 Hz) with old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The first 1 Hz test run (bottom right above) uses the same simulation parameters as ERF 58 and 61, changing only the frequency. Some differences are apparent, which also persist for longer-period (5s) curves:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61, 5s Sa, 0.5 Hz, old simulation parameters'''&lt;br /&gt;
| '''ERF 62, 5s Sa, 1 Hz, old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_5s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_5s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Here are seismograms and RotD spectra for the largest-amplitude rupture in this 0.5 Hz vs 1 Hz test:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz.png|thumb|800px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz_rotd.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 3s Amplitude Scatter plots ===&lt;br /&gt;
&lt;br /&gt;
These plots show that (left) ERF 61 exactly reproduces ERF 58, and (right) that there are indeed differences going from 0.5 Hz to 1 Hz:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Erf_58_61_USC_compare.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_58_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 1 Hz vs 0.5 Hz comparisons ===&lt;br /&gt;
&lt;br /&gt;
Here is a comparison of 1 Hz runs (y axis) and 0.5 Hz runs (x axis). The top row is a recent USC run with the RSQSim ERF. The bottom row is a previous test at the WNGC site with UCERF2-CyberShake.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for awhile, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products, which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 8000 timesteps (400 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on scaling from site USC (the average site has 3.8% more events and a 9.7% larger volume).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime !! DirectSynth nodes !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 sec || 36 || 10.8 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 sec || 36 || 11.2 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
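The per-site and study totals above can be reproduced with a short back-of-the-envelope script (a sketch; the factor of two assumes separate X and Y SGT simulations per site, which is consistent with the Summit totals in the tables):&lt;br /&gt;

```python
def sgt_node_hours(ucvm_sec, ucvm_nodes, sgt_sec, sgt_nodes, other_node_sec,
                   components=2):
    # components=2 assumes separate X and Y SGT runs per site; with that
    # assumption the Summit totals in the SGT table are reproduced.
    node_sec = ucvm_sec * ucvm_nodes + components * sgt_sec * sgt_nodes + other_node_sec
    return node_sec / 3600.0

usc_hrs = sgt_node_hours(372, 80, 2628, 67, 1510)   # about 106.5 node-hrs
avg_hrs = sgt_node_hours(408, 80, 2883, 67, 1550)   # about 116.8 node-hrs

# 335 sites plus a 10% overrun margin gives roughly 43k node-hours for SGTs.
sgt_total = avg_hrs * 335 * 1.1

# Post-processing: DirectSynth runtime (sec) x nodes, averaged and scaled;
# roughly 4.1k node-hours, quoted above as 4.2k after rounding up.
pp_total = (1122 * 36 / 3600.0) * 335 * 1.1
```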
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 4.5 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 4.7 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.5 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first and don't delete anything.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to at least 300 TB so we don't need to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.5 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 16 entries/event * 76786 events/site * 335 sites = 38 GB.  The drive on moment with the MySQL database has 919 GB free.&lt;br /&gt;
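As a quick sketch of the arithmetic (the 16 entries per event are consistent with the database products listed above: two RotD measures at six periods plus four duration values):&lt;br /&gt;

```python
bytes_per_entry = 100
entries_per_event = 16      # 2 RotD measures x 6 periods + 4 duration values
events_per_site = 76786
sites = 335

total_bytes = bytes_per_entry * entries_per_event * events_per_site * sites
total_gib = total_bytes / 2**30
print(round(total_gib))     # about 38 (GB, binary)
```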
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Restore improvements to codes since ERF 58, and rerun USC for ERF 62&amp;lt;/s&amp;gt;&lt;br /&gt;
*Fix h/4 issue and rerun USC test.&lt;br /&gt;
*&amp;lt;s&amp;gt;Create prioritized site list.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Hold science readiness review.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add link to fault geometry on Zenodo, either on the wiki or the fault metadata page.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Add copy of science readiness review slides to wiki.&amp;lt;/s&amp;gt; [[Media:CyberShake_Study_21.12_Readiness_Review.pdf]] [[Media:CyberShake_Study_21.12_Readiness_Review.odp]]&lt;br /&gt;
*&amp;lt;s&amp;gt;Generate 0.5 Hz v 1 Hz scatterplot from UCERF2 ERF run.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Go through config updates as a pair to confirm they have been correctly applied.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using GO to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify job sizes and runtimes.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*&amp;lt;s&amp;gt;Prepare cronjob on Summit for monitoring jobs.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Call with OLCF staff&lt;br /&gt;
*Activate script for monitoring x509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use h/4 as default for surface point.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold technical readiness review.&lt;br /&gt;
*Modify nt to 8000 (400 sec).&lt;br /&gt;
*Add calculation of PGV.&lt;br /&gt;
*Test SGT-only and PP-only create/plan/run scripts.&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CyberShake_Study_21.12_Readiness_Review.odp&amp;diff=26301</id>
		<title>File:CyberShake Study 21.12 Readiness Review.odp</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CyberShake_Study_21.12_Readiness_Review.odp&amp;diff=26301"/>
		<updated>2021-12-07T01:33:45Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CyberShake_Study_21.12_Readiness_Review.pdf&amp;diff=26300</id>
		<title>File:CyberShake Study 21.12 Readiness Review.pdf</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CyberShake_Study_21.12_Readiness_Review.pdf&amp;diff=26300"/>
		<updated>2021-12-07T01:33:13Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26299</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26299"/>
		<updated>2021-12-07T01:31:37Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Verification Tests */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs, on Summit, then post-process them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity.&lt;br /&gt;
*Compare spatial distribution of ground motions (including directivity) to empirical and kinematic models.&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page], and the catalog and input fault geometry files can be [https://zenodo.org/record/5542222 downloaded from Zenodo]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May 2020.&lt;br /&gt;
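As a quick consistency check (illustrative only; variable names are ours), the stated per-event probability implies a region-wide M6.5+ rate:&lt;br /&gt;

```python
catalog_years = 715_000            # catalog length, in years
num_events = 220_927               # M6.5+ earthquakes in the catalog
prob_per_event = 1.0 / catalog_years

# Implied region-wide annual rate of M6.5+ events, about 0.31 per year.
annual_rate = num_events / catalog_years
```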
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run 335 sites, drawn from the site list used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs and select 5 km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100 m, so we will query UCVM at a depth of 25 m and use that value for the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties over [0, 50 m], which are better captured by the value at 25 m than by the value at 0 m.&lt;br /&gt;
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
=== USC Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Hazard curve comparisons for site USC, between:&lt;br /&gt;
&lt;br /&gt;
*ERF 58: 0.5 Hz RSQSim ERF used for Milner et al. (2021) calculations, May 2020&lt;br /&gt;
*ERF 61: 0.5 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
*ERF 62: 1 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
&lt;br /&gt;
The first test run of ERF 61 used the wrong (older) mesh lower depth of 40 km, which is why the top right plot differs slightly from the top left. The bottom left plot agrees perfectly with the top left.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 58, Original'''&lt;br /&gt;
| '''ERF 61 w/ wrong depth'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF58.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF61_PREV.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61 w/ corrected depth'''&lt;br /&gt;
| '''ERF 62 (1 Hz) with old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The first 1 Hz test run (bottom right above) uses the same simulation parameters as ERF 58 and 61, changing only the frequency. Some differences are apparent, which also persist for longer-period (5s) curves:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61, 5s Sa, 0.5 Hz, old simulation parameters'''&lt;br /&gt;
| '''ERF 62, 5s Sa, 1 Hz, old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_5s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_5s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Here are seismograms and RotD spectra for the largest-amplitude rupture in this 0.5 Hz vs 1 Hz test:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz.png|thumb|800px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz_rotd.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 3s Amplitude Scatter plots ===&lt;br /&gt;
&lt;br /&gt;
These plots show that (left) ERF 61 exactly reproduces ERF 58, and (right) that there are indeed differences going from 0.5 Hz to 1 Hz:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Erf_58_61_USC_compare.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_58_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 1 Hz vs 0.5 Hz comparisons ===&lt;br /&gt;
&lt;br /&gt;
Here is a comparison of 1 Hz runs (y axis) and 0.5 Hz runs (x axis). The top row is a recent USC run with the RSQSim ERF. The bottom row is a previous test at the WNGC site with UCERF2-CyberShake.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:USC_7236_vs_7237_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_3.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_5.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
| [[File:WNGC_3842_vs_3837_scatter_10.0s_GEOM_compare.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for awhile, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products, which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 8000 timesteps (400 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on scaling from site USC (the average site has 3.8% more events and a 9.7% larger volume).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime !! DirectSynth nodes !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 sec || 36 || 10.8 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 sec || 36 || 11.2 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 4.5 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 4.7 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.5 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first and don't delete anything.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to at least 300 TB so we don't need to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.5 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 16 entries/event * 76786 events/site * 335 sites = 38 GB.  The drive on moment with the MySQL database has 919 GB free.&lt;br /&gt;
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Restore improvements to codes since ERF 58, and rerun USC for ERF 62&amp;lt;/s&amp;gt;&lt;br /&gt;
*Fix h/4 issue and rerun USC test.&lt;br /&gt;
*&amp;lt;s&amp;gt;Create prioritized site list.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold science readiness review.&lt;br /&gt;
*Add link to fault geometry on Zenodo, either on the wiki or the fault metadata page.&lt;br /&gt;
*Add copy of science readiness review slides to wiki.&lt;br /&gt;
*Generate 0.5 Hz v 1 Hz scatterplot from UCERF2 ERF run.&lt;br /&gt;
*Go through config updates as a pair to confirm they have been correctly applied.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using GO to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify job sizes and runtimes.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*&amp;lt;s&amp;gt;Prepare cronjob on Summit for monitoring jobs.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Call with OLCF staff&lt;br /&gt;
*Activate script for monitoring x509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use h/4 as default for surface point.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold technical readiness review.&lt;br /&gt;
*Modify nt to 8000 (400 sec).&lt;br /&gt;
*Add calculation of PGV.&lt;br /&gt;
*Test SGT-only and PP-only create/plan/run scripts.&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3842_vs_3837_scatter_10.0s_GEOM_compare.png&amp;diff=26298</id>
		<title>File:WNGC 3842 vs 3837 scatter 10.0s GEOM compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3842_vs_3837_scatter_10.0s_GEOM_compare.png&amp;diff=26298"/>
		<updated>2021-12-07T01:29:50Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3842_vs_3837_scatter_5.0s_GEOM_compare.png&amp;diff=26297</id>
		<title>File:WNGC 3842 vs 3837 scatter 5.0s GEOM compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3842_vs_3837_scatter_5.0s_GEOM_compare.png&amp;diff=26297"/>
		<updated>2021-12-07T01:29:41Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3842_vs_3837_scatter_3.0s_GEOM_compare.png&amp;diff=26296</id>
		<title>File:WNGC 3842 vs 3837 scatter 3.0s GEOM compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3842_vs_3837_scatter_3.0s_GEOM_compare.png&amp;diff=26296"/>
		<updated>2021-12-07T01:29:33Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_7236_vs_7237_scatter_10.0s_GEOM_compare.png&amp;diff=26295</id>
		<title>File:USC 7236 vs 7237 scatter 10.0s GEOM compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_7236_vs_7237_scatter_10.0s_GEOM_compare.png&amp;diff=26295"/>
		<updated>2021-12-07T01:25:54Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_7236_vs_7237_scatter_5.0s_GEOM_compare.png&amp;diff=26294</id>
		<title>File:USC 7236 vs 7237 scatter 5.0s GEOM compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_7236_vs_7237_scatter_5.0s_GEOM_compare.png&amp;diff=26294"/>
		<updated>2021-12-07T01:25:43Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_7236_vs_7237_scatter_3.0s_GEOM_compare.png&amp;diff=26293</id>
		<title>File:USC 7236 vs 7237 scatter 3.0s GEOM compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_7236_vs_7237_scatter_3.0s_GEOM_compare.png&amp;diff=26293"/>
		<updated>2021-12-07T01:25:33Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3837_vs_3842_scatter_3.0s_GEOM_compare_diff.png&amp;diff=26292</id>
		<title>File:WNGC 3837 vs 3842 scatter 3.0s GEOM compare diff.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3837_vs_3842_scatter_3.0s_GEOM_compare_diff.png&amp;diff=26292"/>
		<updated>2021-12-07T01:19:34Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3837_vs_3842_scatter_3.0s_GEOM_compare.png&amp;diff=26291</id>
		<title>File:WNGC 3837 vs 3842 scatter 3.0s GEOM compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:WNGC_3837_vs_3842_scatter_3.0s_GEOM_compare.png&amp;diff=26291"/>
		<updated>2021-12-07T01:18:56Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26290</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26290"/>
		<updated>2021-12-07T00:46:18Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* ERF */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs, on Summit, then post-process them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity.&lt;br /&gt;
*Compare spatial distribution of ground motions (including directivity) to empirical and kinematic models.&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page], and the catalog and input fault geometry files can be [https://zenodo.org/record/5542222 downloaded from Zenodo]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run 335 sites, taken from the site list used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs, select 5km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
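&lt;br /&gt;
As a sketch of the rule above (the function name is ours, not part of the CyberShake codebase):&lt;br /&gt;

```python
# Sketch of the surface-point sampling rule described above; the function
# name is illustrative, not part of the CyberShake codebase.
def surface_query_depth_m(grid_spacing_m):
    """Depth at which UCVM is queried to populate the surface grid point.

    The surface point represents material properties over [0, h/2], so we
    sample at the midpoint of that interval, h/4.
    """
    return grid_spacing_m / 4.0

# For this study's 100 m grid spacing, UCVM is queried at 25 m depth.
print(surface_query_depth_m(100.0))  # 25.0
```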
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
=== USC Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Hazard curve comparisons for site USC, between:&lt;br /&gt;
&lt;br /&gt;
*ERF 58: 0.5 Hz RSQSim ERF used for Milner et al. (2021) calculations, May 2020&lt;br /&gt;
*ERF 61: 0.5 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
*ERF 62: 1 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
&lt;br /&gt;
The first test run of ERF 61 used the wrong (older) mesh lower depth of 40 km, which is why the top right plot differs slightly from the top left. The bottom left plot agrees perfectly with the top left.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 58, Original'''&lt;br /&gt;
| '''ERF 61 w/ wrong depth'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF58.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF61_PREV.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61 w/ corrected depth'''&lt;br /&gt;
| '''ERF 62 (1 Hz) with old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The first 1 Hz test run (bottom right above) uses the same simulation parameters as ERF 58 and 61, changing only the frequency. Some differences are apparent, which also persist for longer period (5s) curves:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61, 5s Sa, 0.5 Hz, old simulation parameters'''&lt;br /&gt;
| '''ERF 62, 5s Sa, 1 Hz, old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_5s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_5s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Here are seismograms and RotD values for the largest-amplitude rupture in this 0.5 Hz vs 1 Hz test:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz.png|thumb|800px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz_rotd.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 3s Amplitude Scatter plots ===&lt;br /&gt;
&lt;br /&gt;
These plots show that (left) ERF 61 exactly reproduces ERF 58, and (right) that there are indeed differences going from 0.5 Hz to 1 Hz:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Erf_58_61_USC_compare.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_58_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for a while, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 8000 timesteps (400 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on scaling from site USC (the average site has 3.8% more events and a volume 9.7% larger).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime !! DirectSynth nodes !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 sec || 36 || 10.8 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 sec || 36 || 11.2 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
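&lt;br /&gt;
The scaling behind the two totals above can be reproduced with a short calculation (a sketch; the function name is ours, and the per-site figures are the USC-derived averages from the tables):&lt;br /&gt;

```python
# Sketch of the study-wide node-hour scaling; names are illustrative.
def study_node_hours(per_site_node_hours, n_sites=335, margin=1.10):
    """Scale an average per-site node-hour estimate to all 335 sites,
    adding a 10% overrun margin."""
    return per_site_node_hours * n_sites * margin

sgt_total = study_node_hours(116.8)  # ~43,000 node-hours for SGTs
pp_total = study_node_hours(11.2)    # ~4,100 node-hours for post-processing
```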
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 4.5 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 4.7 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.5 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first and don't delete anything.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to at least 300 TB so we don't need to rely on cleanup.&lt;br /&gt;
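&lt;br /&gt;
The per-site averages scale to the study-wide totals as follows (a sketch; the names are ours, and TB values use a 1024 GB/TB conversion, which matches the table's totals):&lt;br /&gt;

```python
# Sketch reproducing the data-estimate table totals; per-site figures are
# the "Average" row, and TB uses a 1024 GB/TB conversion.
per_site_gb = {
    "velocity mesh": 267.0,
    "SGTs": 203.0,
    "temp data": 470.0,
    "output data": 4.7,
}
n_sites = 335
totals_tb = {name: gb * n_sites / 1024.0 for name, gb in per_site_gb.items()}
# ~87, ~66, ~153, and ~1.5 TB; the 307 TB figure sums the truncated values.
grand_total_tb = sum(totals_tb.values())
```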
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.5 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 16 entries/event * 76786 events/site * 335 sites = 38 GB.  The drive on moment with the MySQL database has 919 GB free.&lt;br /&gt;
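&lt;br /&gt;
The database estimate above works out as follows (a sketch of the same arithmetic; the 16 entries/event match the 12 RotD plus 4 duration values listed under Database data products):&lt;br /&gt;

```python
# Sketch of the PeakAmplitudes table sizing arithmetic above.
bytes_per_entry = 100
entries_per_event = 16   # 12 RotD + 4 duration values per event
events_per_site = 76786
n_sites = 335

total_bytes = bytes_per_entry * entries_per_event * events_per_site * n_sites
total_gib = total_bytes / 2**30  # ~38 GB, well within the 919 GB free
```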
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Restore improvements to codes since ERF 58, and rerun USC for ERF 62&amp;lt;/s&amp;gt;&lt;br /&gt;
*Fix h/4 issue and rerun USC test.&lt;br /&gt;
*&amp;lt;s&amp;gt;Create prioritized site list.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold science readiness review.&lt;br /&gt;
*Add link to fault geometry on Zenodo, either on the wiki or the fault metadata page.&lt;br /&gt;
*Add copy of science readiness review slides to wiki.&lt;br /&gt;
*Generate 0.5 Hz v 1 Hz scatterplot from UCERF2 ERF run.&lt;br /&gt;
*Go through config updates as a pair to confirm they have been correctly applied.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using GO to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify job sizes and runtimes.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*&amp;lt;s&amp;gt;Prepare cronjob on Summit for monitoring jobs.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Call with OLCF staff&lt;br /&gt;
*Activate script for monitoring x509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use h/4 as default for surface point.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold technical readiness review.&lt;br /&gt;
*Modify nt to 8000 (400 sec).&lt;br /&gt;
*Add calculation of PGV.&lt;br /&gt;
*Test SGT-only and PP-only create/plan/run scripts.&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26264</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26264"/>
		<updated>2021-12-03T19:26:30Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Verification Tests */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs, on Summit, then post-process them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity.&lt;br /&gt;
*Compare spatial distribution of ground motions (including directivity) to empirical and kinematic models.&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run 335 sites, taken from the site list used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs, select 5km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
=== USC Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Hazard curve comparisons for site USC, between:&lt;br /&gt;
&lt;br /&gt;
*ERF 58: 0.5 Hz RSQSim ERF used for Milner et al. (2021) calculations, May 2020&lt;br /&gt;
*ERF 61: 0.5 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
*ERF 62: 1 Hz RSQSim production ERF with the same catalog&lt;br /&gt;
&lt;br /&gt;
The first test run of ERF 61 used the wrong (older) mesh lower depth of 40 km, which is why the top right plot differs slightly from the top left. The bottom left plot agrees perfectly with the top left.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 58, Original'''&lt;br /&gt;
| '''ERF 61 w/ wrong depth'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF58.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF61_PREV.png|thumb|500px]]&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61 w/ corrected depth'''&lt;br /&gt;
| '''ERF 62 (1 Hz) with old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_3s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_3s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The first 1 Hz test run (bottom right above) uses the same simulation parameters as ERF 58 and 61, changing only the frequency. Some differences are apparent, which also persist for longer period (5s) curves:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
| '''ERF 61, 5s Sa, 0.5 Hz, old simulation parameters'''&lt;br /&gt;
| '''ERF 62, 5s Sa, 1 Hz, old simulation parameters'''&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_curves_5s_ERF61.png|thumb|500px]]&lt;br /&gt;
| [[File:USC_curves_5s_ERF62_FIRST.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Here are seismograms and RotD values for the largest-amplitude rupture in this 0.5 Hz vs 1 Hz test:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz.png|thumb|800px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_s1069_r0_1_v_0.5hz_rotd.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== 3s Amplitude Scatter plots ===&lt;br /&gt;
&lt;br /&gt;
These plots show that (left) ERF 61 exactly reproduces ERF 58, and (right) that there are indeed differences going from 0.5 Hz to 1 Hz:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Erf_58_61_USC_compare.png|thumb|500px]]&lt;br /&gt;
| [[File:Erf_58_62_USC_compare.png|thumb|500px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for a while, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 6000 timesteps (300 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on scaling from site USC (the average site has 3.8% more events and a 9.7% larger volume).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime (sec) !! DirectSynth nodes !! Summit Total (node-hrs)&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 || 36 || 10.8&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 || 36 || 11.2&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
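As a quick check on the totals above, the per-site averages from the two tables scale to the study-wide figures as follows (a back-of-envelope sketch; the site count and 10% margin are taken from the text):

```python
SITES = 335
MARGIN = 1.10  # 10% overrun margin

sgt_per_site = 116.8  # node-hrs, estimated average from the SGT table
pp_per_site = 11.2    # node-hrs, estimated average from the PP table

sgt_total = sgt_per_site * SITES * MARGIN  # ~43k node-hrs
pp_total = pp_per_site * SITES * MARGIN    # ~4.1k node-hrs, rounded up to 4.2k
```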
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 3.4 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 3.5 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.2 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to 400 TB to avoid having to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.2 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 16 entries/event * 76786 events/site * 335 sites = ~41 GB (38 GiB).  The drive on moment with the MySQL database has 919 GB free.&lt;br /&gt;
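The arithmetic works out as sketched below; the 16 entries/event figure presumably corresponds to the database products listed earlier (RotD50 and RotD100 at 6 periods, plus 4 duration values), though that breakdown is an inference.

```python
BYTES_PER_ENTRY = 100
ENTRIES_PER_EVENT = 16   # inferred: 2 RotD measures x 6 periods + 4 durations
EVENTS_PER_SITE = 76786
SITES = 335

total_bytes = BYTES_PER_ENTRY * ENTRIES_PER_EVENT * EVENTS_PER_SITE * SITES
total_gib = total_bytes / 2**30  # ~38 GiB
```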
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*Restore improvements to codes since ERF 58, and rerun USC for ERF 62&lt;br /&gt;
*&amp;lt;s&amp;gt;Create prioritized site list.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold science readiness review.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using GO to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*Modify job sizes and runtimes.&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*Check cronjob on Summit for monitoring jobs.&lt;br /&gt;
*Call with OLCF staff&lt;br /&gt;
*Activate script for monitoring the X.509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use h/4 as default for surface point.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&amp;lt;/s&amp;gt;&lt;br /&gt;
*Hold technical readiness review.&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Erf_58_62_USC_compare.png&amp;diff=26262</id>
		<title>File:Erf 58 62 USC compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Erf_58_62_USC_compare.png&amp;diff=26262"/>
		<updated>2021-12-03T19:24:21Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Erf_58_61_USC_compare.png&amp;diff=26260</id>
		<title>File:Erf 58 61 USC compare.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Erf_58_61_USC_compare.png&amp;diff=26260"/>
		<updated>2021-12-03T19:23:42Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_s1069_r0_1_v_0.5hz_rotd.png&amp;diff=26259</id>
		<title>File:USC s1069 r0 1 v 0.5hz rotd.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_s1069_r0_1_v_0.5hz_rotd.png&amp;diff=26259"/>
		<updated>2021-12-03T19:21:58Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_s1069_r0_1_v_0.5hz.png&amp;diff=26258</id>
		<title>File:USC s1069 r0 1 v 0.5hz.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_s1069_r0_1_v_0.5hz.png&amp;diff=26258"/>
		<updated>2021-12-03T19:20:53Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF62_FIRST.png&amp;diff=26256</id>
		<title>File:USC curves 5s ERF62 FIRST.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF62_FIRST.png&amp;diff=26256"/>
		<updated>2021-12-03T19:18:30Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: Kmilner moved page File:USC curves 5s ERF62.png to File:USC curves 5s ERF62 FIRST.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF62.png&amp;diff=26257</id>
		<title>File:USC curves 5s ERF62.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF62.png&amp;diff=26257"/>
		<updated>2021-12-03T19:18:30Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: Kmilner moved page File:USC curves 5s ERF62.png to File:USC curves 5s ERF62 FIRST.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;#REDIRECT [[File:USC curves 5s ERF62 FIRST.png]]&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF62_FIRST.png&amp;diff=26255</id>
		<title>File:USC curves 5s ERF62 FIRST.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF62_FIRST.png&amp;diff=26255"/>
		<updated>2021-12-03T19:17:07Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF61.png&amp;diff=26254</id>
		<title>File:USC curves 5s ERF61.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_5s_ERF61.png&amp;diff=26254"/>
		<updated>2021-12-03T19:17:00Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF62_FIRST.png&amp;diff=26253</id>
		<title>File:USC curves 3s ERF62 FIRST.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF62_FIRST.png&amp;diff=26253"/>
		<updated>2021-12-03T19:07:04Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF61_PREV.png&amp;diff=26252</id>
		<title>File:USC curves 3s ERF61 PREV.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF61_PREV.png&amp;diff=26252"/>
		<updated>2021-12-03T19:06:35Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF61.png&amp;diff=26251</id>
		<title>File:USC curves 3s ERF61.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF61.png&amp;diff=26251"/>
		<updated>2021-12-03T19:06:23Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF58.png&amp;diff=26250</id>
		<title>File:USC curves 3s ERF58.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_curves_3s_ERF58.png&amp;diff=26250"/>
		<updated>2021-12-03T19:06:04Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26247</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26247"/>
		<updated>2021-12-03T19:01:06Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs, on Summit, then post-process them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity&lt;br /&gt;
*Compare spatial distribution of ground motions (including directivity) to empirical and kinematic models&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run 335 sites, taken from the site list used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs and select 5 km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
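The h/4 rule can be sketched as below; `query_ucvm` is a stand-in for a real UCVM query call and is purely illustrative.

```python
GRID_SPACING_M = 100.0  # h, the mesh grid spacing for this study

def surface_query_depth(h=GRID_SPACING_M):
    # The surface grid point represents material in roughly [0, h/2],
    # so we sample at h/4, the midpoint of that interval (25 m here).
    return h / 4.0

def populate_surface_point(lat, lon, query_ucvm):
    # query_ucvm(lat, lon, depth_m) -> material properties (illustrative
    # placeholder for the actual UCVM interface).
    return query_ucvm(lat, lon, surface_query_depth())
```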
&lt;br /&gt;
== Verification Tests ==&lt;br /&gt;
&lt;br /&gt;
TODO&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
*We modified the lambda and mu calculations in AWP to use the original media parameter values from the velocity mesh rather than the FP-modified ones when calculating strains, to be consistent with RWG.&lt;br /&gt;
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for a while, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 6000 timesteps (300 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on scaling from site USC (the average site has 3.8% more events and a 9.7% larger volume).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime (sec) !! DirectSynth nodes !! Summit Total (node-hrs)&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 || 36 || 10.8&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 || 36 || 11.2&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 3.4 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 3.5 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.2 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to 400 TB to avoid having to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.2 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 16 entries/event * 76786 events/site * 335 sites = ~41 GB (38 GiB).  The drive on moment with the MySQL database has 919 GB free.&lt;br /&gt;
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Confirm that ERF 62 test produces results which closely match ERF 61&amp;lt;/s&amp;gt;&lt;br /&gt;
*Restore improvements to codes since ERF 58, and rerun USC for ERF 62&lt;br /&gt;
*Create prioritized site list.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using GO to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*Modify job sizes and runtimes.&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*Check cronjob on Summit for monitoring jobs.&lt;br /&gt;
*Call with OLCF staff?&lt;br /&gt;
*Activate script for monitoring the X.509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
*Modify dax-generator to use h/4 as default for surface point.&lt;br /&gt;
*Modify dax-generator to use ERF62 parameter file for generating GMPE comparison curves.&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26243</id>
		<title>CyberShake Study 21.12</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_21.12&amp;diff=26243"/>
		<updated>2021-12-03T18:52:24Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake 21.12 is a computational study to use a new ERF with CyberShake, generated from an RSQSim catalog.  We plan to calculate results for 335 sites in Southern California using the RSQSim ERF, a minimum Vs of 500 m/s, and a frequency of 1 Hz.  We will use the CVM-S4.26.M01 model, and the GPU implementation of AWP-ODC-SGT enhanced from the BBP verification testing.  We will begin by generating all sets of SGTs, on Summit, then post-process them on a combination of Summit and Frontera.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study is in the pre-production phase.  Production is scheduled to begin in mid-December, 2021.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Calculate a regional CyberShake model using an alternative, RSQSim-derived ERF.&lt;br /&gt;
*Compare results from an RSQSim ERF to results using a UCERF2 ERF (Study 15.4).&lt;br /&gt;
*Quantify effects of source model non-ergodicity&lt;br /&gt;
*Compare spatial distribution of ground motions (including directivity) to empirical and kinematic models&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Perform a study using OLCF Summit as a key compute resource.&lt;br /&gt;
*Evaluate the performance of the new workflow submission host, shock-carc.&lt;br /&gt;
*Use Globus Online for staging of output data products.&lt;br /&gt;
&lt;br /&gt;
== ERF ==&lt;br /&gt;
&lt;br /&gt;
The ERF was generated from an RSQSim catalog, with the following parameters:&lt;br /&gt;
*715kyr catalog (the first 65k years of events were dropped, so that every fault's first event is excluded)&lt;br /&gt;
*220,927 earthquakes with M6.5+&lt;br /&gt;
*All events have equal probability, 1/715k&lt;br /&gt;
&lt;br /&gt;
Additional details are available on the [http://opensha.usc.edu/ftp/kmilner/markdown/rsqsim-analysis/catalogs/rundir4983_stitched/#bruce-4983-stitched catalog's metadata page]. This is the catalog used in [https://pubs.geoscienceworld.org/ssa/bssa/article/111/2/898/593757/Toward-Physics-Based-Nonergodic-PSHA-A-Prototype Milner et al., 2021], which used 0.5 Hz CyberShake simulations performed in May, 2020.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
We will run 335 sites, taken from the site list used in other Southern California studies. The order of execution will be:&lt;br /&gt;
&lt;br /&gt;
*10 sites used in Milner et al. (2021), each with top mesh point Vs at the 500 m/s floor: USC, SMCA, OSI, WSS, SBSM, LAF, s022, STNI, WNGC, PDE&lt;br /&gt;
*PAS hard rock site&lt;br /&gt;
*20 km site grid&lt;br /&gt;
*10 km site grid&lt;br /&gt;
*Remaining POIs and select 5 km grid sites also used in Study 15.4&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_21.12_site_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_21.12_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_21.12_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_21.12_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will use CVM-S4.26.M01.&lt;br /&gt;
&lt;br /&gt;
To better represent the near-surface layer, we will populate the velocity parameters for the surface point by querying the velocity model at a depth of (grid spacing)/4.  For this study, the grid spacing is 100m, so we will query UCVM at a depth of 25m and use that value to populate the surface grid point.  The rationale is that the media parameters at the surface grid point are supposed to represent the material properties for [0, 50m], and this is better represented by using the value at 25m than the value at 0m.&lt;br /&gt;
&lt;br /&gt;
== Technical and Scientific Updates ==&lt;br /&gt;
&lt;br /&gt;
Since our last study we have made a number of scientific updates to the platform, many as a result of the BBP verification effort.&lt;br /&gt;
&lt;br /&gt;
*Several bugs were found and fixed in the AWP code.&lt;br /&gt;
*We have switched from stress insertion to velocity insertion of the impulse when generating SGTs.&lt;br /&gt;
*The sponge zone used in the absorbing boundary condition was increased from 50 to 80 points.&lt;br /&gt;
*By default, we use a depth of h/4 when querying UCVM to populate the surface grid point.&lt;br /&gt;
*The padding between the nearest fault or site and the edge of the volume was increased from 30 to 50 km.&lt;br /&gt;
*We fixed a bug in the coordinate conversion between RWG and AWP: previously we were adding 1 to the RWG z-coordinate to produce the AWP z-coordinate, but both codes use z=1 to represent the surface and therefore no increment should be applied.&lt;br /&gt;
*When calculating Qs in the SGT header generation code, a default Qs of 25 was always used.  This has been changed to Qs=0.05Vs.&lt;br /&gt;
*We have turned off the adjustment of mu and lambda.&lt;br /&gt;
*FP was increased from 0.5 to 1.0.&lt;br /&gt;
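The RWG-to-AWP z-coordinate fix above amounts to dropping an off-by-one increment; a small sketch (function names are illustrative only):

```python
def rwg_to_awp_z_old(rwg_z):
    # Old (buggy) conversion: added 1 to the RWG z-coordinate,
    # shifting the surface (z=1 in both codes) to z=2 in AWP.
    return rwg_z + 1

def rwg_to_awp_z(rwg_z):
    # Fixed conversion: both codes use z=1 for the surface,
    # so no increment is applied.
    return rwg_z

print(rwg_to_awp_z(1))  # 1: surface maps to surface
```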
&lt;br /&gt;
=== Study 18.8 Lessons Learned ===&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider separating SGT and PP workflows in auto-submit tool to better manage the number of each, for improved reservation utilization.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create a read-only way to look at the CyberShake Run Manager website.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Consider reducing levels of the workflow hierarchy, thereby reducing load on shock.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which require fewer GPUs.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Determine advance plan for SGTs for sites which exceed memory on nodes.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Create new velocity model ID for composite model, capturing metadata.&amp;lt;/i&amp;gt;&lt;br /&gt;
We modified the database to enable composite models, but for this study we are just using a single model.&lt;br /&gt;
*&amp;lt;i&amp;gt;Verify all Java processes grab a reasonable amount of memory.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Clear disk space before study begins to avoid disk contention.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;Add stress test before beginning study, for multiple sites at a time, with cleanup.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;In addition to disk space, check local inode usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
Only 1% of the inodes are used on shock-carc; we will assume /project has sufficient inodes, as we can't check them.&lt;br /&gt;
*&amp;lt;i&amp;gt;Establish clear rules and policies about reservation usage.&amp;lt;/i&amp;gt;&lt;br /&gt;
*&amp;lt;i&amp;gt;If submitting to multiple reservations, make sure enough jobs are eligible to run that no reservation is starved.&amp;lt;/i&amp;gt;&lt;br /&gt;
We are not planning to run this study with reservations.&lt;br /&gt;
*&amp;lt;i&amp;gt;If running primarily SGTs for a while, make sure they don't get deleted due to quota policies.&amp;lt;/i&amp;gt;&lt;br /&gt;
We will stage the SGTs to HPSS if there is a delay in post-processing them.  Summit has a 90-day purge policy, so we will have some time.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Seismograms: 2-component seismograms, 6000 timesteps (300 sec) each&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;PSA: X and Y spectral acceleration at 44 periods (10, 9.5, 9, 8.5, 8, 7.5, 7, 6.5, 6, 5.5, 5, 4.8, 4.6, 4.4, 4.2, 4, 3.8, 3.6, 3.4, 3.2, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.66667, 1.42857, 1.25, 1.11111, 1, .66667, .5, .4, .33333, .285714, .25, .22222, .2, .16667, .142857, .125, .11111, .1 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50, the RotD50 azimuth, and RotD100 at 22 periods (1.0, 1.2, 1.4, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.6, 2.8, 3.0, 3.5, 4.0, 4.4, 5.0, 5.5, 6.0, 6.5, 7.5, 8.5, 10.0 sec)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
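As a rough guide to per-event data volume, and assuming single-precision (4-byte) samples, which is an assumption on our part rather than a format stated above, each seismogram works out to:

```python
# Rough per-event seismogram size.  The 4-byte sample size is an
# assumption (float32), not a format specified on this page.
components = 2
timesteps = 6000        # 300 sec of 2-component data
bytes_per_sample = 4    # assumed single-precision floats

seismogram_bytes = components * timesteps * bytes_per_sample
print(seismogram_bytes)  # 48000 bytes (~47 KB) per event, before any headers
```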
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ul&amp;gt; &lt;br /&gt;
&amp;lt;li&amp;gt;PSA: none&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;RotD: RotD50 and RotD100 at 10, 7.5, 5, 4, 3, and 2 sec.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Durations: acceleration 5-75% and 5-95% for X and Y components&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on scaling from site USC (the average site has 3.8% more events and a 9.7% larger volume).&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime !! SGT nodes !! Other SGT workflow jobs !! Summit Total&lt;br /&gt;
|-&lt;br /&gt;
! USC&lt;br /&gt;
| 372 sec || 80 || 2628 sec || 67 || 1510 node-sec || 106.5 node-hrs&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 408 sec || 80 || 2883 sec || 67 || 1550 node-sec || 116.8 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives us an estimate of 43k node-hours for SGT calculation.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime (sec) !! DirectSynth nodes !! Summit Total (node-hrs)&lt;br /&gt;
|-&lt;br /&gt;
! USC &lt;br /&gt;
| 1081 || 36 || 10.8&lt;br /&gt;
|-&lt;br /&gt;
! Average (est) &lt;br /&gt;
| 1122 || 36 || 11.2&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Adding 10% overrun margin gives an estimate of 4.2k node-hours for post-processing.&lt;br /&gt;
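The 43k and 4.2k node-hour figures above follow directly from the per-site averages in the two tables; a quick check:

```python
sites = 335
margin = 1.10  # 10% overrun margin

# Average per-site costs from the tables above, in node-hours
sgt_per_site = 116.8   # SGT workflow (UCVM + SGT + other jobs)
pp_per_site = 11.2     # post-processing (DirectSynth)

sgt_total = sgt_per_site * sites * margin
pp_total = pp_per_site * sites * margin

print(round(sgt_total))  # ~43,000 node-hours for SGT calculation
print(round(pp_total))   # ~4,100 node-hours for post-processing
```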
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
==== Summit ====&lt;br /&gt;
&lt;br /&gt;
{| class='wikitable'&lt;br /&gt;
|+Data estimates &lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! Output data&lt;br /&gt;
|-&lt;br /&gt;
| USC || 243 GB || 196 GB || 439 GB || 3.4 GB&lt;br /&gt;
|-&lt;br /&gt;
| Average || 267 GB || 203 GB || 470 GB || 3.5 GB&lt;br /&gt;
|-&lt;br /&gt;
! Total !! 87 TB !! 66 TB !! 153 TB !! 1.2 TB&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
This is a total of 307 TB, which we could reach if we calculate all the SGTs first.  The default quota on Summit is 50 TB, so I suggest we request a quota increase to 400 TB so we don't need to rely on cleanup.&lt;br /&gt;
&lt;br /&gt;
If we need to keep the SGTs for a while before performing post-processing, the quota on HPSS is 100 TB, so we could store them there.&lt;br /&gt;
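The study-wide totals in the table follow from the per-site averages when 1 TB is read as 1024 GB (the convention that best matches the stated numbers):

```python
sites = 335

# Per-site averages from the table above, in GB
per_site_gb = {"mesh": 267, "sgts": 203, "temp": 470, "output": 3.5}

# Study-wide totals in TB, taking 1 TB = 1024 GB
totals_tb = {name: gb * sites / 1024.0 for name, gb in per_site_gb.items()}
grand_total_tb = sum(totals_tb.values())

# Each row lands within rounding of the table's 87 / 66 / 153 / 1.2 TB,
# and the grand total is ~308 TB, consistent with the ~307 TB cited above.
print({name: round(tb, 1) for name, tb in totals_tb.items()})
print(round(grand_total_tb, 1))
```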
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 1.2 TB in output data, which will be transferred back to CARC.&lt;br /&gt;
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
The study should use approximately 200 GB in workflow log space on /home/shock.  This drive has approximately 1.7 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 100 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
100 bytes/entry * 16 entries/event * 76786 events/site * 335 sites = 38 GB.  The drive on moment with the mysql database has 919 GB free.&lt;br /&gt;
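The 38 GB figure works out when GB is read as GiB (2^30 bytes); the 16 entries per event match the database products listed above (12 RotD values plus 4 duration values):

```python
# Database size estimate, using the numbers from the text above
bytes_per_entry = 100
entries_per_event = 16   # 12 RotD values + 4 duration values per event
events_per_site = 76786
sites = 335

total_bytes = bytes_per_entry * entries_per_event * events_per_site * sites
total_gib = total_bytes / 2**30
print(round(total_gib))  # 38 -> matches the 38 GB estimate, well under 919 GB free
```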
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Science:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Confirm that ERF 62 test produces results which closely match ERF 61&lt;br /&gt;
*Restore improvements to codes since ERF 58, and rerun USC for ERF 62&lt;br /&gt;
*Create prioritized site list.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Technical:&amp;lt;/b&amp;gt;&lt;br /&gt;
*Approach OLCF for the following requests:&lt;br /&gt;
**Quota increase to 400 TB&lt;br /&gt;
**8 jobs ready to run&lt;br /&gt;
**5 jobs in bin 5.&lt;br /&gt;
*To be able to bundle jobs, fix issue with Summit glideins.&lt;br /&gt;
*To run post-processing, resolve issues using GO to transfer data back to /project at CARC.&lt;br /&gt;
*Tag code&lt;br /&gt;
*Modify job sizes and runtimes.&lt;br /&gt;
*Test auto-submit script.&lt;br /&gt;
*Prepare pending file.&lt;br /&gt;
*Create XML file describing study for web monitoring tool.&lt;br /&gt;
*Get usage stats for Summit.&lt;br /&gt;
*Check cronjob on Summit for monitoring jobs.&lt;br /&gt;
*Call with OLCF staff?&lt;br /&gt;
*Activate script for monitoring x509 certificate.&lt;br /&gt;
*Modify workflows to not insert or calculate curves for PSA data.&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Wills_Map&amp;diff=24104</id>
		<title>Wills Map</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Wills_Map&amp;diff=24104"/>
		<updated>2020-03-31T20:31:14Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Description of Wills Digital Map */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Wills References ==&lt;br /&gt;
&lt;br /&gt;
* Wills, C.J., Gutierrez, C., Perez, A., and Branum, D. (2015). A Next Generation VS30 Map for California Based on Geology and Topography. Bulletin of the Seismological Society of America, 105, doi:10.1785/0120150105.&lt;br /&gt;
*[https://www.researchgate.net/publication/283670069_A_Next_Generation_V_S_30_Map_for_California_Based_on_Geology_and_Topograpy Wills 2015]&lt;br /&gt;
* Wills, C.J., and Clahan, K.B. (2006). Developing a map of geologically defined site-condition categories for California. Bulletin of the Seismological Society of America, 96, 1483–1501.&lt;br /&gt;
&lt;br /&gt;
== grd2etree ==&lt;br /&gt;
&lt;br /&gt;
Background information from:&lt;br /&gt;
&lt;br /&gt;
  https://scec.usc.edu/scecpedia/UCVM_Utah&lt;br /&gt;
  https://github.com/SCECcode/UCVMC/wiki/Manual-Pages&lt;br /&gt;
&lt;br /&gt;
Directory where we generated the ucvm.e that we are currently using:&lt;br /&gt;
&lt;br /&gt;
  /home/scec-00/patrices/opt/vs30&lt;br /&gt;
&lt;br /&gt;
Created a configuration file, sample.conf:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# grd2etree UCVM map config file&lt;br /&gt;
&lt;br /&gt;
# Domain corners coordinates (degrees): &lt;br /&gt;
proj=+proj=aeqd +lat_0=36.0 +lon_0=-120.0 +x_0=0.0 +y_0=0.0 &lt;br /&gt;
lon_0=-129.75 &lt;br /&gt;
lat_0=40.75 &lt;br /&gt;
rot=55.0&lt;br /&gt;
&lt;br /&gt;
# Domain dimensions (meters): &lt;br /&gt;
x-size=1800000.0 &lt;br /&gt;
y-size=900000.0&lt;br /&gt;
&lt;br /&gt;
# Spacing &lt;br /&gt;
spacing=250.0&lt;br /&gt;
&lt;br /&gt;
# Etree parameters and info &lt;br /&gt;
title=UCVM_Elev_Vs30_Map_Wills_Wald&lt;br /&gt;
author=P_Small &lt;br /&gt;
date=05/2011 &lt;br /&gt;
outputfile=ucvm.e&lt;br /&gt;
&lt;br /&gt;
# Grid data directories &lt;br /&gt;
elev_hr_dir=/home/scec-00/patrices/opt/ned&lt;br /&gt;
elev_lr_dir=/home/scec-00/patrices/opt/bath&lt;br /&gt;
vs30_hr_dir=/home/scec-00/patrices/opt/vs30/wills_gridfloat&lt;br /&gt;
vs30_lr_dir=/home/scec-00/patrices/opt/vs30/wald_gridfloat&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
NOTE: the newly recreated ucvm.e is 1.2 GB, while UCVMC's stock ucvm.e is 683 MB.&lt;br /&gt;
&lt;br /&gt;
Command used to generate ucvm.e:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./grd2etree -f sample.conf&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== new ucvm.e ==&lt;br /&gt;
&lt;br /&gt;
A new ucvm.e is created by replacing the vs30_hr_dir data with the rasterized data file from Kevin and updating the .hdr file to match what grd2etree expects:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#BYTEORDER      I&lt;br /&gt;
LAYOUT         BIL&lt;br /&gt;
nrows          37900&lt;br /&gt;
ncols          41104&lt;br /&gt;
NBANDS         1&lt;br /&gt;
NBITS          32&lt;br /&gt;
BANDROWBYTES   164416&lt;br /&gt;
TOTALROWBYTES  164416&lt;br /&gt;
PIXELTYPE      FLOAT&lt;br /&gt;
&lt;br /&gt;
#ULXMAP         -124.406528862058&lt;br /&gt;
#ULYMAP         42.0090923024621&lt;br /&gt;
xllcorner         -124.406528862058&lt;br /&gt;
yllcorner         32.534092&lt;br /&gt;
cellsize          0.00025&lt;br /&gt;
NODATA_value  -9999&lt;br /&gt;
byteorder     LSBFIRST&lt;br /&gt;
XDIM           0.00025&lt;br /&gt;
YDIM           0.00025&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Verify using CCA+GTL ==&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
! With fixed scale &lt;br /&gt;
|-&lt;br /&gt;
| [[File:cca_noGTL_stock.png|400px|thumb|no GTL]]&lt;br /&gt;
|-&lt;br /&gt;
! With fixed scalebar&lt;br /&gt;
|-&lt;br /&gt;
| [[File:cca_GTL_stock_s.png|400px|thumb|GTL, with Original ucvm.e]]&lt;br /&gt;
| [[File:cca_GTL_old_s.png|400px|thumb|GTL, with recreated ucvm.e]]&lt;br /&gt;
| [[File:cca_GTL_new_s.png|400px|thumb|GTL, with ucvm.e with Wills 2015]]&lt;br /&gt;
|-&lt;br /&gt;
! With variable scalebar&lt;br /&gt;
|-&lt;br /&gt;
| [[File:cca_GTL_stock.png|400px|thumb|GTL, with Original ucvm.e]]&lt;br /&gt;
| [[File:cca_GTL_old.png|400px|thumb|GTL, with recreated ucvm.e]]&lt;br /&gt;
| [[File:cca_GTL_new.png|400px|thumb|GTL, with ucvm.e with Wills 2015]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Command used to generate the plots:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./plot_horizontal_slice.py -b 33,-123 -u 39.5,-115 -d vs -c cca -a sd -s 0.05 -e 0 -o cca.png&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Verify using vs30 etree map ==&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|-&lt;br /&gt;
! With variable scale &lt;br /&gt;
|-&lt;br /&gt;
| [[File:cca_vs30_etree_old.png|400px|thumb|GTL, with recreated ucvm.e]]&lt;br /&gt;
| [[File:cca_vs30_etree_new.png|400px|thumb|GTL, with ucvm.e with Wills 2015]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Command used to generate the plots:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./plot_vs30_etree_map.py -b 33.5,-119 -u 34.5,-117 -s 0.0005 -c cca -a sd_r -o cca_vs30_etree_new.png&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Comparison of Wills Map 2006 versus 2015 ==&lt;br /&gt;
These two Wills maps were made from GIS files released by the state.&lt;br /&gt;
{|&lt;br /&gt;
| [[File:wills_2006.png|thumb|300px|Wills Map 2006]]&lt;br /&gt;
| [[File:wills_2015.png|thumb|300px|Wills Map 2015]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
These two maps are extracted via UCVM. The etree Vs30 values are based on Wills 2006, so we expect to see the data from above in these maps.&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Cvmh_nogtl_vs30_etree_map.png|thumb|300px|Vs30 Map based on UCVM Etree for 2006]]&lt;br /&gt;
| [[File:wills_2015.png|thumb|300px|Wills Map 2015]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Description of Wills Digital Map ==&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The supplementary data of the Wills et al. (2015) paper provides the map as a GIS shapefile (http://www.seismosoc.org/Publications/BSSA_html/bssa_105-6/2015105-esupp/index.html). I then rasterized it to a binary floating point file (which is similar to the 2006 file, except with floats instead of short ints) using the gdal_rasterize command (http://www.gdal.org/gdal_rasterize.html). The command to rasterize it is:&lt;br /&gt;
&lt;br /&gt;
gdal_rasterize -a Vs30_Mean -tr $RES $RES -l wills_2015_vs30_projected -of EHdr -ot Float32 ./wills_2015_vs30_projected.shp ./raster_$RES.flt&lt;br /&gt;
&lt;br /&gt;
Where $RES is the grid spacing in degrees.&lt;br /&gt;
&lt;br /&gt;
The original and rasterized data is stored here on opensha.usc.edu:&lt;br /&gt;
&lt;br /&gt;
/export/opensha-00/data/siteData/wills_2015&lt;br /&gt;
&lt;br /&gt;
I suggest you use raster_0.00025.flt, which is rasterized with 0.00025 degree spacing (~25 meters). The raster_0.00025.hdr file contains the necessary coordinates for georeferencing it, but I'll summarize here:&lt;br /&gt;
&lt;br /&gt;
The first location in the file is at:&lt;br /&gt;
&lt;br /&gt;
x/longitude/ULXMAP = -124.406528862058&lt;br /&gt;
y/latitude/ULYMAP = 42.0090923024621&lt;br /&gt;
&lt;br /&gt;
This is the upper left (northwest) point in the map.&lt;br /&gt;
&lt;br /&gt;
numX/numLon/NCOLS = 41104&lt;br /&gt;
numY/numLat/NROWS = 37900&lt;br /&gt;
&lt;br /&gt;
Grid spacing (both X and Y): 0.00025 degrees&lt;br /&gt;
&lt;br /&gt;
Data type: 4-byte/32-bit little-endian floating point numbers&lt;br /&gt;
&lt;br /&gt;
Mesh ordering: fast XY, so the first few points are:&lt;br /&gt;
&lt;br /&gt;
    Index 0 (at byte 0): -124.406528862058, 42.0090923024621&lt;br /&gt;
    Index 1 (at byte 4): -124.406528862058+0.00025, 42.0090923024621 = -124.406278862058, 42.0090923024621&lt;br /&gt;
    Index 2 (at byte 8): -124.406528862058+0.0005, 42.0090923024621 = -124.406028862058, 42.0090923024621&lt;br /&gt;
    ...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
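The georeferencing above can be sketched as a small helper; this is our own illustration built from the stated header values, not code from the original page:

```python
import struct

# Header values from raster_0.00025.hdr, as summarized above
ULX, ULY = -124.406528862058, 42.0090923024621  # upper-left (NW) corner
DIM = 0.00025                                   # grid spacing, degrees
NCOLS, NROWS = 41104, 37900

def index_to_lonlat(i):
    """Fast-XY ordering: longitude increases fastest along a row;
    each successive row steps south from the upper-left corner."""
    row, col = divmod(i, NCOLS)
    return ULX + col * DIM, ULY - row * DIM

def read_vs30(f, i):
    """Read the 4-byte little-endian float at flat index i from an
    open raster file object (e.g. raster_0.00025.flt)."""
    f.seek(4 * i)
    return struct.unpack("<f", f.read(4))[0]

lon, lat = index_to_lonlat(1)
print(lon, lat)  # ~ -124.406278862058, 42.0090923024621 (index 1 above)
```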
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[CME Project]]&lt;br /&gt;
*[[CyberShake]]&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_Restart_and_Processing_Instructions&amp;diff=23076</id>
		<title>RSQSim Restart and Processing Instructions</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_Restart_and_Processing_Instructions&amp;diff=23076"/>
		<updated>2019-07-09T01:29:21Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Restarting an RSQSim simulation in order to extend a catalog from where it ended */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''&lt;br /&gt;
== Restarting an RSQSim simulation in order to extend a catalog from where it ended ==&lt;br /&gt;
'''&lt;br /&gt;
&lt;br /&gt;
Note: You'll need some of the R scripts from RSQSimPostProcess. All commands below are R commands.&lt;br /&gt;
&lt;br /&gt;
1.  Create a new directory for the simulation you are setting up, and copy over the input file, pbs script, fault file (e.g. *.flt, or zfault_Deepen.in), *.KZero, and *.neighbors files. For this example, we'll call the previous catalog &amp;quot;cat1&amp;quot; and the extension &amp;quot;cat2&amp;quot;&lt;br /&gt;
&lt;br /&gt;
2. Rename cat1.in and cat1.pbs to match the new catalog name (cat2.in and cat2.pbs).&lt;br /&gt;
&lt;br /&gt;
3. Update the catalog name and input file name in the new cat2.pbs, and the outNameInfix in the new cat2.in files.&lt;br /&gt;
&lt;br /&gt;
4. Read in catalog 1:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;/path/to/eqs.cat1.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. Create the new input files for shear stress, normal stress, and theta from the end of catalog 1 and fill those in for the variables initTauFname, initSigmaFname, and initThetaFname in your cat2.in file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tmp = mkInitTauSigmaThetaSlipSpeed(eqs, Inf, writeTau = &amp;quot;final&amp;quot;, writeSlipSpeed = 0,   &lt;br /&gt;
		                   initTauFile=&amp;quot;cat2.initTau&amp;quot;,&lt;br /&gt;
		                   initSigmaFile=&amp;quot;cat2.initSigma&amp;quot;,&lt;br /&gt;
	                           initThetaFile = &amp;quot;cat2.initTheta&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* If the final snapshots (.out.final files) weren’t written out for cat1, but snapshots were written out during the simulations, you can change writeTau = &amp;quot;final&amp;quot; to writeTau = 2 in order to use those snapshots (out.2 files). If no snapshots were written, then you’re out of luck and can’t restart that simulation.&lt;br /&gt;
&lt;br /&gt;
6. Get the start time for cat2 and copy the whole number (all 22 digits) into the variable tStart in the new cat2.in file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tStart = format(tmp$t, digits=22) 	 &lt;br /&gt;
&amp;lt;/pre&amp;gt;					&lt;br /&gt;
7.  Read or create a list of pinned patches using the variable pin:&lt;br /&gt;
&lt;br /&gt;
If there is a pin file for cat1:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
pin = scan(&amp;quot;../cat1.pin&amp;quot;) &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Otherwise, the pinned file is just a list of 0's (not pinned) or 1's (pinned) for each patch in the fault model, so if there isn't one, you can just create one of 0's:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
pin = rep(0, eqs$fault$np)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
8. Update pin from the list of patches that got locked in cat1 and write out the new pin file for cat2:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
system(&amp;quot;grep Eliminating ../*e | awk '{print $4}' &amp;gt; ../cat1.locked&amp;quot;)&lt;br /&gt;
pin2 = scan(file=&amp;quot;../cat1.locked&amp;quot;)			&lt;br /&gt;
pin[pin2] = 1						&lt;br /&gt;
write(pin, file=&amp;quot;cat2.pin&amp;quot;, ncol=1)				&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
9. Update the pinnedFname in your cat2.in file to cat2.pin&lt;br /&gt;
&lt;br /&gt;
10. Update maxT in your cat2.in file to the length you want the extended catalog to be (in seconds).&lt;br /&gt;
&lt;br /&gt;
=='''Combining extended/restarted catalogs into one long catalog''' ==&lt;br /&gt;
&lt;br /&gt;
Note: You'll need some of the R scripts from RSQSimPostProcess &lt;br /&gt;
&lt;br /&gt;
1. Set up a list to tell R to only load the files you need (eList, pList, dList, and tList):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
L = c(&amp;quot;e&amp;quot;, &amp;quot;p&amp;quot;, &amp;quot;d&amp;quot;, &amp;quot;t&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;		&lt;br /&gt;
2. Set up the list of catalogs to combine (in the order that they ran):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqFiles = c(&amp;quot;home/user/name/catalog/cat1/eqs.cat1.out&amp;quot;,&lt;br /&gt;
	    &amp;quot;home/user/name/catalog/cat2/eqs.cat2.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Read and combine them into one catalog:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readAndCombineEqfiles(eqFiles, returnLists=L)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Write out the new eqs.out, and List files:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
writeEqsAndLists(eqs, outFnameInfix = paste(&amp;quot;combinedCat&amp;quot;, sep = &amp;quot;&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: Reading in an eqs.RData file is faster, so it’s helpful to save that version too. The whole combined ‘eqs’ data structure will be stored in the combinedCat.RData file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
save(eqs, file = &amp;quot;combinedCat.RData&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Final Note: Use the R function load() to read the combinedCat.RData file back in (only use readEqs.R on the eqs.out files). &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
load(&amp;quot;combinedCat.RData&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. If you also saved and need to combine your transitions file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
transFiles = c(“home/user/name/catalog/cat1/trans.cat1.out&amp;quot;,&lt;br /&gt;
	       &amp;quot;home/user/name/catalog/cat2/trans.cat2.out&amp;quot;)&lt;br /&gt;
&lt;br /&gt;
combineTransFiles(transfiles=transFiles, outtransfile = &amp;quot;trans.combinedCat.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''Filtering a catalog by magnitude and saving a new filtered catalog''' ==&lt;br /&gt;
&lt;br /&gt;
1. Read the full catalog in (or load the eqs.RData file):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;eqs.cat.out&amp;quot;)			&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Pick a minimum magnitude muse for your new catalog (you'll keep events with M &amp;gt;= muse):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
muse = 7 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Get a list of the events with M &amp;gt;= muse:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
euse = which(eqs$M &amp;gt;= muse)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Subset/filter the catalog by euse (set renumberEvents=FALSE if you want to keep the original event ID's):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqsNew = subsetEqs(eqs,euse, renumberEvents=FALSE)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. Save the new filtered catalog:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
subName = &amp;quot;new_catalog_M7&amp;quot;				&lt;br /&gt;
writeEqsAndLists(eqsNew, outFnameInfix = paste(subName, sep = &amp;quot;&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''Filtering a catalog by event time and removing events in the simulation runup''' ==&lt;br /&gt;
Note: To save the filtered catalog, see above instructions.&lt;br /&gt;
&lt;br /&gt;
1. Read the full catalog in (if you have already done this in your current R session, you don’t need to read it in again):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;eqs.cat.out&amp;quot;)		&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Check the range of event times in the catalog (in catalog years):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
range(eqs$t0yr)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Choose the minimum and maximum catalog times you’re interested in:&lt;br /&gt;
&lt;br /&gt;
Note: The simulation runup time depends on the slip rates of your faults (you want each one to rupture at least once). If you are running a simulation with the UCERF3 California fault model and you’re interested in the San Andreas Fault, a reasonable runup time is 5,000 years.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tmin = 5000&lt;br /&gt;
tmax = max(eqs$t0yr)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Get a list of all events in your specified range (in this example, after the runup time):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
inRange = which(eqs$t0yr &amp;gt; tmin &amp;amp; eqs$t0yr &amp;lt;tmax)	&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: You can combine the time and magnitude filtering into a single which function:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Mmin = 7&lt;br /&gt;
Mmax = 8&lt;br /&gt;
&lt;br /&gt;
inRange = which(eqs$t0yr &amp;gt; tmin &amp;amp; eqs$t0yr &amp;lt;tmax &amp;amp; eqs$M &amp;gt;= Mmin &amp;amp; eqs$M &amp;lt;=Mmax)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=='''Filtering a catalog by which fault(s) ruptured'''==&lt;br /&gt;
&lt;br /&gt;
1. Read the full catalog in (if you have already done this in your current R session, you don’t need to read it in again):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;eqs.cat.out&amp;quot;)		&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Make a list of the fault segments you are interested in:&lt;br /&gt;
&lt;br /&gt;
Note: If you are using the UCERF3 fault model, you can pass eqs$fault$faultName to the function unique() to get a list of the options. &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
faults = which(eqs$fault$faultName %in% c(&amp;quot;SanAndreas(MojaveN)&amp;quot;, &amp;quot;SanAndreas(MojaveS)&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
See the next instruction set (Plotting the fault model in plotFault3d) to visually check that you’ve selected the correct fault segments.&lt;br /&gt;
&lt;br /&gt;
3. Get a list of all of the events that rupture patches of those fault segments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
euse = unique(eqs$eList[which(eqs$pList %in% faults)])	&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: You can easily filter the catalog further by the range of events in the previous instruction set:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
use = euse[which(euse %in% inRange)]	&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Plotting the fault model in plotFault3d''' ==&lt;br /&gt;
&lt;br /&gt;
1. Go through steps 1 and 2 in the previous instruction set to get the list of fault patches you want to work with.	&lt;br /&gt;
&lt;br /&gt;
2. Initialize a variable for the plot colors: &lt;br /&gt;
&lt;br /&gt;
Note: This will create a list of NA values (which will show up as clear when you plot it) whose length is the number of fault patches in the model.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
col = rep(NA, eqs$fault$np)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Set the color of the fault patches that match segments you selected:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
col[faults]= &amp;quot;red&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Plot the fault model, colored by the color variable you created:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
plotFault3d(eqs$fault, col = col, lwd = 0.5)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: The argument lwd sets the line width. To see the additional arguments you can pass to plotFault3d() use the args() function:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
args(plotFault3d)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_Restart_and_Processing_Instructions&amp;diff=23075</id>
		<title>RSQSim Restart and Processing Instructions</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_Restart_and_Processing_Instructions&amp;diff=23075"/>
		<updated>2019-07-09T01:28:30Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Restarting an RSQSim simulation in order to extend a catalog from where it ended */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''&lt;br /&gt;
== Restarting an RSQSim simulation in order to extend a catalog from where it ended ==&lt;br /&gt;
'''&lt;br /&gt;
&lt;br /&gt;
Note: You'll need some of the R scripts from RSQSimPostProcess&lt;br /&gt;
&lt;br /&gt;
1.  Create a new directory for the simulation you are setting up, and copy over the input file, pbs script, fault file (e.g. *.flt, or zfault_Deepen.in), *.KZero, and *.neighbors files. For this example, we'll call the previous catalog &amp;quot;cat1&amp;quot; and the extension &amp;quot;cat2&amp;quot;&lt;br /&gt;
&lt;br /&gt;
2. Rename cat1.in and cat1.pbs to match the new catalog name (cat2.in and cat2.pbs).&lt;br /&gt;
&lt;br /&gt;
3. Update the catalog name and input file name in the new cat2.pbs, and the outNameInfix in the new cat2.in files.&lt;br /&gt;
&lt;br /&gt;
4. Read in catalog 1:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;/path/to/eqs.cat1.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. Create the new input files for shear stress, normal stress, and theta from the end of catalog 1 and fill those in for the variables initTauFname, initSigmaFname, and initThetaFname in your cat2.in file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tmp = mkInitTauSigmaThetaSlipSpeed(eqs, Inf, writeTau = &amp;quot;final&amp;quot;, writeSlipSpeed = 0,   &lt;br /&gt;
		                   initTauFile=&amp;quot;cat2.initTau&amp;quot;,&lt;br /&gt;
		                   initSigmaFile=&amp;quot;cat2.initSigma&amp;quot;,&lt;br /&gt;
	                           initThetaFile = &amp;quot;cat2.initTheta&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* If the final snapshots (.out.final files) weren’t written out for cat1, but snapshots were written out during the simulations, you can change writeTau = &amp;quot;final&amp;quot; to writeTau = 2 in order to use those snapshots (out.2 files). If no snapshots were written, then you’re out of luck and can’t restart that simulation.&lt;br /&gt;
&lt;br /&gt;
6. Get the start time for cat2 and copy the whole number (all 22 digits) into the variable tStart in the new cat2.in file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tStart = format(tmp$t, digits=22) 	 &lt;br /&gt;
&amp;lt;/pre&amp;gt;					&lt;br /&gt;
7.  Read or create a list of pinned patches using the variable pin:&lt;br /&gt;
&lt;br /&gt;
If there is a pin file for cat1:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
pin = scan(&amp;quot;../cat1.pin&amp;quot;) &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Otherwise, the pin file is just a list of 0's (not pinned) or 1's (pinned), one per patch in the fault model, so if there isn't one, you can just create one of all 0's:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
pin = rep(0, eqs$fault$np)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
8. Update pin from the list of patches that got locked in cat1 and write out the new pin file for cat2:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
system(&amp;quot;grep Eliminating ../*e | awk '{print $4}' &amp;gt; ../cat1.locked&amp;quot;)&lt;br /&gt;
pin2 = scan(file=&amp;quot;../cat1.locked&amp;quot;)			&lt;br /&gt;
pin[pin2] = 1						&lt;br /&gt;
write(pin, file=&amp;quot;cat2.pin&amp;quot;, ncol=1)				&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
9. Update the pinnedFname in your cat2.in file to cat2.pin&lt;br /&gt;
&lt;br /&gt;
10. Update maxT in your cat2.in file to the length you want the extended catalog to be (in seconds).&lt;br /&gt;
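Since maxT is specified in seconds, the conversion from a target catalog length in years can be sketched in R (the 500,000-year target and the 365.25-day year here are illustrative assumptions, not values from the original instructions):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# illustrative: convert a desired catalog length in years to seconds for maxT&lt;br /&gt;
maxTyears = 500000&lt;br /&gt;
maxT = maxTyears * 365.25 * 24 * 3600&lt;br /&gt;
format(maxT, digits=22)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;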
&lt;br /&gt;
=='''Combining extended/restarted catalogs into one long catalog''' ==&lt;br /&gt;
&lt;br /&gt;
Note: You'll need some of the R scripts from RSQSimPostProcess &lt;br /&gt;
&lt;br /&gt;
1. Set up a list to tell R to only load the files you need (eList, pList, dList, and tList):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
L = c(&amp;quot;e&amp;quot;, &amp;quot;p&amp;quot;, &amp;quot;d&amp;quot;, &amp;quot;t&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;		&lt;br /&gt;
2. Set up the list of catalogs to combine (in the order that they ran):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqFiles = c(&amp;quot;/home/user/name/catalog/cat1/eqs.cat1.out&amp;quot;,&lt;br /&gt;
	    &amp;quot;/home/user/name/catalog/cat2/eqs.cat2.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Read and combine them into one catalog:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readAndCombineEqfiles(eqFiles, returnLists=L)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Write out the new eqs.out, and List files:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
writeEqsAndLists(eqs, outFnameInfix = paste(&amp;quot;combinedCat&amp;quot;, sep = &amp;quot;&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: Reading in an eqs.RData file is faster, so it’s helpful to save that version too. The whole combined ‘eqs’ data structure will be stored in the combinedCat.RData file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
save(eqs, file = &amp;quot;combinedCat.RData&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Final Note: Use the R function load() to read the combinedCat.RData file back in (only use readEqs.R on the eqs.out files). &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
load(&amp;quot;combinedCat.RData&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. If you also saved and need to combine your transitions file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
transFiles = c(&amp;quot;/home/user/name/catalog/cat1/trans.cat1.out&amp;quot;,&lt;br /&gt;
	       &amp;quot;/home/user/name/catalog/cat2/trans.cat2.out&amp;quot;)&lt;br /&gt;
&lt;br /&gt;
combineTransFiles(transfiles=transFiles, outtransfile = &amp;quot;trans.combinedCat.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''Filtering a catalog by magnitude and saving a new filtered catalog''' ==&lt;br /&gt;
&lt;br /&gt;
1. Read the full catalog in (or load the eqs.RData file):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;eqs.cat.out&amp;quot;)			&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Pick a minimum magnitude M for your new catalog (you'll get M=muse &amp;amp; greater):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
muse = 7 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Get a list of the events &amp;gt;= muse&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
euse = which(eqs$M &amp;gt;= muse)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Subset/filter the catalog by euse (set renumberEvents=FALSE if you want to keep the original event ID's):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqsNew = subsetEqs(eqs,euse, renumberEvents=FALSE)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. Save the new filtered catalog:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
subName = &amp;quot;new_catalog_M7&amp;quot;				&lt;br /&gt;
writeEqsAndLists(eqsNew, outFnameInfix = paste(subName, sep = &amp;quot;&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== '''Filtering a catalog by event time and removing events in the simulation runup''' ==&lt;br /&gt;
Note: To save the filtered catalog, see above instructions.&lt;br /&gt;
&lt;br /&gt;
1. Read the full catalog in (if you have already done this in your current R session, you don’t need to read it in again):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;eqs.cat.out&amp;quot;)		&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Check the range of event times in the catalog (in catalog years):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
range(eqs$t0yr)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Choose the minimum and maximum catalog times you’re interested in:&lt;br /&gt;
&lt;br /&gt;
Note: The simulation runup time depends on the slip rates of your faults (you want each one to rupture at least once). If you are running a simulation with the UCERF3 California fault model and you’re interested in the San Andreas Fault, a reasonable runup time is 5,000 years.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tmin = 5000&lt;br /&gt;
tmax = max(eqs$t0yr)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Get a list of all events in your specified range (in this example, after the runup time):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
inRange = which(eqs$t0yr &amp;gt; tmin &amp;amp; eqs$t0yr &amp;lt; tmax)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: You can combine the time and magnitude filtering into a single which function:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Mmin = 7&lt;br /&gt;
Mmax = 8&lt;br /&gt;
&lt;br /&gt;
inRange = which(eqs$t0yr &amp;gt; tmin &amp;amp; eqs$t0yr &amp;lt; tmax &amp;amp; eqs$M &amp;gt;= Mmin &amp;amp; eqs$M &amp;lt;= Mmax)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=='''Filtering a catalog by which fault(s) ruptured'''==&lt;br /&gt;
&lt;br /&gt;
1. Read the full catalog in (if you have already done this in your current R session, you don’t need to read it in again):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;eqs.cat.out&amp;quot;)		&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Make a list of the fault segments you are interested in:&lt;br /&gt;
&lt;br /&gt;
Note: If you are using the UCERF3 fault model, you can pass eqs$fault$faultName to the function unique() to get a list of the options. &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
faults = which(eqs$fault$faultName %in% c(&amp;quot;SanAndreas(MojaveN)&amp;quot;, &amp;quot;SanAndreas(MojaveS)&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
See the next instruction set (Plotting the fault model in plotFault3d) to visually check that you’ve selected the correct fault segments.&lt;br /&gt;
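To check the available names before building the faults list, the unique() call mentioned in the note above looks like:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# list the distinct fault section names present in the fault model&lt;br /&gt;
unique(eqs$fault$faultName)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;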
&lt;br /&gt;
3. Get a list of all of the events that rupture patches of those fault segments:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
euse = unique(eqs$eList[which(eqs$pList %in% faults)])	&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: You can easily filter the catalog further by the range of events in the previous instruction set:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
use = euse[which(euse %in% inRange)]	&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Plotting the fault model in plotFault3d''' ==&lt;br /&gt;
&lt;br /&gt;
1. Go through steps 1 and 2 in the previous instruction set to get the list of fault patches you want to work with.	&lt;br /&gt;
&lt;br /&gt;
2. Initialize a variable for the plot colors: &lt;br /&gt;
&lt;br /&gt;
Note: This will create a list of NA values (which will show up as clear when you plot it), one for each fault patch in the model.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
col = rep(NA, eqs$fault$np)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Set the color of the fault patches that match segments you selected:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
col[faults]= &amp;quot;red&amp;quot;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Plot the fault model, colored by the color variable you created:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
plotFault3d(eqs$fault, col = col, lwd = 0.5)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: The argument lwd sets the line width. To see the additional arguments you can pass to plotFault3d() use the args() function:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
args(plotFault3d)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_Restart_and_Processing_Instructions&amp;diff=22317</id>
		<title>RSQSim Restart and Processing Instructions</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_Restart_and_Processing_Instructions&amp;diff=22317"/>
		<updated>2019-02-22T21:20:08Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Instructions for combining extended/restarted catalogs into one long catalog */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''&lt;br /&gt;
== Instructions for restarting an RSQSim simulation in order to extend a catalog from where it ended ==&lt;br /&gt;
'''&lt;br /&gt;
&lt;br /&gt;
Note: You'll need some of the R scripts from RSQSimPostProcess&lt;br /&gt;
&lt;br /&gt;
1.  Create a new directory for the simulation you are setting up, and copy over the cat1.in, cat1.pbs, UCERF3.flt, UCERF3.KZero, and UCERF3.neighbors files.&lt;br /&gt;
&lt;br /&gt;
2. Rename cat1.in and cat1.pbs to match the new catalog name.&lt;br /&gt;
&lt;br /&gt;
3. Update the catalog name and input file name in the new cat2.pbs, and the outNameInfix in the new cat2.in files.&lt;br /&gt;
&lt;br /&gt;
4. Read in catalog 1:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readEqs(&amp;quot;/home/user/name/catalog/eqs.catalog1.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. Create the new input files for shear stress, normal stress, and theta from the end of catalog 1 and fill those in for the variables initTauFname, initSigmaFname, and initThetaFname in your cat2.in file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tmp = mkInitTauSigmaThetaSlipSpeed(eqs, Inf, writeTau = &amp;quot;final&amp;quot;, writeSlipSpeed = 0,   &lt;br /&gt;
		                   initTauFile=&amp;quot;cat2.initTau&amp;quot;,&lt;br /&gt;
		                   initSigmaFile=&amp;quot;cat2.initSigma&amp;quot;,&lt;br /&gt;
	                           initThetaFile = &amp;quot;cat2.initTheta&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
* If the final snapshots (.out.final files) weren’t written out for cat1, but snapshots were written out during the simulations, you can change writeTau = &amp;quot;final&amp;quot; to writeTau = 2 in order to use those snapshots (out.2 files). If no snapshots were written, then you’re out of luck and can’t restart that simulation.&lt;br /&gt;
&lt;br /&gt;
6. Get the start time for cat2 and copy the whole number (all 22 digits) into the variable tStart in the new cat2.in file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
tStart = format(tmp$t, digits=22) 	 &lt;br /&gt;
&amp;lt;/pre&amp;gt;					&lt;br /&gt;
7.  Read or create a list of pinned patches using the variable pin:&lt;br /&gt;
&lt;br /&gt;
If there is a pin file for cat1:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
pin = scan(&amp;quot;../cat1.pin&amp;quot;) &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Otherwise, the pin file is just a list of 0's (not pinned) or 1's (pinned), one per patch in the fault model, so if there isn't one, you can just create one of all 0's:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
pin = rep(0, eqs$fault$np)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
8. Update pin from the list of patches that got locked in cat1 and write out the new pin file for cat2:&lt;br /&gt;
&amp;lt;pre&amp;gt;				&lt;br /&gt;
system(&amp;quot;grep Eliminating ../*e | awk '{print $4}' &amp;gt; ../cat1.locked&amp;quot;)&lt;br /&gt;
pin2 = scan(file=&amp;quot;../cat1.locked&amp;quot;)			&lt;br /&gt;
pin[pin2] = 1						&lt;br /&gt;
write(pin, file=&amp;quot;cat2.pin&amp;quot;, ncol=1)				&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
9. Update the pinnedFname in your cat2.in file to cat2.pin&lt;br /&gt;
&lt;br /&gt;
10. Update maxT in your cat2.in file to the length you want the extended catalog to be (in seconds).&lt;br /&gt;
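Since maxT is specified in seconds, the conversion from a target catalog length in years can be sketched in R (the 500,000-year target and the 365.25-day year here are illustrative assumptions, not values from the original instructions):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# illustrative: convert a desired catalog length in years to seconds for maxT&lt;br /&gt;
maxTyears = 500000&lt;br /&gt;
maxT = maxTyears * 365.25 * 24 * 3600&lt;br /&gt;
format(maxT, digits=22)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;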
&lt;br /&gt;
=='''Instructions for combining extended/restarted catalogs into one long catalog''' ==&lt;br /&gt;
&lt;br /&gt;
Note: You'll need some of the R scripts from RSQSimPostProcess &lt;br /&gt;
&lt;br /&gt;
1. Set up a list to tell R to only load the files you need (eList, pList, dList, and tList):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
L = c(&amp;quot;e&amp;quot;, &amp;quot;p&amp;quot;, &amp;quot;d&amp;quot;, &amp;quot;t&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;		&lt;br /&gt;
2. Set up the list of catalogs to combine (in the order that they ran):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqFiles = c(&amp;quot;/home/user/name/catalog/cat1/eqs.cat1.out&amp;quot;,&lt;br /&gt;
	    &amp;quot;/home/user/name/catalog/cat2/eqs.cat2.out&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Read and combine them into one catalog:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqs = readAndCombineEqfiles(eqFiles, returnLists=L)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Write out the new eqs.out, and List files:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
writeEqsAndLists(eqs, outFnameInfix = paste(&amp;quot;combinedCat&amp;quot;, sep = &amp;quot;&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Note: Reading in an eqs.RData file is faster, so it’s helpful to save that version too. The whole combined ‘eqs’ data structure will be stored in the combinedCat.RData file:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
save(eqs, file = &amp;quot;combinedCat.RData&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Final Note: Use the R function load() to read the combinedCat.RData file back in (only use readEqs.R on the eqs.out files). &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
load(&amp;quot;combinedCat.RData&amp;quot;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== '''Instructions for filtering a catalog by magnitude and saving a new filtered catalog''' ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
1. Pick a minimum magnitude M for your new catalog (you'll get M=muse &amp;amp; greater):&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
muse = 7 &lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
2. Read the full catalog in:&lt;br /&gt;
&amp;lt;pre&amp;gt;						 &lt;br /&gt;
eqs = readEqs(&amp;quot;eqs.cat.out&amp;quot;)			&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
3. Get a list of the events &amp;gt;= muse&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
euse = which(eqs$M &amp;gt;= muse)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
4. Subset/filter the catalog by euse:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
eqsNew = subsetEqs(eqs,euse, renumberEvents=FALSE)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
5. Save the new filtered catalog:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
subName = &amp;quot;new_catalog_M7&amp;quot;				&lt;br /&gt;
writeEqsAndLists(eqsNew, outFnameInfix = paste(subName, sep = &amp;quot;&amp;quot;))&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=21289</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=21289"/>
		<updated>2018-07-03T20:52:01Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and that the rupture surfaces are UCERF3 surfaces, so they fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
[[File:Rsqsim_rupture_mapping.png|thumb|right|400px|RSQSim Rupture Types]]&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate).&lt;br /&gt;
&lt;br /&gt;
*Raw RSQSim Rupture: the actual simulator elements that ruptured in the RSQSim event. This is what is used in CyberShake (in the SRF, and for distance cutoff calculations). NOTE: we also filter out all elements which are more than 100 km from the nearest mapped &amp;quot;Significant Subsection&amp;quot;, in order to remove stray elements which could have ruptured co-seismically (perhaps unrelated) in another part of the state but don't contribute to hazard. [[:File:Rsqsim_stray_element_example.png|Example of rupture with stray elements to the N-W which are filtered out]]&lt;br /&gt;
*Significant Subsections: mapped UCERF3 subsections for which at least 20% of each subsection (by area) participates in the rupture. This is what is used for GMPE comparisons, and some distance calculations in the database.&lt;br /&gt;
&lt;br /&gt;
Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'), after mapping to &amp;quot;Significant Subsections&amp;quot;. Source names list all of the sections involved, and sources are sorted alphabetically.&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing).&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a slip/time history from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represents the cube (in 3-d, non-rotated) which contains the entire rupture&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (3-d site/source distance) to the GMPE comparison &amp;quot;Significant Subsections&amp;quot; surface, which can be greater than Cutoff_Dist. This field is usually used to look at amplitudes with distance, so the closest raw rupture surface distance may not be appropriate. This listed distance can be either less or greater than the actual raw rupture distance. Ruptures included in this table are all ruptures for which the raw rupture distance is less than the cutoff dist. The horizontal distance to the center of the nearest raw rupture triangular element is used for cutoff distance checks.&lt;br /&gt;
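As a worked example of the GridSpacing convention above, using the 1.35 km^2 average element area of the test catalogs:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# GridSpacing = sqrt(average element area); sqrt(1.35) is roughly 1.16 km&lt;br /&gt;
sqrt(1.35)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;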
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
*Hazard curve/disaggregation calculations require an ERF-specific XML file, erf_params.xml. This file also references paths to 2 files, a mappings binary file and a simulator geometry file. If this file is relocated, the referenced files must be copied and the paths updated in the XML file. TODO: fix disagg code to work with this file rather than hardcoded UCERF2&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Hazard Curves ===&lt;br /&gt;
&lt;br /&gt;
Below are hazard curves, calculated for USC at 0.5 Hz on Blue Waters.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5812_RotD50_10s.png|thumb|400px|RotD50, 10 sec]]&lt;br /&gt;
| [[File:USC_5812_RotD50_5s.png|thumb|400px|RotD50, 5 sec]]&lt;br /&gt;
| [[File:USC_5812_RotD50_3s.png|thumb|400px|RotD50, 3 sec]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== 1m Year Candidate Catalog, ERFID=48 ==&lt;br /&gt;
&lt;br /&gt;
The first 1 million year candidate catalog, an extension of Bruce's 2585 (the catalog used for the UCERF3 hazard comparison paper), is inserted as ERF 48. We will initially test on 6 CyberShake sites (USC, PAS, WNGC, SBSM, STNI, LAPD).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2585 1myr&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 252,534 M&amp;gt;=6.5 events (3154 sources)&lt;br /&gt;
* Catalog Duration Used: 1 million years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2585_1myr&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
== 259k Year Candidate Catalog, ERFID=49 ==&lt;br /&gt;
&lt;br /&gt;
This is a ~250k year catalog which addresses some of the rupture propagation velocity issues seen with the previous CyberShake/RSQSim simulations. It is Bruce Shaw's Catalog 2740, and is inserted as ERF 49. We will initially test on 9 CyberShake sites (USC, PAS, WNGC, SBSM, STNI, LAPD, s119, s279, s480). The latter 3 sites are intended to help understand differences between 3-D CyberShake and 1-D broadband, and are chosen because they have high Vs30 values which are similar to the BBP 1-D Vs30 value for the LA Basin.&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2740&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 83,407 M&amp;gt;=6.5 events (1889 sources)&lt;br /&gt;
* Catalog Duration Used: ~250k years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2740&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2740/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2740/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2740/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2740/erf_params.xml&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=20378</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=20378"/>
		<updated>2018-04-10T17:42:47Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and that the rupture surfaces are UCERF3 surfaces, so they fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
[[File:Rsqsim_rupture_mapping.png|thumb|right|400px|RSQSim Rupture Types]]&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate).&lt;br /&gt;
&lt;br /&gt;
*Raw RSQSim Rupture: the actual simulator elements that ruptured in the RSQSim event. This is what is used in CyberShake (in the SRF, and for distance cutoff calculations). NOTE: we also filter out all elements which are more than 100 km from the nearest mapped &amp;quot;Significant Subsection&amp;quot;, in order to remove stray elements which could have ruptured co-seismically (perhaps unrelated) in another part of the state but don't contribute to hazard. [[:File:Rsqsim_stray_element_example.png|Example of rupture with stray elements to the N-W which are filtered out]]&lt;br /&gt;
*Significant Subsections: mapped UCERF3 subsections for which at least 20% of each subsection (by area) participates in the rupture. This is what is used for GMPE comparisons, and some distance calculations in the database.&lt;br /&gt;
&lt;br /&gt;
Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'), after mapping to &amp;quot;Significant Subsections&amp;quot;. Source names list all of the sections involved, and sources are sorted alphabetically.&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing).&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a slip/time history from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represents the cube (in 3-d, non-rotated) which contains the entire rupture&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (3-d site/source distance) to the GMPE comparison &amp;quot;Significant Subsections&amp;quot; surface, which can be greater than Cutoff_Dist. This field is usually used to look at amplitudes with distance, so the closest raw rupture surface distance may not be appropriate. This listed distance can be either less or greater than the actual raw rupture distance. Ruptures included in this table are all ruptures for which the raw rupture distance is less than the cutoff dist. The horizontal distance to the center of the nearest raw rupture triangular element is used for cutoff distance checks.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
*Hazard curve/disaggregation calculations require an ERF-specific XML file, erf_params.xml. This file also references paths to 2 files, a mappings binary file and a simulator geometry file. If this file is relocated, the referenced files must be copied and the paths updated in the XML file. TODO: fix disagg code to work with this file rather than hardcoded UCERF2&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== 1 Million Year Candidate Catalog, ERFID=48 ==&lt;br /&gt;
&lt;br /&gt;
The first 1 million year candidate catalog, an extension of Bruce's 2585 (the catalog used for the UCERF3 hazard comparison paper), is inserted as ERF 48. We will initially test on 6 CyberShake sites (USC, PAS, WNGC, SBSM, STNI, LAPD).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2585 1myr&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 252,534 M&amp;gt;=6.5 events (3154 sources)&lt;br /&gt;
* Catalog Duration Used: 1 million years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2585_1myr&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2585_1myrs/erf_params.xml&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19735</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19735"/>
		<updated>2018-02-07T19:10:20Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and that the rupture surfaces are UCERF3 surfaces, which fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
[[File:Rsqsim_rupture_mapping.png|thumb|right|400px|RSQSim Rupture Types]]&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate).&lt;br /&gt;
&lt;br /&gt;
*Raw RSQSim Rupture: the actual simulator elements that ruptured in the RSQSim event. This is what is used in CyberShake (in the SRF, and for distance cutoff calculations). NOTE: we also remove all elements which are more than 100 km from the nearest mapped &amp;quot;Significant Subsection&amp;quot;, in order to filter out stray elements which could have ruptured co-seismically (perhaps unrelated) in another part of the state but don't contribute to hazard. [[:File:Rsqsim_stray_element_example.png|Example of a rupture with stray elements to the NW which are filtered out]]&lt;br /&gt;
*Significant Subsections: mapped UCERF3 subsections for which at least 20% of each subsection (by area) participates in the rupture. This is what is used for GMPE comparisons, and some distance calculations in the database.&lt;br /&gt;
&lt;br /&gt;
Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'), after mapping to &amp;quot;Significant Subsections&amp;quot;. Source names list all of the sections involved, and sources are sorted alphabetically.&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing).&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a slip/time history from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length the elements would have if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represent the non-rotated 3-D bounding box which contains the entire rupture.&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (3-d site/source distance) to the GMPE comparison &amp;quot;Significant Subsections&amp;quot; surface, which can be greater than Cutoff_Dist. This field is usually used to look at amplitudes with distance, so the closest raw rupture surface distance may not be appropriate. This listed distance can be either less or greater than the actual raw rupture distance. Ruptures included in this table are all ruptures for which the raw rupture distance is less than the cutoff dist. The horizontal distance to the center of the nearest raw rupture triangular element is used for cutoff distance checks.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
*Hazard curve and disaggregation calculations require an ERF-specific XML file, erf_params.xml. This file references the paths of two other files: a mappings binary file and a simulator geometry file. If erf_params.xml is relocated, the referenced files must be copied and their paths updated in the XML file. TODO: fix disagg code to work with this file rather than hardcoded UCERF2&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19734</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19734"/>
		<updated>2018-02-07T19:08:42Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and that the rupture surfaces are UCERF3 surfaces, which fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
[[File:Rsqsim_rupture_mapping.png|thumb|right|400px|RSQSim Rupture Types]]&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate).&lt;br /&gt;
&lt;br /&gt;
*Raw RSQSim Rupture: the actual simulator elements that ruptured in the RSQSim event. This is what is used in CyberShake (in the SRF, and for distance cutoff calculations). NOTE: we also remove all elements which are more than 100 km from the nearest mapped &amp;quot;Significant Subsection&amp;quot;, in order to filter out stray elements which could have ruptured co-seismically (perhaps unrelated) in another part of the state but don't contribute to hazard. [[:File:Rsqsim_stray_element_example.png|Example of a rupture with stray elements to the NW which are filtered out]]&lt;br /&gt;
*Significant Subsections: mapped UCERF3 subsections for which at least 20% of each subsection (by area) participates in the rupture. This is what is used for GMPE comparisons, and some distance calculations in the database.&lt;br /&gt;
&lt;br /&gt;
Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'), after mapping to &amp;quot;Significant Subsections&amp;quot;. Source names list all of the sections involved, and sources are sorted alphabetically.&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing).&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a slip/time history from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length the elements would have if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represent the non-rotated 3-D bounding box which contains the entire rupture.&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (3-d site/source distance) to the GMPE comparison &amp;quot;Significant Subsections&amp;quot; surface, which can be greater than Cutoff_Dist. This field is usually used to look at amplitudes with distance, so the closest raw rupture surface distance may not be appropriate. This listed distance can be either less or greater than the actual raw rupture distance. Ruptures included in this table are all ruptures for which the raw rupture distance is less than the cutoff dist. The raw rupture distance is used for cutoff distance checks.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
*Hazard curve and disaggregation calculations require an ERF-specific XML file, erf_params.xml. This file references the paths of two other files: a mappings binary file and a simulator geometry file. If erf_params.xml is relocated, the referenced files must be copied and their paths updated in the XML file. TODO: fix disagg code to work with this file rather than hardcoded UCERF2&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Rsqsim_stray_element_example.png&amp;diff=19733</id>
		<title>File:Rsqsim stray element example.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Rsqsim_stray_element_example.png&amp;diff=19733"/>
		<updated>2018-02-07T19:03:11Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19727</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19727"/>
		<updated>2018-02-07T01:56:38Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: /* Rupture hierarchy */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and that the rupture surfaces are UCERF3 surfaces, which fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
[[File:Rsqsim_rupture_mapping.png|thumb|right|400px|RSQSim Rupture Types]]&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate). &lt;br /&gt;
&lt;br /&gt;
*Raw RSQSim Rupture: the actual simulator elements that ruptured in the RSQSim event. This is what is used in CyberShake. NOTE: we also remove all elements which are more than 100 km from the nearest mapped &amp;quot;Significant Subsection&amp;quot;, in order to filter out stray elements which could have ruptured co-seismically (perhaps unrelated) in another part of the state but don't contribute to hazard.&lt;br /&gt;
*Significant Subsections: mapped UCERF3 subsections for which at least 20% of each subsection (by area) participates in the rupture. This is what is used for GMPE comparisons, and some distance calculations in the database. &lt;br /&gt;
&lt;br /&gt;
Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'), after mapping to &amp;quot;Significant Subsections&amp;quot;. Source names list all of the sections involved, and sources are sorted alphabetically.&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing).&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a slip/time history from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length the elements would have if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represent the non-rotated 3-D bounding box which contains the entire rupture.&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (3-d site/source distance) to the GMPE comparison &amp;quot;Significant Subsections&amp;quot; surface, which can be greater than Cutoff_Dist. This field is usually used to look at amplitudes with distance, so the closest raw rupture surface distance may not be appropriate. This listed distance can be either less or greater than the actual raw rupture distance. Ruptures included in this table are all ruptures for which the raw rupture distance is less than the cutoff dist.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
*Hazard curve and disaggregation calculations require an ERF-specific XML file, erf_params.xml. This file references the paths of two other files: a mappings binary file and a simulator geometry file. If erf_params.xml is relocated, the referenced files must be copied and their paths updated in the XML file. TODO: fix disagg code to work with this file rather than hardcoded UCERF2&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19726</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19726"/>
		<updated>2018-02-07T01:33:28Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and that the rupture surfaces are UCERF3 surfaces, which fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
[[File:Rsqsim_rupture_mapping.png|thumb|right|400px|RSQSim Rupture Types]]&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate). &lt;br /&gt;
&lt;br /&gt;
*Raw RSQSim Rupture: the actual simulator elements that ruptured in the RSQSim event. This is what is used in CyberShake. NOTE: we also remove all elements which are more than 100 km from the nearest mapped &amp;quot;Significant Subsection&amp;quot;, in order to filter out stray elements which could have ruptured co-seismically (perhaps unrelated) in another part of the state but don't contribute to hazard.&lt;br /&gt;
*Significant Subsections: this is what is used for GMPE comparisons, and some distance calculations in the database. &lt;br /&gt;
&lt;br /&gt;
Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'), after mapping to &amp;quot;Significant Subsections&amp;quot;. Source names list all of the sections involved, and sources are sorted alphabetically.&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing).&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a slip/time history from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length the elements would have if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represent the non-rotated 3-D bounding box which contains the entire rupture.&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (3-d site/source distance) to the GMPE comparison &amp;quot;Significant Subsections&amp;quot; surface, which can be greater than Cutoff_Dist. This field is usually used to look at amplitudes with distance, so the closest raw rupture surface distance may not be appropriate. This listed distance can be either less or greater than the actual raw rupture distance. Ruptures included in this table are all ruptures for which the raw rupture distance is less than the cutoff dist.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
*Hazard curve and disaggregation calculations require an ERF-specific XML file, erf_params.xml. This file references the paths of two other files: a mappings binary file and a simulator geometry file. If erf_params.xml is relocated, the referenced files must be copied and their paths updated in the XML file. TODO: fix disagg code to work with this file rather than hardcoded UCERF2&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19725</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19725"/>
		<updated>2018-02-07T01:23:12Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and that the rupture surfaces are UCERF3 surfaces, which fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
[[File:Rsqsim_rupture_mapping.png|thumb|right|400px|RSQSim Rupture Types]]&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate). &lt;br /&gt;
&lt;br /&gt;
*Raw RSQSim Rupture: the actual simulator elements that ruptured in the RSQSim event. This is what is used in CyberShake. NOTE: we also remove all elements which are more than 100 km from the nearest mapped &amp;quot;Significant Subsection&amp;quot;, in order to filter out stray elements which could have ruptured co-seismically (perhaps unrelated) in another part of the state but don't contribute to hazard.&lt;br /&gt;
*Significant Subsections: this is what is used for GMPE comparisons, and some distance calculations in the database. &lt;br /&gt;
&lt;br /&gt;
Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'), after mapping to &amp;quot;Significant Subsections&amp;quot;. Source names list all of the sections involved, and sources are sorted alphabetically.&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing).&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a slip/time history from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length the elements would have if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represent the non-rotated 3-D bounding box which contains the entire rupture.&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (3-d site/source distance) to the GMPE comparison &amp;quot;Significant Subsections&amp;quot; surface, which can be greater than Cutoff_Dist. This field is usually used to look at amplitudes with distance, so the closest raw rupture surface distance may not be appropriate. This listed distance can be either less or greater than the actual raw rupture distance. Ruptures included in this table are all ruptures for which the raw rupture distance is less than the cutoff dist.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
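As a quick sanity check on the catalog summary above, the implied rate of M&gt;=6.5 events follows from simple division (back-of-the-envelope arithmetic only, not official catalog output):

```python
# Implied M>=6.5 event rate for the Bruce 2457 catalog,
# computed from the summary values listed above.
num_events = 41829          # M>=6.5 events in the catalog
duration_yr = 187_782.42    # catalog duration used, in years
rate_per_yr = num_events / duration_yr
recurrence_yr = duration_yr / num_events
print(f"~{rate_per_yr:.3f} events/yr, i.e. one M>=6.5 every ~{recurrence_yr:.2f} yr")
```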
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Rsqsim_rupture_mapping.png&amp;diff=19724</id>
		<title>File:Rsqsim rupture mapping.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Rsqsim_rupture_mapping.png&amp;diff=19724"/>
		<updated>2018-02-07T01:16:49Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19723</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19723"/>
		<updated>2018-02-07T01:14:19Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and the rupture surfaces are UCERF3 surfaces, so they fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate). Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'). Source names are all of the sections involved, and sources are sorted alphabetically&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing)&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a single slip/time history from RSQSim&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
*In the Ruptures table, Start/End Lat/Lon/Depth now represent the non-rotated 3-D bounding box which contains the entire rupture&lt;br /&gt;
*In the CyberShake_Site_Ruptures table, Site_Rupture_Dist now represents rRup (the 3-D site/source distance) to the GMPE comparison surface, which can be greater than Cutoff_Dist. This field is usually used to examine amplitudes with distance, so the closest raw rupture-surface distance may not be appropriate; the listed distance can be either less or greater than the actual raw rupture distance. This table includes all ruptures for which the raw rupture distance is less than Cutoff_Dist.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19722</id>
		<title>RSQSim CyberShake</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=RSQSim_CyberShake&amp;diff=19722"/>
		<updated>2018-02-07T01:08:43Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;We are planning to perform CyberShake simulations using RSQSim as the ERF.  This page documents the decisions and results.&lt;br /&gt;
&lt;br /&gt;
== Modifications from UCERF2 ERF ==&lt;br /&gt;
&lt;br /&gt;
The biggest differences from the UCERF2 ERF are that RSQSim ruptures don't neatly fall into the source/rupture/rupture variation hierarchy, and the rupture surfaces are UCERF3 surfaces, so they fall on a triangular, not rectangular, grid.&lt;br /&gt;
&lt;br /&gt;
=== Rupture hierarchy ===&lt;br /&gt;
&lt;br /&gt;
RSQSim catalogs generate ruptures on triangular elements. Those elements each have an associated UCERF3 fault subsection, which in turn has an associated fault section (e.g. Mojave S). GMPE comparisons use the associated UCERF3 subsections (after applying a filter to remove subsections for which only a few elements participate). Here is the structure for the RSQSim ERF:&lt;br /&gt;
&lt;br /&gt;
*Source: all ruptures which involve the same set of UCERF3 fault sections (aka 'parent sections'). Source names are all of the sections involved, and sources are sorted alphabetically&lt;br /&gt;
*Rupture: an individual RSQSim rupture (with its own full slip/time history) that occurred in the RSQSim catalog. Ruptures are sorted by magnitude (increasing)&lt;br /&gt;
*Rupture Variation: there is 1 rupture variation for each rupture, as each rupture has a single slip/time history from RSQSim&lt;br /&gt;
&lt;br /&gt;
=== Database changes ===&lt;br /&gt;
&lt;br /&gt;
*In the Ruptures table, we are using the square root of the average element area for GridSpacing - basically, the side length if they were on a rectangular grid.&lt;br /&gt;
*In the Ruptures table, we are setting NumRows and NumCols to 0, but using the correct value for NumPoints.&lt;br /&gt;
&lt;br /&gt;
=== Input file changes ===&lt;br /&gt;
&lt;br /&gt;
*Since the rupture geometry files also expect GridSpacing, NumRows, and NumCols, we are using the same approach as in the database.  GridSpacing is replaced by AveArea, and NumRows and NumCols are replaced by NumPoints.&lt;br /&gt;
&lt;br /&gt;
=== Code changes ===&lt;br /&gt;
&lt;br /&gt;
*A new version of DirectSynth, DirectSynth_RSQSim, was created, which takes in an input file consisting of a list of SRFs for processing.  &lt;br /&gt;
&lt;br /&gt;
== Small-scale catalog, ERFID=42 ==&lt;br /&gt;
&lt;br /&gt;
Initially, we are using a small RSQSim catalog for testing on 4 CyberShake sites (USC, PAS, WNGC, SBSM).&lt;br /&gt;
&lt;br /&gt;
Catalog details:&lt;br /&gt;
* Name: Bruce 2457&lt;br /&gt;
* Average element area: 1.35 km^2&lt;br /&gt;
* Min Mag Considered: 6.5&lt;br /&gt;
* Num Events: 41829 M&amp;gt;=6.5 events (2947 sources)&lt;br /&gt;
* Catalog Duration Used: 187,782.42 years&lt;br /&gt;
* More information: https://github.com/kevinmilner/rsqsim-analysis/tree/master/catalogs/rundir2457&lt;br /&gt;
&lt;br /&gt;
=== Input file locations ===&lt;br /&gt;
&lt;br /&gt;
* SRF files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Point files: /home/scec-02/kmilner/simulators/catalogs/rundir2457/cybershake_inputs&lt;br /&gt;
* Catalog/ERF Mappings: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_mappings.bin&lt;br /&gt;
* ERF Metadata XML: /home/scec-02/kmilner/simulators/catalogs/rundir2457/erf_params.xml&lt;br /&gt;
&lt;br /&gt;
=== Test seismograms ===&lt;br /&gt;
&lt;br /&gt;
Seismograms were synthesized for site USC (1 Hz) for two RSQSim events.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_5085_s74399.png|thumb|left|750px|Seismograms at USC for event 74399]]&lt;br /&gt;
| [[File:event_74399.png|thumb|left|700px|event 74399, M6.64 on the Elsinore Fault]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:USC_5085_s33801.png|thumb|left|750px|Seismograms at USC for event 33801]]&lt;br /&gt;
| [[File:event_33801.png|thumb|left|700px|event 33801, M7.71 on the SAF, Garlock, and San Jacinto]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Background_Seismicity&amp;diff=19585</id>
		<title>CyberShake Background Seismicity</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Background_Seismicity&amp;diff=19585"/>
		<updated>2018-01-19T19:33:14Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;'''Goal: determine the importance of background seismicity in different parts of CA to either validate its exclusion in CyberShake or motivate an inclusion strategy.'''&lt;br /&gt;
&lt;br /&gt;
== UCERF2 ==&lt;br /&gt;
&lt;br /&gt;
Calculation parameters:&lt;br /&gt;
* GMPE: NGA-West2 GMPE (average of 4 models)&lt;br /&gt;
* ERF: UCERF2, with and without background seismicity&lt;br /&gt;
* Site Effects: Vs30 is from Wills (2015), no basin depth terms included&lt;br /&gt;
* IMT: 2 second SA.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Model&lt;br /&gt;
! 10% in 50yr&lt;br /&gt;
! 2% in 50yr&lt;br /&gt;
! 1% in 50yr&lt;br /&gt;
|-&lt;br /&gt;
! Excluding Background&lt;br /&gt;
| [[File:ngaw2_exclude_2sec_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ngaw2_exclude_2sec_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ngaw2_exclude_2sec_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|-&lt;br /&gt;
! Including Background&lt;br /&gt;
| [[File:ngaw2_include_2sec_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ngaw2_include_2sec_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ngaw2_include_2sec_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|-&lt;br /&gt;
! Ratio Exclude/Include&lt;br /&gt;
| [[File:ratio_ngaw2_exclude_ngaw2_include_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ratio_ngaw2_exclude_ngaw2_include_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ratio_ngaw2_exclude_ngaw2_include_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== UCERF3 ==&lt;br /&gt;
&lt;br /&gt;
Calculation parameters:&lt;br /&gt;
* GMPE: NGA-West2 GMPE (average of 4 models)&lt;br /&gt;
* ERF: UCERF3, with the following configurations&lt;br /&gt;
** Supra-seismogenic ruptures only (only supra-seismogenic ruptures have finite fault surfaces in UCERF3, all others treated as gridded)&lt;br /&gt;
** Supra and Sub-seismogenic ruptures only (this adds gridded ruptures that correspond to sub-seismogenic ruptures on known faults)&lt;br /&gt;
** Supra, Sub-seismogenic, and off fault ruptures (complete model)&lt;br /&gt;
* Site Effects: Vs30 is from Wills (2015), no basin depth terms included&lt;br /&gt;
* IMT: 2 second SA.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Model&lt;br /&gt;
! 10% in 50yr&lt;br /&gt;
! 2% in 50yr&lt;br /&gt;
! 1% in 50yr&lt;br /&gt;
|-&lt;br /&gt;
! Supra-Seismogenic On-Fault Only&lt;br /&gt;
| [[File:ucerf3_supra_only_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ucerf3_supra_only_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ucerf3_supra_only_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|-&lt;br /&gt;
! Sub+Supra-Seismogenic On-Fault Only&lt;br /&gt;
| [[File:ucerf3_sub+supra_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ucerf3_sub+supra_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ucerf3_sub+supra_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|-&lt;br /&gt;
! Complete Model (Supra+Sub+Off)&lt;br /&gt;
| [[File:ucerf3_sub+supra+off_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ucerf3_sub+supra+off_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ucerf3_sub+supra+off_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|-&lt;br /&gt;
! Ratio Supra-Only/Complete&lt;br /&gt;
| [[File:ratio_ucerf3_supra_only_vs_sub+supra+off_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ratio_ucerf3_supra_only_vs_sub+supra+off_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ratio_ucerf3_supra_only_vs_sub+supra+off_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|-&lt;br /&gt;
! Ratio Supra+Sub/Complete&lt;br /&gt;
| [[File:ratio_ucerf3_sub+supra_vs_sub+supra+off_10p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ratio_ucerf3_sub+supra_vs_sub+supra+off_2p_in_50.png|400px|thumb|left]]&lt;br /&gt;
| [[File:ratio_ucerf3_sub+supra_vs_sub+supra+off_1p_in_50.png|400px|thumb|left]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Ucerf3_supra_only_10p_in_50.png&amp;diff=19584</id>
		<title>File:Ucerf3 supra only 10p in 50.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Ucerf3_supra_only_10p_in_50.png&amp;diff=19584"/>
		<updated>2018-01-19T19:25:28Z</updated>

		<summary type="html">&lt;p&gt;Kmilner: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Kmilner</name></author>
		
	</entry>
</feed>