<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://strike.scec.org/scecwiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Scottcal</id>
	<title>SCECpedia - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://strike.scec.org/scecwiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Scottcal"/>
	<link rel="alternate" type="text/html" href="https://strike.scec.org/scecpedia/Special:Contributions/Scottcal"/>
	<updated>2026-05-09T11:00:11Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.34.2</generator>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:MUSCAL_region_extent.png&amp;diff=30761</id>
		<title>File:MUSCAL region extent.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:MUSCAL_region_extent.png&amp;diff=30761"/>
		<updated>2026-04-27T21:05:18Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30760</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30760"/>
		<updated>2026-04-27T21:01:47Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Delivered Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-26 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated through 1 Hz simulations of historical M4 events, and the overall model is updated wherever the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model).  Next, regional and local models are added where they improve the fit, based on M4 simulations.  These simulations were run to 1 Hz with topography using a minimum Vs of 450 m/s.  Finally, a spatially varying taper following the Ely approach is added.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H v15.1.1&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H v15.1.1&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Below is a list of local high-resolution models that were evaluated for inclusion.&lt;br /&gt;
&lt;br /&gt;
Northern and Central California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest regional (Li and Ben-Zion (2024))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest fault zone (Zhou et al. (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Joaquin basin (Shaw &amp;amp; Plesch (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;SFCVM v21.1 (Hirakawa &amp;amp; Aagaard (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Southern California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Jacinto fault zone (Fang et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Gabriel-San Bernardino Basin (Li et al. (2023))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Imperial Valley (Persaud et al. (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Coachella Valley (Ajala et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Santa Maria Basin (Plesch et al. (2020))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies by site, ranging from 300 m to 1500 m.&lt;br /&gt;
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model in grid points are 1251 (lat) x 1301 (lon) x 210 (depth). Grid points have a spacing of 0.01 degrees in the horizontal; the depth spacing is variable, with denser sampling near the surface.  Queries are interpolated trilinearly.&lt;br /&gt;
&lt;br /&gt;
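The trilinear query described above can be sketched as follows; this is a minimal illustration with an assumed (lon, lat, depth) axis ordering, not the actual UCVM implementation:&lt;br /&gt;

```python
import numpy as np

def trilinear(grid, lons, lats, deps, lon, lat, dep):
    # Locate the cell containing the query point along each axis and
    # compute the fractional position within that cell.
    def bracket(axis, x):
        i = int(np.clip(np.searchsorted(axis, x) - 1, 0, len(axis) - 2))
        t = (x - axis[i]) / (axis[i + 1] - axis[i])
        return i, t

    i, tx = bracket(lons, lon)
    j, ty = bracket(lats, lat)
    k, tz = bracket(deps, dep)

    # Blend the 8 corner values one axis at a time.
    c = grid[i:i + 2, j:j + 2, k:k + 2].astype(float)
    c = c[0] * (1 - tx) + c[1] * tx      # collapse the lon axis
    c = c[0] * (1 - ty) + c[1] * ty      # collapse the lat axis
    return c[0] * (1 - tz) + c[1] * tz   # collapse the depth axis
```

Because trilinear interpolation reproduces any function that is linear in each coordinate, it is exact between grid points for smoothly varying properties.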
Details about the model creation are provided in the paper and the [https://zenodo.org/records/19243477 Zenodo page].&lt;br /&gt;
&lt;br /&gt;
Below is a plot of the 3D model extent.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:MUSCAL_region_extent.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Example plot command:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Big dataset in netCDF ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points from -126 to -113&lt;br /&gt;
  latitude:  1251 points from 31 to 43.5&lt;br /&gt;
  depth:     671 points from 0 to 100,000 m&lt;br /&gt;
                   50 m spacing down to 30,000 m&lt;br /&gt;
                   1000 m spacing from 30,000 m to 100,000 m&lt;br /&gt;
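The stated layer count is self-consistent: reconstructing the depth axis, under the assumption that grid points sit at both interval endpoints, gives exactly 671 layers:&lt;br /&gt;

```python
# 0 to 30,000 m at 50 m spacing, then on to 100,000 m at 1000 m spacing.
shallow = list(range(0, 30000 + 1, 50))      # 601 points, both ends included
deep = list(range(31000, 100000 + 1, 1000))  # 70 more points
depths = shallow + deep
assert len(depths) == 671                    # matches the 671 depth layers above
```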
&lt;br /&gt;
Depth profiles of Vs at 36.5054, -119.0587 plotted with different step increments.  No interpolation; data are accessed directly through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Small dataset in netCDF ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points from -126 to -113&lt;br /&gt;
  latitude:  1251 points from 31 to 43.5&lt;br /&gt;
  depth:     210 points from 0 to 99,000 m&lt;br /&gt;
                   50 m spacing up to 3000 m&lt;br /&gt;
                   100 m spacing up to 5000 m&lt;br /&gt;
                   250 m spacing up to 10,000 m&lt;br /&gt;
                   500 m spacing up to 30,000 m&lt;br /&gt;
                   1000 m spacing up to 99,000 m&lt;br /&gt;
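The 210 depth layers follow from the spacing schedule; a sketch that reconstructs the variable axis (assuming each spacing applies up to its listed cutoff depth):&lt;br /&gt;

```python
# Reconstruct the variable depth axis: finer spacing near the surface.
schedule = [(50, 3000), (100, 5000), (250, 10000), (500, 30000), (1000, 99000)]
depths = [0]
for step, top in schedule:
    while depths[-1] < top:
        depths.append(depths[-1] + step)
assert len(depths) == 210  # matches the 210 depth layers above
```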
&lt;br /&gt;
&lt;br /&gt;
Depth profiles at different step sizes: 50 m, 100 m, 500 m, 1000 m&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Queried directly from the netCDF file on disk:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, no interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
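A no-interpolation query like the profiles above reduces to a nearest-index lookup into the in-memory array; a minimal sketch (the grid origin and the flat-array ordering are assumptions, not the actual UCVM layout):&lt;br /&gt;

```python
import numpy as np

LON0, LAT0, DLL = -126.0, 31.0, 0.01   # assumed grid origin; 0.01-degree spacing
NLON, NLAT = 1301, 1251                # horizontal grid dimensions

def nearest_value(flat, depths, lon, lat, dep):
    # Round to the nearest horizontal grid indices, pick the nearest
    # depth layer, then index into the flat float32 array
    # (assumed depth-major, then lat, then lon).
    i = int(round((lon - LON0) / DLL))
    j = int(round((lat - LAT0) / DLL))
    k = int(np.argmin(np.abs(np.asarray(depths, dtype=float) - dep)))
    return flat[(k * NLAT + j) * NLON + i]
```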
&lt;br /&gt;
==== Small float32 dataset in netCDF ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth coordinates, as well as the vp, vs, and rho values, are stored as float32.&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, with interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre- and post-processing steps are applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C query code is too slow for the interactive nature of the explorer, and because&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files&lt;br /&gt;
      * All data are stored as float32&lt;br /&gt;
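The netCDF-to-binary step can be sketched with NumPy (the netCDF read itself is omitted; file names and layout here are illustrative, not the actual UCVM tooling):&lt;br /&gt;

```python
import os
import tempfile
import numpy as np

def write_binary(props, outdir):
    # Write each material property (e.g. vp, vs, rho) as a flat
    # float32 binary file, mirroring the preprocessing described above.
    for name, arr in props.items():
        np.asarray(arr, dtype=np.float32).tofile(os.path.join(outdir, name + ".bin"))

def read_binary(name, shape, outdir):
    # Reload the flat float32 file and restore the original grid shape.
    return np.fromfile(os.path.join(outdir, name + ".bin"), dtype=np.float32).reshape(shape)
```

Reading a flat float32 file back is a single np.fromfile call, which is what makes repeated in-memory queries fast compared with going through the netCDF library each time.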
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final results from ucvm_query (with interpolation):&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;br /&gt;
&lt;br /&gt;
== Extended Model ==&lt;br /&gt;
&lt;br /&gt;
The 3D MUSCAL model is defined over the same volume as the CANVAS base model.  However, some simulation regions may extend beyond the boundaries, like the blue corners in the example CyberShake volume for USC below: &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_muscal_horiz_slice_z0.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
To solve this in a consistent and reproducible way, we will extend the MUSCAL model through UCVM.  The initial plan is to use the following algorithm:&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query 3D MUSCAL model at a given point.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;If the model returns NaNs (indicating the point is beyond the bounds of the 3D model):&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query the 1D MUSCAL model (Te-Yang to provide this).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine Vs30 from the USGS Global Vs30 model.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Apply the Ely taper (LVT) to the top 300 m.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30758</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30758"/>
		<updated>2026-04-27T20:47:12Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Regional Models */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model).  Next, regional and local models are added where they improve the fit, based on M4 simulations.  These simulations were run to 1 Hz with topography using a minimum Vs of 450 m/s.  Finally, a spatially varying taper following the Ely approach is added.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H v15.1.1&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H v15.1.1&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Below is a list of local high-resolution models that were evaluated for inclusion.&lt;br /&gt;
&lt;br /&gt;
Northern and Central California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest regional (Li and Ben-Zion (2024))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest fault zone (Zhou et al. (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Joaquin basin (Shaw &amp;amp; Plesch (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;SFCVM v21.1 (Hirakawa &amp;amp; Aagaard (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Southern California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Jacinto fault zone (Fang et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Gabriel-San Bernardino Basin (Li et al. (2023))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Imperial Valley (Persaud et al. (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Coachella Valley (Ajala et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Santa Maria Basin (Plesch et al. (2020))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies depending on the site and ranges from 300m to 1500m.&lt;br /&gt;
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model are &amp;lt;X x Y x Z&amp;gt;. Grid points have a spacing of &amp;lt;m&amp;gt;.  The model is interpolated using trilinear interpolation.&lt;br /&gt;
&lt;br /&gt;
Details about the model creation are provided in the paper and the [https://zenodo.org/records/19243477 Zenodo page].&lt;br /&gt;
&lt;br /&gt;
Below is a plot of the 3D model extent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  command  use:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Big dataset in netCDF ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points from -126 to -113&lt;br /&gt;
  latitude:  1251 points from 31 to 43.5&lt;br /&gt;
  depth:     671 points from 0 to 100,000 m&lt;br /&gt;
                   50 m spacing down to 30,000 m&lt;br /&gt;
                   1000 m spacing from 30,000 m to 100,000 m&lt;br /&gt;
&lt;br /&gt;
Depth profiles of Vs at 36.5054, -119.0587 plotted with different step increments.  No interpolation; data are accessed directly through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Small dataset in netCDF ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points, from -126 to -113&lt;br /&gt;
  latitude:  1251 points, from 31 to 43.5&lt;br /&gt;
  depth:     210 points, from 0 to 99,000 m&lt;br /&gt;
                   50 m increments up to 3,000 m&lt;br /&gt;
                   100 m increments up to 5,000 m&lt;br /&gt;
                   250 m increments up to 10,000 m&lt;br /&gt;
                   500 m increments up to 30,000 m&lt;br /&gt;
                   1,000 m increments up to 99,000 m&lt;br /&gt;
&lt;br /&gt;
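The two depth samplings listed above can be reconstructed from their increment tables; counting the levels reproduces the stated 671 (fixed-dz) and 210 (variable-dz) layer counts. A sketch, assuming each segment runs inclusively to its stop depth:&lt;br /&gt;

```python
def depth_axis(segments):
    """Build a variable-increment depth axis, in meters, starting at 0.

    segments: list of (step, stop) pairs; each segment extends the axis
    with the given step until the stop depth is reached (inclusive).
    """
    depths = [0]
    for step, stop in segments:
        d = depths[-1]
        while d < stop:
            d += step
            depths.append(d)
    return depths

# Big dataset: 50 m to 30 km, then 1,000 m to 100 km -> 671 levels
big = depth_axis([(50, 30000), (1000, 100000)])
# Small dataset: variable steps down to 99 km -> 210 levels
small = depth_axis([(50, 3000), (100, 5000), (250, 10000), (500, 30000), (1000, 99000)])
print(len(big), len(small))  # 671 210
```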
&lt;br /&gt;
Depth profiles at different step sizes: 50 m, 100 m, 500 m, and 1,000 m.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Read directly from the netCDF file (external file access):&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, without interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netCDF ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth coordinates, as well as the Vp, Vs, and rho values, are stored as float32.&lt;br /&gt;
&lt;br /&gt;
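Storing coordinates as float32 roughly halves the file size, and the rounding error it introduces is negligible relative to the 0.01-degree grid spacing (the dll0.01 in the file names). A quick check, using the sample longitude from the profiles above:&lt;br /&gt;

```python
import numpy as np

grid_spacing = 0.01                      # degrees, per the dll0.01 file naming
lon = -119.0587                          # sample longitude from the depth profiles
err = abs(float(np.float32(lon)) - lon)  # float64 -> float32 rounding error
print(err < grid_spacing / 100)  # True: error is orders of magnitude below the spacing
```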
Loaded in memory as binary data, with interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre- and post-processing steps are applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C access path is too slow to support the interactive nature of the explorer, and&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files&lt;br /&gt;
      * All data are stored as float32&lt;br /&gt;
&lt;br /&gt;
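The preprocessing idea above, flattening the gridded properties into raw float32 binary so a query becomes a direct byte-offset read, can be sketched as follows. The array shape, index ordering, and file name here are illustrative assumptions, not the actual UCVM layout:&lt;br /&gt;

```python
import numpy as np
import os
import tempfile

# Hypothetical (nlat, nlon, ndepth) Vs grid with plausible velocity values.
vs = np.random.default_rng(0).uniform(450, 4500, size=(4, 5, 6)).astype(np.float32)

# Write the grid as raw float32 bytes in C (row-major) order.
path = os.path.join(tempfile.mkdtemp(), "muscal_vs.bin")
vs.tofile(path)

# Read one value back by direct offset instead of going through the netCDF API.
flat = np.memmap(path, dtype=np.float32, mode="r")
i, j, k = 2, 3, 1                  # (lat, lon, depth) indices
offset = (i * 5 + j) * 6 + k       # row-major flat offset for shape (4, 5, 6)
print(bool(flat[offset] == vs[i, j, k]))  # True
```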
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final results from ucvm_query (with interpolation):&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;br /&gt;
&lt;br /&gt;
== Extended Model ==&lt;br /&gt;
&lt;br /&gt;
The 3D MUSCAL model is defined over the same volume as the CANVAS base model.  However, some simulation regions may extend beyond the boundaries, like the blue corners in the example CyberShake volume for USC below: &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_muscal_horiz_slice_z0.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
To solve this in a consistent and reproducible way, we will extend the MUSCAL model through UCVM.  The initial plan is to use the following algorithm:&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query 3D MUSCAL model at a given point.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;If the model returns NaNs (indicating the point is beyond the bounds of the 3D model):&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query the 1D MUSCAL model (Te-Yang to provide this).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine Vs30 from the USGS Global Vs30 model.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Apply the Ely taper (LVT) to the top 300m.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30757</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30757"/>
		<updated>2026-04-27T20:46:05Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Local Models */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model).  Next, regional and local models are added where they improve the fit, based on M4 simulations.  These simulations were run to 1 Hz with topography using a minimum Vs of 450 m/s.  Finally, a spatially varying taper following the Ely approach is added.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Below is a list of local high-resolution models that were evaluated for inclusion.&lt;br /&gt;
&lt;br /&gt;
Northern and Central California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest regional (Li and Ben-Zion (2024))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest fault zone (Zhou et al. (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Joaquin basin (Shaw &amp;amp; Plesch (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;SFCVM v21.1 (Hirakawa &amp;amp; Aagaard (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Southern California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Jacinto fault zone (Fang et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Gabriel-San Bernardino Basin (Li et al. (2023))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Imperial Valley (Persaud et al. (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Coachella Valley (Ajala et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Santa Maria Basin (Plesch et al. (2020))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies depending on the site and ranges from 300m to 1500m.&lt;br /&gt;
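For intuition, here is a simplified linear taper that blends from a Vs30-based surface value to the untapered model Vs at the taper depth. This is an illustrative stand-in only; the actual Ely-Jordan taper uses its own published depth-dependent coefficients, and the numbers below are made up:&lt;br /&gt;

```python
def tapered_vs(z, vs_model, vs30, z_taper):
    """Illustrative linear taper: Vs30 at the surface, untapered model Vs
    at depth z_taper and below (NOT the published Ely-Jordan form)."""
    if z >= z_taper:
        return vs_model
    w = z / z_taper  # 0 at the surface, 1 at the taper depth
    return vs30 * (1 - w) + vs_model * w

# Hypothetical site: model Vs 2500 m/s, Vs30 400 m/s, taper depth 700 m
print(tapered_vs(0, 2500.0, 400.0, 700.0))    # 400.0
print(tapered_vs(350, 2500.0, 400.0, 700.0))  # 1450.0
print(tapered_vs(700, 2500.0, 400.0, 700.0))  # 2500.0
```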
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model are &amp;lt;X x Y x Z&amp;gt;. Grid points have a spacing of &amp;lt;m&amp;gt;.  The model is interpolated using trilinear interpolation.&lt;br /&gt;
&lt;br /&gt;
Details about the model creation are provided in the paper and the [https://zenodo.org/records/19243477 Zenodo page].&lt;br /&gt;
&lt;br /&gt;
Below is a plot of the 3D model extent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Example command:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Big dataset in netCDF ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points, from -126 to -113&lt;br /&gt;
  latitude:  1251 points, from 31 to 43.5&lt;br /&gt;
  depth:     671 points, from 0 to 100,000 m&lt;br /&gt;
                   50 m increments to 30,000 m&lt;br /&gt;
                   1,000 m increments to 100,000 m&lt;br /&gt;
&lt;br /&gt;
Depth profile plots at 36.5054,-119.0587 with different step increments. No interpolation; the data are accessed directly through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Small dataset in netCDF ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points, from -126 to -113&lt;br /&gt;
  latitude:  1251 points, from 31 to 43.5&lt;br /&gt;
  depth:     210 points, from 0 to 99,000 m&lt;br /&gt;
                   50 m increments up to 3,000 m&lt;br /&gt;
                   100 m increments up to 5,000 m&lt;br /&gt;
                   250 m increments up to 10,000 m&lt;br /&gt;
                   500 m increments up to 30,000 m&lt;br /&gt;
                   1,000 m increments up to 99,000 m&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Depth profiles at different step sizes: 50 m, 100 m, 500 m, and 1,000 m.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Read directly from the netCDF file (external file access):&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, without interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netCDF ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth coordinates, as well as the Vp, Vs, and rho values, are stored as float32.&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, with interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre- and post-processing steps are applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C access path is too slow to support the interactive nature of the explorer, and&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files&lt;br /&gt;
      * All data are stored as float32&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final results from ucvm_query (with interpolation):&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;br /&gt;
&lt;br /&gt;
== Extended Model ==&lt;br /&gt;
&lt;br /&gt;
The 3D MUSCAL model is defined over the same volume as the CANVAS base model.  However, some simulation regions may extend beyond the boundaries, like the blue corners in the example CyberShake volume for USC below: &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_muscal_horiz_slice_z0.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
To solve this in a consistent and reproducible way, we will extend the MUSCAL model through UCVM.  The initial plan is to use the following algorithm:&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query 3D MUSCAL model at a given point.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;If the model returns NaNs (indicating the point is beyond the bounds of the 3D model):&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query the 1D MUSCAL model (Te-Yang to provide this).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine Vs30 from the USGS Global Vs30 model.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Apply the Ely taper (LVT) to the top 300m.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30756</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30756"/>
		<updated>2026-04-27T20:43:27Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model).  Next, regional and local models are added where they improve the fit, based on M4 simulations.  These simulations were run to 1 Hz with topography using a minimum Vs of 450 m/s.  Finally, a spatially varying taper following the Ely approach is added.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Below is a list of local high-resolution models that were evaluated for inclusion.&lt;br /&gt;
&lt;br /&gt;
Northern and Central California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest regional (Li and Ben-Zion (2024))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ridgecrest fault zone (Zhou et al. (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Joaquin basin (Shaw &amp;amp; Plesch (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;SFVM (Hirakawa &amp;amp; Aagaard (2022))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Southern California:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Jacinto fault zone (Fang et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;San Gabriel-San Bernardino Basin (Li et al. (2023))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Imperial Valley (Persaud et al. (2016))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Coachella Valley (Ajala et al. (2019))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Santa Maria Basin (Plesch et al. (2020))&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies depending on the site and ranges from 300m to 1500m.&lt;br /&gt;
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model are &amp;lt;X x Y x Z&amp;gt;. Grid points have a spacing of &amp;lt;m&amp;gt;.  The model is interpolated using trilinear interpolation.&lt;br /&gt;
&lt;br /&gt;
Details about the model creation are provided in the paper and the [https://zenodo.org/records/19243477 Zenodo page].&lt;br /&gt;
&lt;br /&gt;
Below is a plot of the 3D model extent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Example command:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Big dataset in netCDF ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points, from -126 to -113&lt;br /&gt;
  latitude:  1251 points, from 31 to 43.5&lt;br /&gt;
  depth:     671 points, from 0 to 100,000 m&lt;br /&gt;
                   50 m increments to 30,000 m&lt;br /&gt;
                   1,000 m increments to 100,000 m&lt;br /&gt;
&lt;br /&gt;
Depth profile plots at 36.5054,-119.0587 with different step increments. No interpolation; the data are accessed directly through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Small dataset in netCDF ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points, from -126 to -113&lt;br /&gt;
  latitude:  1251 points, from 31 to 43.5&lt;br /&gt;
  depth:     210 points, from 0 to 99,000 m&lt;br /&gt;
                   50 m increments up to 3,000 m&lt;br /&gt;
                   100 m increments up to 5,000 m&lt;br /&gt;
                   250 m increments up to 10,000 m&lt;br /&gt;
                   500 m increments up to 30,000 m&lt;br /&gt;
                   1,000 m increments up to 99,000 m&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Depth profiles at different step sizes: 50 m, 100 m, 500 m, and 1,000 m.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Read directly from the netCDF file (external file access):&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, without interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netCDF ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth coordinates, as well as the Vp, Vs, and rho values, are stored as float32.&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, with interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre- and post-processing steps are applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C access path is too slow to support the interactive nature of the explorer, and&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files&lt;br /&gt;
      * All data are stored as float32&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final results from ucvm_query (with interpolation):&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;br /&gt;
&lt;br /&gt;
== Extended Model ==&lt;br /&gt;
&lt;br /&gt;
The 3D MUSCAL model is defined over the same volume as the CANVAS base model.  However, some simulation regions may extend beyond the boundaries, like the blue corners in the example CyberShake volume for USC below: &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_muscal_horiz_slice_z0.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
To solve this in a consistent and reproducible way, we will extend the MUSCAL model through UCVM.  The initial plan is to use the following algorithm:&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query 3D MUSCAL model at a given point.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;If the model returns NaNs (indicating the point is beyond the bounds of the 3D model):&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query the 1D MUSCAL model (Te-Yang to provide this).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine Vs30 from the USGS Global Vs30 model.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Apply the Ely taper (LVT) to the top 300m.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30755</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30755"/>
		<updated>2026-04-24T22:06:15Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Extended Model */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model).  Next, regional and local models are added where they improve the fit, based on M4 simulations.  These simulations were run to 1 Hz with topography using a minimum Vs of 450 m/s.  Finally, a spatially varying taper following the Ely approach is added.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Local high-resolution models that were evaluated for inclusion are:&lt;br /&gt;
&amp;lt;list&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies by site, ranging from 300 m to 1500 m.&lt;br /&gt;
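The blending idea behind the taper can be sketched in a few lines. This is a deliberately simplified linear blend between the site Vs30 and the background model, NOT the actual Ely-Jordan formula; the `tapered_vs` helper and its arguments are hypothetical.

```python
def tapered_vs(vs_model, vs30, z, z_taper):
    """Simplified stand-in for a near-surface low-velocity taper: linearly
    blend from the site Vs30 at the surface (z = 0) to the background model
    Vs at the taper depth z_taper. Illustration only; the real Ely-Jordan
    taper uses a different, nonlinear functional form."""
    if z >= z_taper:
        return vs_model          # below the taper, the model is unchanged
    w = z / z_taper              # 0 at the surface, 1 at the taper depth
    return (1.0 - w) * vs30 + w * vs_model
```

For example, with a background Vs of 1000 m/s, a Vs30 of 400 m/s, and a 300 m taper depth, the blended Vs rises smoothly from 400 m/s at the surface back to 1000 m/s at 300 m.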
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model are &amp;lt;X x Y x Z&amp;gt;. Grid points have a spacing of &amp;lt;m&amp;gt;.  The model is interpolated using trilinear interpolation.&lt;br /&gt;
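The trilinear interpolation used between grid points can be illustrated with a minimal NumPy sketch (the `trilinear` helper, axis ordering, and uniform-spacing assumption are illustrative only, not the delivered MUSCAL reader):

```python
import numpy as np

def trilinear(grid, origin, spacing, point):
    """Hypothetical helper: trilinear interpolation of a 3-D array `grid`
    whose axes start at `origin` with uniform `spacing` per dimension,
    evaluated at an interior `point` (lon, lat, depth order assumed)."""
    # fractional grid index along each axis
    f = [(point[k] - origin[k]) / spacing[k] for k in range(3)]
    i0 = [int(np.floor(v)) for v in f]          # lower-corner index
    t = [f[k] - i0[k] for k in range(3)]        # weights in [0, 1]
    value = 0.0
    # weighted sum over the 8 corners of the enclosing cell
    for di in (0, 1):
        for dj in (0, 1):
            for dk in (0, 1):
                wx = t[0] if di else 1.0 - t[0]
                wy = t[1] if dj else 1.0 - t[1]
                wz = t[2] if dk else 1.0 - t[2]
                value += wx * wy * wz * grid[i0[0] + di, i0[1] + dj, i0[2] + dk]
    return value
```

Trilinear interpolation reproduces any linear field exactly, which makes it easy to spot-check against the gridded values.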
&lt;br /&gt;
Details about the model creation are provided in the paper and the [https://zenodo.org/records/19243477 Zenodo page].&lt;br /&gt;
&lt;br /&gt;
Below is a plot of the 3D model extent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Command used:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Big dataset in netCDF ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude:1301 from -126 to -113 &lt;br /&gt;
  latitude:1251  from 31 to 43.5&lt;br /&gt;
  depth:671      from 0 to 100,000 &lt;br /&gt;
                   50 m increments to 30,000&lt;br /&gt;
                   1,000 m increments to 100,000&lt;br /&gt;
&lt;br /&gt;
Depth profiles of Vs at 36.5054, -119.0587 plotted with different step increments.  No interpolation; data accessed directly through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Small dataset in netCDF ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude:1301 from -126 to -113&lt;br /&gt;
  latitude:1251  from 31 to 43.5&lt;br /&gt;
  depth:210      from 0 to 99,000&lt;br /&gt;
                   50 m increments to 3,000&lt;br /&gt;
                   100 m increments to 5,000&lt;br /&gt;
                   250 m increments to 10,000&lt;br /&gt;
                   500 m increments to 30,000&lt;br /&gt;
                   1,000 m increments to 99,000&lt;br /&gt;
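The quoted depth-layer counts (671 for the uniform-spacing dataset, 210 for the variable-spacing one) follow directly from the increment schedules; a quick arithmetic cross-check (illustrative only; `layer_count` is not part of UCVM):

```python
def layer_count(segments):
    """Count grid points for a stack of (start, stop, step) depth segments,
    including the single surface point at the first start."""
    n = 1  # the surface point at depth 0
    for start, stop, step in segments:
        n += (stop - start) // step
    return n

big = layer_count([(0, 30000, 50), (30000, 100000, 1000)])
small = layer_count([(0, 3000, 50), (3000, 5000, 100),
                     (5000, 10000, 250), (10000, 30000, 500),
                     (30000, 99000, 1000)])
print(big, small)  # 671 210
```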
&lt;br /&gt;
&lt;br /&gt;
Depth profiles at step sizes of 50 m, 100 m, 500 m, and 1000 m:&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Queried directly from the netCDF file on disk:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, no interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netCDF ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All coordinate arrays (longitude, latitude, depth) and material properties (vp, vs, rho) are stored as float32.&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, with interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre/post-processing steps were applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C library is too slow to support the interactive nature of the explorer, and because&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers.&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files.&lt;br /&gt;
      * All data are stored as float32.&lt;br /&gt;
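The binary conversion step can be sketched as follows. This is an illustration of the idea only: the arrays here are random stand-ins (in practice they would be read from the netCDF file, e.g. with the netCDF4 package), and the file names and layout are hypothetical, not the actual UCVM plugin format.

```python
import os
import tempfile

import numpy as np

# Stand-in material-property grids (lon x lat x depth); real data would
# come from the delivered netCDF model.
rng = np.random.default_rng(0)
nlon, nlat, ndep = 4, 4, 4
vp = rng.uniform(1500.0, 8000.0, size=(nlon, nlat, ndep))
vs = vp / 1.8
rho = 1000.0 + 0.3 * vp

# Cast to float32 and dump each property as a flat binary file.
outdir = tempfile.mkdtemp()
for name, arr in (("vp", vp), ("vs", vs), ("rho", rho)):
    arr.astype(np.float32).tofile(os.path.join(outdir, name + ".bin"))

# Reading back requires the grid shape from the model metadata, since the
# flat files carry no header.
vp32 = np.fromfile(os.path.join(outdir, "vp.bin"), dtype=np.float32)
vp32 = vp32.reshape(nlon, nlat, ndep)
```

Flat float32 files halve the storage of float64 and can be memory-mapped for fast point queries, at the cost of some precision and of carrying the grid shape separately.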
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final results from ucvm_query (with interpolation):&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;br /&gt;
&lt;br /&gt;
== Extended Model ==&lt;br /&gt;
&lt;br /&gt;
The 3D MUSCAL model is defined over the same volume as the CANVAS base model.  However, some simulation regions may extend beyond the boundaries, like the blue corners in the example CyberShake volume for USC below: &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_muscal_horiz_slice_z0.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
To solve this in a consistent and reproducible way, we will extend the MUSCAL model through UCVM.  The initial plan is to use the following algorithm:&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query 3D MUSCAL model at a given point.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;If the model returns NaNs (indicating the point is beyond the bounds of the 3D model):&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Query the 1D MUSCAL model (Te-Yang to provide this).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine Vs30 from the USGS Global Vs30 model.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Apply the Ely taper (LVT) to the top 300m.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:USC_muscal_horiz_slice_z0.png&amp;diff=30754</id>
		<title>File:USC muscal horiz slice z0.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:USC_muscal_horiz_slice_z0.png&amp;diff=30754"/>
		<updated>2026-04-24T22:04:34Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30753</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30753"/>
		<updated>2026-04-24T22:04:13Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model).  Next, regional and local models are added where they improve the fit, based on M4 simulations.  These simulations were run to 1 Hz with topography using a minimum Vs of 450 m/s.  Finally, a spatially varying taper following the Ely approach is added.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Local high-resolution models that were evaluated for inclusion are:&lt;br /&gt;
&amp;lt;list&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies depending on the site and ranges from 300m to 1500m.&lt;br /&gt;
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model are &amp;lt;X x Y x Z&amp;gt;. Grid points have a spacing of &amp;lt;m&amp;gt;.  The model is interpolated using trilinear interpolation.&lt;br /&gt;
&lt;br /&gt;
Details about the model creation are provided in the paper and the [https://zenodo.org/records/19243477 Zenodo page].&lt;br /&gt;
&lt;br /&gt;
Below is a plot of the 3D model extent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Command used:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== big data in netcdf ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude:1301 from -126 to -113 &lt;br /&gt;
  latitude:1251  from 31 to 43.5&lt;br /&gt;
  depth:671      from 0 to 100,000 &lt;br /&gt;
                   50 m increments to 30,000&lt;br /&gt;
                   1,000 m increments to 100,000&lt;br /&gt;
&lt;br /&gt;
Depth profiles of Vs at 36.5054, -119.0587 plotted with different step increments.  No interpolation; data accessed directly through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== small data in netcdf ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude:1301 from -126 to -113&lt;br /&gt;
  latitude:1251  from 31 to 43.5&lt;br /&gt;
  depth:210      from 0 to 99,000&lt;br /&gt;
                   50 m increments to 3,000&lt;br /&gt;
                   100 m increments to 5,000&lt;br /&gt;
                   250 m increments to 10,000&lt;br /&gt;
                   500 m increments to 30,000&lt;br /&gt;
                   1,000 m increments to 99,000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Depth profiles in different steps : 50m,100m,500m,1000m&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Direct from netcdf as external file&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in-memory as binary data no interpolation&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netcdf ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth points are now saved as float32, as well as the vp, vs, and rho. &lt;br /&gt;
&lt;br /&gt;
Loaded in-memory as binary data with interpolation&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre/post-processing steps were applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C library is too slow to support the interactive nature of the explorer, and because&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers.&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files.&lt;br /&gt;
      * All data are stored as float32.&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final result from ucvm_query, (with interpolation)&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;br /&gt;
&lt;br /&gt;
== Extended Model ==&lt;br /&gt;
&lt;br /&gt;
The 3D MUSCAL model is defined over the same volume as the CANVAS base model.  However, some simulation regions may extend beyond the boundaries, like the blue corners in the example CyberShake volume for USC below: &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:USC_muscal_horiz_slice_z0.png|thumb|400px]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30752</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30752"/>
		<updated>2026-04-24T21:08:59Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Model Components */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model).  Next, regional and local models are added where they improve the fit, based on M4 simulations.  These simulations were run to 1 Hz with topography using a minimum Vs of 450 m/s.  Finally, a spatially varying taper following the Ely approach is added.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Local high-resolution models that were evaluated for inclusion are:&lt;br /&gt;
&amp;lt;list&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies depending on the site and ranges from 300m to 1500m.&lt;br /&gt;
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model are &amp;lt;X x Y x Z&amp;gt;. Grid points have a spacing of &amp;lt;m&amp;gt;.  The model is intended to be interpolated using trilinear interpolation.&lt;br /&gt;
&lt;br /&gt;
Details about the model creation are provided in &amp;lt;paper&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Below is a plot of the model extent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Command used:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== big data in netcdf ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude:1301 from -126 to -113&lt;br /&gt;
  latitude:1251  from 31 to 43.5&lt;br /&gt;
  depth:671      from 0 to 100,000&lt;br /&gt;
                   50 m increments to 30,000&lt;br /&gt;
                   1,000 m increments to 100,000&lt;br /&gt;
&lt;br /&gt;
Depth profiles of Vs at 36.5054, -119.0587 plotted with different step increments.  No interpolation; data accessed directly through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== small data in netcdf ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude:1301 from -126 to -113&lt;br /&gt;
  latitude:1251  from 31 to 43.5&lt;br /&gt;
  depth:210      from 0 to 99,000&lt;br /&gt;
                   50 m increments to 3,000&lt;br /&gt;
                   100 m increments to 5,000&lt;br /&gt;
                   250 m increments to 10,000&lt;br /&gt;
                   500 m increments to 30,000&lt;br /&gt;
                   1,000 m increments to 99,000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Depth profiles in different steps : 50m,100m,500m,1000m&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Direct from netcdf as external file&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in-memory as binary data no interpolation&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netcdf ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth points are now saved as float32, as well as the vp, vs, and rho. &lt;br /&gt;
&lt;br /&gt;
Loaded in-memory as binary data with interpolation&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre/post-processing steps were applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C library is too slow to support the interactive nature of the explorer, and because&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers.&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files.&lt;br /&gt;
      * All data are stored as float32.&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final result from ucvm_query, (with interpolation)&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30751</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30751"/>
		<updated>2026-04-24T18:14:07Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== Model overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
=== Model Components ===&lt;br /&gt;
&lt;br /&gt;
The model was built by starting with a base model (the CANVAS (Doody et al., 2023) tomography model) and then adding regional and local models where they improve the fit, based on M4 simulations.&lt;br /&gt;
&lt;br /&gt;
==== Regional Models ====&lt;br /&gt;
&lt;br /&gt;
Regional models evaluated for inclusion are:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Northern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Lin et al. (2010)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Furlong et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Central California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CCA-06&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Guo et al. (2024)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Southern California&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Fang et al. (2022)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-S4.26&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;CVM-H &amp;lt;what version?&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Berg et al. (2021)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Local Models ====&lt;br /&gt;
&lt;br /&gt;
Local high-resolution models that were evaluated for inclusion are:&lt;br /&gt;
&amp;lt;list&amp;gt;&lt;br /&gt;
&lt;br /&gt;
==== Taper ====&lt;br /&gt;
&lt;br /&gt;
MUSCAL includes a low-velocity taper using the Ely-Jordan approach.  Vs30 for this method is produced by merging the Thompson (2018) Vs30 map with the [https://earthquake.usgs.gov/data/vs30/ USGS global Vs30 dataset].&lt;br /&gt;
&lt;br /&gt;
The depth of the taper varies depending on the site and ranges from 300m to 1500m.&lt;br /&gt;
&lt;br /&gt;
== Delivered Model ==&lt;br /&gt;
&lt;br /&gt;
The final version of MUSCAL is delivered in netCDF format.  The dimensions of the model are &amp;lt;X x Y x Z&amp;gt;. Grid points have a spacing of &amp;lt;m&amp;gt;.  The model is intended to be interpolated using trilinear interpolation.&lt;br /&gt;
&lt;br /&gt;
Details about the model creation are provided in &amp;lt;paper&amp;gt;.&lt;br /&gt;
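As a reminder of what trilinear interpolation does between grid points, here is a minimal sketch for interior points of a unit-spaced grid (the function name and indexing convention are illustrative, not part of the delivered model):&lt;br /&gt;

```python
import numpy as np

def trilinear(grid, x, y, z):
    """Trilinear interpolation at fractional indices (x, y, z) of a 3-D
    grid with unit spacing. Valid for interior points only (sketch)."""
    ix, iy, iz = int(x), int(y), int(z)
    fx, fy, fz = x - ix, y - iy, z - iz
    c = grid[ix:ix + 2, iy:iy + 2, iz:iz + 2].astype(float)
    c = c[0] * (1 - fx) + c[1] * fx   # collapse the x axis
    c = c[0] * (1 - fy) + c[1] * fy   # collapse the y axis
    return c[0] * (1 - fz) + c[1] * fz

# Trilinear interpolation is exact for a field linear in each coordinate.
grid = np.fromfunction(lambda i, j, k: i + 2 * j + 3 * k, (3, 3, 3))
assert abs(trilinear(grid, 0.5, 0.25, 0.75) - 3.25) < 1e-12
```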
&lt;br /&gt;
Below is a plot of the model extent.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
UTM zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Example command:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Big dataset in netCDF ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 from -126 to -113&lt;br /&gt;
  latitude: 1251  from 31 to 43.5&lt;br /&gt;
  depth: 671      from 0 to 100,000 m&lt;br /&gt;
                    50 m increments to 30,000 m&lt;br /&gt;
                    1,000 m increments from 30,000 m to 100,000 m&lt;br /&gt;
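The depth axis described above can be reconstructed and checked with a few lines of NumPy. This is a sketch against the documented spacing, not code shipped with the model:&lt;br /&gt;

```python
import numpy as np

# 50 m spacing from the surface to 30 km, then 1 km spacing to 100 km.
shallow = np.arange(0, 30_000 + 50, 50)           # 601 levels: 0 .. 30,000
deep = np.arange(31_000, 100_000 + 1_000, 1_000)  # 70 levels: 31,000 .. 100,000
depth = np.concatenate([shallow, deep])

assert depth.size == 671                          # matches "depth: 671"
assert depth[0] == 0 and depth[-1] == 100_000
```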
&lt;br /&gt;
Depth profiles of Vs at (36.5054, -119.0587) with different step increments, with no interpolation, accessing the data through the netCDF API.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Small dataset in netCDF ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 from -126 to -113&lt;br /&gt;
  latitude: 1251  from 31 to 43.5&lt;br /&gt;
  depth: 210      from 0 to 99,000 m&lt;br /&gt;
                    50 m increments up to 3,000 m&lt;br /&gt;
                    100 m increments up to 5,000 m&lt;br /&gt;
                    250 m increments up to 10,000 m&lt;br /&gt;
                    500 m increments up to 30,000 m&lt;br /&gt;
                    1,000 m increments up to 99,000 m&lt;br /&gt;
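The variable-spacing depth axis can likewise be rebuilt from the (spacing, limit) pairs above and checked against the documented 210 levels. Again, this is a sketch, not shipped code:&lt;br /&gt;

```python
import numpy as np

# (spacing in m, depth in m up to which that spacing applies)
segments = [(50, 3_000), (100, 5_000), (250, 10_000),
            (500, 30_000), (1_000, 99_000)]

levels = [0]
for step, limit in segments:
    z = levels[-1] + step
    while z <= limit:
        levels.append(z)
        z += step
depth = np.array(levels)

assert depth.size == 210                 # matches "depth: 210"
assert depth[-1] == 99_000
```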
&lt;br /&gt;
&lt;br /&gt;
Depth profiles at different step increments: 50 m, 100 m, 500 m, and 1,000 m.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Queried directly from the netCDF file on disk:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, with no interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netCDF ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth coordinates, as well as the Vp, Vs, and density (rho) values, are now stored as float32.&lt;br /&gt;
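A quick back-of-the-envelope size check shows why float32 storage plus netCDF compression brings the file down to 1.4 GB: the raw float32 payload for the three variables on this grid is about 4.1 GB before compression (coordinate axes are negligible by comparison):&lt;br /&gt;

```python
nlon, nlat, ndep = 1301, 1251, 210   # grid dimensions from above
nvars = 3                            # vp, vs, rho
bytes_per_value = 4                  # float32

raw_bytes = nlon * nlat * ndep * nvars * bytes_per_value
assert raw_bytes == 4_101_428_520    # ~4.1 GB before compression
```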
&lt;br /&gt;
Loaded in memory as binary data, with interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre- and post-processing steps are applied to the original MUSCAL model (netCDF format) in order to&lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C code is too slow to support the interactive nature of the explorer, and because&lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * The number of depth layers is reduced, with deeper layers merged into fewer layers&lt;br /&gt;
      * The netCDF data is preprocessed into binary data files&lt;br /&gt;
      * All data are stored as float32&lt;br /&gt;
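The binary preprocessing step can be sketched as follows: a variable is written out as a flat float32 file and memory-mapped back for fast access. The tiny grid, file name, and C-order (lon, lat, depth) layout here are assumptions for illustration, not the documented on-disk format:&lt;br /&gt;

```python
import os
import tempfile

import numpy as np

# Hypothetical tiny grid standing in for one MUSCAL variable (e.g. Vs).
vs = np.random.default_rng(0).uniform(200, 4500, (4, 5, 6)).astype(np.float32)

# Dump it as a flat float32 binary file in C order.
path = os.path.join(tempfile.mkdtemp(), "muscal_vs.bin")
vs.tofile(path)

# Memory-map the file back: fast random access with no netCDF parsing.
vs_mm = np.memmap(path, dtype=np.float32, mode="r", shape=vs.shape)
assert np.array_equal(np.asarray(vs_mm), vs)
```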
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final results from ucvm_query (with interpolation):&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30750</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30750"/>
		<updated>2026-04-22T04:17:55Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2026 ==&lt;br /&gt;
*SSA CyberShake presentation: [[:File:2026_SSA_CyberShake_presentation.pptx | PPTX]], [[:File:2026_SSA_CyberShake_presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*SC25 presentation at OSU booth: [[File:AWP_presentation_for_OSU_booth.pptx | PPTX]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2025_IHPCSS_workflows.pptx | PPTX]], [[:File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[:File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[:File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx |  HPC tutorial (PPTX)]]  [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]  [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:2026_SSA_CyberShake_presentation.pptx&amp;diff=30749</id>
		<title>File:2026 SSA CyberShake presentation.pptx</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:2026_SSA_CyberShake_presentation.pptx&amp;diff=30749"/>
		<updated>2026-04-22T04:17:41Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:2026_SSA_CyberShake_presentation.pdf&amp;diff=30748</id>
		<title>File:2026 SSA CyberShake presentation.pdf</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:2026_SSA_CyberShake_presentation.pdf&amp;diff=30748"/>
		<updated>2026-04-22T04:16:19Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30747</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30747"/>
		<updated>2026-04-22T04:15:30Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2026 ==&lt;br /&gt;
*SSA CyberShake presentation: [[File:2026_SSA_CyberShake_presentation.pptx | PPTX]], [[File:2026_SSA_CyberShake_presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*SC25 presentation at OSU booth: [[File:AWP_presentation_for_OSU_booth.pptx | PPTX]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2025_IHPCSS_workflows.pptx | PPTX]], [[:File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[:File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[:File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx |  HPC tutorial (PPTX)]]  [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]  [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:AWP_presentation_for_OSU_booth.pptx&amp;diff=30746</id>
		<title>File:AWP presentation for OSU booth.pptx</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:AWP_presentation_for_OSU_booth.pptx&amp;diff=30746"/>
		<updated>2026-04-09T21:36:57Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30745</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30745"/>
		<updated>2026-04-09T21:32:50Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*SC25 presentation at OSU booth: [[File:AWP_presentation_for_OSU_booth.pptx | PPTX]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2025_IHPCSS_workflows.pptx | PPTX]], [[:File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[:File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[:File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[:File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx |  HPC tutorial (PPTX)]]  [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]  [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30744</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30744"/>
		<updated>2026-04-09T21:30:45Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*SC25 presentation at OSU booth: [[:File:AWP_presentation_for_OSU_booth.pptx | PPTX]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2025_IHPCSS_workflows.pptx | PPTX]], [[:File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[:File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[:File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[:File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx |  HPC tutorial (PPTX)]]  [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]  [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30736</id>
		<title>UCVM muscal</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_muscal&amp;diff=30736"/>
		<updated>2026-04-02T16:00:32Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page describes the Multi-Scale CALifornia (MUSCAL) velocity model and its integration into UCVM.&lt;br /&gt;
&lt;br /&gt;
== MUSCAL overview ==&lt;br /&gt;
&lt;br /&gt;
The MUSCAL Vp and Vs model was created by Te-Yang Yeh and Yehuda Ben-Zion in 2025-6 by starting with the CANVAS tomography model as a base and then integrating multiple regional and local high-resolution models.  Each submodel is evaluated to determine where it improves the fit through simulations of M4 historical events to 1 Hz, and then the overall model is updated where the submodel improves the fit.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Datasets ===&lt;br /&gt;
&lt;br /&gt;
UTM zone: 11&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  Example command:&lt;br /&gt;
     plot_depth_profile.py -n $UCVM_INSTALL_PATH/conf/ucvm.conf -i $UCVM_INSTALL_PATH -d vs -c muscal &lt;br /&gt;
                           -o muscal_small_depth_1000.png -C 'Multi-Scale Statewide California Velocity Model'&lt;br /&gt;
                           -v 1000 -b 0 -s 36.5054,-119.0587 -e 30000&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Big dataset in netcdf ====&lt;br /&gt;
  &lt;br /&gt;
  model_MSCAL_CANVAS_dll0.01_dz50_cmpd.nc&lt;br /&gt;
  4.5G&lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points from -126 to -113&lt;br /&gt;
  latitude:  1251 points from 31 to 43.5&lt;br /&gt;
  depth:     671 points from 0 to 100,000&lt;br /&gt;
                   50 m increments up to 30,000&lt;br /&gt;
                   1,000 m increments up to 100,000&lt;br /&gt;
&lt;br /&gt;
Depth profiles at (36.5054, -119.0587), plotted with different step increments.  No interpolation; data accessed using the netCDF API.&lt;br /&gt;
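Since the grid is regular at 0.01-degree spacing, a point query reduces to index arithmetic.  A minimal sketch (pure Python; the axis origins and counts are taken from the header listing on this page, and treating the no-interpolation mode as nearest-neighbor is an assumption):&lt;br /&gt;

```python
# Nearest-neighbor index lookup on the regular 0.01-degree MUSCAL grid.
# Axis origins and counts are from the netCDF header listing above;
# nearest-neighbor matching for "no interpolation" is an assumption.
LON0, LAT0, DLL = -126.0, 31.0, 0.01
NLON, NLAT = 1301, 1251

def grid_indices(lat, lon):
    """Return (lon_index, lat_index) of the nearest grid node."""
    i = int(round((lon - LON0) / DLL))
    j = int(round((lat - LAT0) / DLL))
    if not (0 <= i < NLON and 0 <= j < NLAT):
        raise ValueError("point outside the model region")
    return i, j

# The depth-profile point used throughout this page:
print(grid_indices(36.5054, -119.0587))  # (694, 551)
```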
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:muscal_big_deth_50.png|thumb|300px|muscal big 50 vs]]&lt;br /&gt;
| [[FILE:muscal_big_deth_100.png|thumb|300px|muscal big 100 vs]]&lt;br /&gt;
| [[FILE:muscal_big_depth_500.png|thumb|300px|muscal big 500 vs]] &lt;br /&gt;
| [[FILE:muscal_big_deth_1000.png|thumb|300px|muscal big 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_big_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_big_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Small dataset in netcdf ====&lt;br /&gt;
   model_MUSCAL_CANVAS_dll0.01_vardz_cmpd.nc &lt;br /&gt;
   2.1G &lt;br /&gt;
&lt;br /&gt;
  longitude: 1301 points from -126 to -113&lt;br /&gt;
  latitude:  1251 points from 31 to 43.5&lt;br /&gt;
  depth:     210 points from 0 to 99,000&lt;br /&gt;
                   50 m increments up to 3,000&lt;br /&gt;
                   100 m increments up to 5,000&lt;br /&gt;
                   250 m increments up to 10,000&lt;br /&gt;
                   500 m increments up to 30,000&lt;br /&gt;
                   1,000 m increments up to 99,000&lt;br /&gt;
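The banded spacing can be sanity-checked against the stated 210 depth points.  A short sketch (assumes each band's step runs up to, and including, its listed maximum depth):&lt;br /&gt;

```python
# Reconstruct the variable-dz depth axis described above and confirm it
# yields 210 layers.  Assumes each band's step runs up to and including
# its listed maximum depth.
bands = [(50, 3000), (100, 5000), (250, 10000), (500, 30000), (1000, 99000)]

depths = [0]
for step, top in bands:
    d = depths[-1] + step
    while d <= top:
        depths.append(d)
        d += step

print(len(depths))  # 210
```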
&lt;br /&gt;
&lt;br /&gt;
Depth profiles at different step sizes: 50 m, 100 m, 500 m, 1000 m&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Queried directly from the netCDF file, as an external file:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_depth_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_depth_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_deth_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_depth-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_depth-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, no interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_no_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_no_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_no_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_no_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_no_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
==== Another small dataset in netcdf ====&lt;br /&gt;
&lt;br /&gt;
  model_MUSCAL_CANVAS_dll0.01_vardz_float32_cmpd.nc&lt;br /&gt;
  1.4G&lt;br /&gt;
&lt;br /&gt;
  All longitude, latitude, and depth coordinates, as well as vp, vs, and rho, are now saved as float32.&lt;br /&gt;
&lt;br /&gt;
Loaded in memory as binary data, with interpolation:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
|[[FILE:muscal_small_interp_50.png|thumb|300px|muscal small 50 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_100.png|thumb|300px|muscal small 100 vs]]&lt;br /&gt;
|[[FILE:muscal_small_interp_500.png|thumb|300px|muscal small 500 vs]] &lt;br /&gt;
|[[FILE:muscal_small_interp_1000.png|thumb|300px|muscal small 1000 vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
 [[FILE:muscal_small_interp-50_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-100_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-500_matprops.json.txt]]&lt;br /&gt;
 [[FILE:muscal_small_interp-1000_matprops.json.txt]]&lt;br /&gt;
&lt;br /&gt;
=== Validation ===&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Several pre/post-processing steps are applied to the original MUSCAL model (netCDF format) in order to &lt;br /&gt;
incorporate it into UCVM and the CVM Explorer.&lt;br /&gt;
&lt;br /&gt;
   Because the netCDF-C query code is too slow to support the interactive nature of the explorer, and &lt;br /&gt;
   interpolation on query is desired:&lt;br /&gt;
&lt;br /&gt;
      * the number of depth layers is reduced, with deeper layers merged into fewer layers&lt;br /&gt;
      * the netCDF data is preprocessed into binary data files&lt;br /&gt;
      * all data are stored as float32&lt;br /&gt;
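Interpolation on query along the non-uniform depth axis presumably amounts to linear interpolation between the two bracketing stored layers.  A minimal 1-D sketch (the function name and sample values are illustrative, not from the UCVM code):&lt;br /&gt;

```python
from bisect import bisect_right

def interp_depth(depths, values, z):
    """Linearly interpolate a material property at depth z along a
    non-uniform depth axis (illustrative stand-in for the query path)."""
    if not (depths[0] <= z <= depths[-1]):
        raise ValueError("depth out of range")
    k = bisect_right(depths, z) - 1       # index of the layer at or above z
    if k == len(depths) - 1:
        return values[-1]
    t = (z - depths[k]) / (depths[k + 1] - depths[k])
    return values[k] * (1 - t) + values[k + 1] * t

# Example: Vs at 75 m, halfway between the 50 m and 100 m samples
print(interp_depth([0, 50, 100, 150], [500.0, 600.0, 800.0, 900.0], 75.0))  # 700.0
```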
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.txt]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Final results from ucvm_query (with interpolation):&lt;br /&gt;
&lt;br /&gt;
 [[FILE:MUSCAL_test_points_deep.final.txt]]&lt;br /&gt;
 [[FILE:MUSCAL_test_points_shallow.final.txt]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Visualizing_AWP-ODC_Output&amp;diff=30643</id>
		<title>Visualizing AWP-ODC Output</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Visualizing_AWP-ODC_Output&amp;diff=30643"/>
		<updated>2026-02-23T22:15:28Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents a procedure to visualize AWP-ODC velocity output, using remote Paraview.  So far, this procedure has only been tested on Titan using Paraview 4.4.0.&lt;br /&gt;
&lt;br /&gt;
== Configure the simulation ==&lt;br /&gt;
&lt;br /&gt;
When running the AWP-ODC simulation, make sure it is configured to produce velocity output.&lt;br /&gt;
&lt;br /&gt;
Take note of the following parameters:&lt;br /&gt;
*NX, NY, NZ (and NBGX, NEDX, NBGY, NEDY, NBGZ, NEDZ if specified)&lt;br /&gt;
*WRITE_STEP&lt;br /&gt;
*NSKPX, NSKPY, NSKPZ&lt;br /&gt;
*NTISKP&lt;br /&gt;
&lt;br /&gt;
They'll be needed when producing the output.&lt;br /&gt;
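The parameters above determine the shape of the output volume.  A sketch, assuming the NBGX..NEDX index ranges are inclusive and decimated by the NSKP values, with one velocity snapshot every NTISKP timesteps (the numbers below are illustrative, not from a real run):&lt;br /&gt;

```python
# Map AWP-ODC output parameters to the decimated output volume shape.
# Assumes inclusive index ranges decimated by NSKPX/NSKPY/NSKPZ and one
# snapshot every NTISKP timesteps; the sample values are made up.
def output_shape(nbg, ned, nskp):
    return (ned - nbg) // nskp + 1

NX, NY, NZ = 1400, 1200, 400        # example simulation grid
NSKPX, NSKPY, NSKPZ = 2, 2, 400     # e.g. keep only the surface layer in Z
NT, NTISKP = 20000, 10

nx_out = output_shape(1, NX, NSKPX)
ny_out = output_shape(1, NY, NSKPY)
nz_out = output_shape(1, NZ, NSKPZ)
n_snapshots = NT // NTISKP
print(nx_out, ny_out, nz_out, n_snapshots)  # 700 600 1 2000
```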
&lt;br /&gt;
== Create a Paraview XDMF file ==&lt;br /&gt;
&lt;br /&gt;
Once the simulation is complete, we must create a configuration file which tells Paraview how the velocity data is laid out on disk.&lt;br /&gt;
&lt;br /&gt;
For this, run the script [https://github.com/SCECcode/cybershake-core/blob/main/Utils/visualizer/create_timeseries_file.py create_timeseries_file.py].&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Usage: ./create_timeseries_file.py &amp;lt;X dim in output&amp;gt; &amp;lt;Y dim in output&amp;gt; &amp;lt;grid decimation&amp;gt; &amp;lt;NT in sim&amp;gt; &amp;lt;DT of sim&amp;gt; &amp;lt;NTISKP&amp;gt; &amp;lt;timesteps output per file&amp;gt; &amp;lt;prefix&amp;gt; &amp;lt;output file&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Based on this, an XDMF (XML) file is produced which can serve as input to Paraview.&lt;br /&gt;
&lt;br /&gt;
== Create traces ==&lt;br /&gt;
&lt;br /&gt;
Often, other visual references are needed to make the visualization meaningful.  The approach outlined uses a Cartesian grid projection to display the velocity results.  This means that to display other data, we have to convert the data from (lat, lon) into (X index, Y index).&lt;br /&gt;
&lt;br /&gt;
To do this, use the code get_grid_values.c, in https://github.com/SCECcode/cybershake-core/blob/main/PreCVM/GenGrid_py/src/get_grid_values.c .  It takes a path to a gridfile generated from PreCVM; the modellat, modellon, and modelrot; an input CSV file in lat,lon format; a path to the output file; the grid spacing; and the xshift and yshift from the model_params file.  geoproj should always be set to 1, which is Rob's great circle projection.  Here's a sample invocation:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./get_grid_values gridfile=gridfile_fwd modellat=35.70115 modellon=-119.75587 modelrot=-30.0 infile=CA_trace.csv outfile=ca_trace_grid.csv grid_spacing_km=0.1 xshift=-149.95001 yshift=-134.95001 geoproj=1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The directory https://github.com/SCECcode/cybershake-core/blob/main/Utils/visualizer/ contains trace files for California, CCA, USGS, and cvms4/cvms4.26/cvms5 models in lat,lon format, ready for use with get_grid_values.&lt;br /&gt;
&lt;br /&gt;
== Use local Paraview to connect to remote nodes ==&lt;br /&gt;
&lt;br /&gt;
The easiest way to visualize the data is to leave it on the remote system, and use the client/server functionality of Paraview.  In this mode, Paraview connects to a remote system and uses the remote cluster to drive the rendering.&lt;br /&gt;
&lt;br /&gt;
Follow the instructions at [https://www.olcf.ornl.gov/tutorials/running-paraview-on-titan/ Titan's Paraview guide], under &amp;quot;Interactive Mode&amp;quot;, to create a connection.&lt;br /&gt;
&lt;br /&gt;
== Load datasets ==&lt;br /&gt;
&lt;br /&gt;
Once the job starts on Titan, you can begin loading data.&lt;br /&gt;
&lt;br /&gt;
=== Velocity data ===&lt;br /&gt;
&lt;br /&gt;
To load velocity data, go to File-&amp;gt;Open and select the XDMF file you previously made.  It will appear in the 'Pipeline Browser' window in the upper left.  You may need to click on the eye next to it to get it to display. Additionally, the data is rendered on the Y/Z plane (there is a reason I had to set it up this way, but now I forget why), so click on the '-X' icon to the right of the Slice pull-down in the toolbar.  You should see the rectangle defined by your simulation region.  You can change the timestep you're looking at by changing the value in the text box to the right of &amp;quot;Time:&amp;quot;.&lt;br /&gt;
&lt;br /&gt;
=== Traces ===&lt;br /&gt;
&lt;br /&gt;
Loading traces is more involved.  For each trace:&lt;br /&gt;
&lt;br /&gt;
#Go to File-&amp;gt;Open and select the .csv trace file you created earlier.  The .csv file should appear in the Pipeline Browser window on the left.&lt;br /&gt;
#On the Properties tab below the Pipeline Browser, de-select 'Have Headers' and click 'Apply'.  A spreadsheet will appear.&lt;br /&gt;
#Go to Filters-&amp;gt;Alphabetical-&amp;gt;Table to Points.  This will open a new filter, below your .csv file in the Pipeline Browser.&lt;br /&gt;
#In the Properties tab, change the Y Column pull-down to &amp;quot;Field 1&amp;quot;, and the Z column pull-down to &amp;quot;Field 2&amp;quot;.  Click &amp;quot;Apply&amp;quot;.  You have now converted the .csv into a series of points.  You can close the spreadsheet.  If you click on the eye next to 'TableToPoints1' in the Pipeline Browser, you will see your outline appear, but as a series of points.  We will next convert it to line segments.&lt;br /&gt;
#Go to Filters-&amp;gt;Alphabetical-&amp;gt;Programmable Filter.  This filter allows you to write your own VTK code, in Python, to do whatever you like.  We will write code which will convert the series of points into line segments.&lt;br /&gt;
#Copy and paste the following into the 'Script' section of the Properties tab:&lt;br /&gt;
 # convert the series of points into a chain of VTK line cells&lt;br /&gt;
 pdi = self.GetPolyDataInput()&lt;br /&gt;
 pdo = self.GetPolyDataOutput()&lt;br /&gt;
 numPoints = pdi.GetNumberOfPoints()&lt;br /&gt;
 pdo.Allocate()&lt;br /&gt;
 for i in range(0, numPoints-1):&lt;br /&gt;
   # cell type 3 is VTK_LINE; each cell joins 2 consecutive points&lt;br /&gt;
   points = [i, i+1]&lt;br /&gt;
   pdo.InsertNextCell(3, 2, points)&lt;br /&gt;
#Click 'Apply'.  It may take a moment, depending on how many points you have.  You should now see the outline of the object.  If you don't, try toggling the eye.&lt;br /&gt;
#To change the color of the object, click on &amp;quot;ProgrammableFilter1&amp;quot;, scroll down in the Properties tab, and under Coloring click the Edit button and pick a new color.  You can also change the Line Width in the Styling section.&lt;br /&gt;
&lt;br /&gt;
Repeat this until all your traces have been loaded.&lt;br /&gt;
&lt;br /&gt;
== Modify visuals ==&lt;br /&gt;
&lt;br /&gt;
You may need to make additional modifications to your plot to make it look as you like.&lt;br /&gt;
&lt;br /&gt;
=== Add timestamp ===&lt;br /&gt;
&lt;br /&gt;
To display the timestamp on the plot, go to Filters-&amp;gt;Temporal-&amp;gt;Annotate Time Filter.  Click 'Apply' in the Properties tab to make it appear.  You can click and drag it on the Layout to move it around.  If you want to show fewer decimal places, change the 'Format' section in the Properties tab.&lt;br /&gt;
&lt;br /&gt;
=== Change plot colors ===&lt;br /&gt;
&lt;br /&gt;
I'm not sure how the default plot colors are selected, but they're usually not symmetric around 0 which isn't what we want.&lt;br /&gt;
&lt;br /&gt;
To change the colors, click on the XDMF file in the Pipeline Browser.  The Color Map Editor will appear on the right side.&lt;br /&gt;
&lt;br /&gt;
#Change the max and min to be symmetric around 0.  To determine what the max and min should be:&lt;br /&gt;
##Select a timestep about 10% of the way through the simulation.&lt;br /&gt;
##Click on the Information tab (next to the Properties tab).  It will show you what the Data Range is for velocity, an easy way to figure out how extreme your values are.&lt;br /&gt;
##I usually select a max and min about 10% of the global max and min.  I find this gives a good balance between too much saturation and not losing the small values, but you may prefer other settings.&lt;br /&gt;
#Click on the 'Rescale to Custom Data Range' button.  It's 2 buttons to the left of the pull-down that says 'X Velocity', with what looks like an arrow with a C.  Set your new max and min.&lt;br /&gt;
#If you want to change the color scheme, click on the 'Choose Preset' button in the Color Map Editor on the right.  It's the one with a heart on it.  Select a new color scheme.&lt;br /&gt;
#If you want to alter how quickly the color changes from neutral to red or blue, you can click on the horizontal bar (not the triangle part, that controls opacity) to add new reference points.  Then, you can edit the 'Color transfer function values' table to set new R, G, and B values for it.&lt;br /&gt;
&lt;br /&gt;
=== Edit color legend ===&lt;br /&gt;
&lt;br /&gt;
If you want to alter the color legend, click on the 'Edit Color Legend Parameters' button, near the top right of the Color Map Editor.  It's the button with an 'e' on it.&lt;br /&gt;
&lt;br /&gt;
== Save state ==&lt;br /&gt;
&lt;br /&gt;
Before starting your animation, save a state file.  That way, if there are any issues, you don't have to reload everything again.  Go to File-&amp;gt;Save State.&lt;br /&gt;
&lt;br /&gt;
In the future, you should reconnect before you try to load it.  It may take 15-20 seconds to load once you confirm the file paths.&lt;br /&gt;
&lt;br /&gt;
== Create series of snapshots ==&lt;br /&gt;
&lt;br /&gt;
The way that Paraview creates animations is by creating a series of snapshots, which you can then render into a movie.  To save a series of snapshots, go to File-&amp;gt;Save Animation.  Check over the Animation Settings.  Note that you can set a custom Timestep range, so if you aren't able to finish rendering the snapshots in the length of your Titan job, you can resume from where you left off.&lt;br /&gt;
&lt;br /&gt;
'''If the connection terminates in the middle, or your local machine goes to sleep, you may end up with rendered frames in which the content wasn't updated.  Check your last frames to be sure they have the correct timestamp.'''&lt;br /&gt;
&lt;br /&gt;
I think the speed of rendering is a function of your internet connection, the number of remote nodes, and the image size.  It typically takes me 90-120 minutes to render a 2000-timestep animation at ~1280x800.  I use PNG files.&lt;br /&gt;
&lt;br /&gt;
== Create movie ==&lt;br /&gt;
&lt;br /&gt;
On Linux, you can use ffmpeg to string together the PNGs to create a movie, with a command like:&lt;br /&gt;
 ffmpeg -framerate &amp;lt;fps&amp;gt; -i fileprefix.%04d.png &amp;lt;output movie.mp4&amp;gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30642</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30642"/>
		<updated>2026-02-09T21:55:07Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* 2025 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*SC25 presentation at OSU booth:&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2025_IHPCSS_workflows.pptx | PPTX]], [[:File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[:File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[:File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx |  HPC tutorial (PPTX)]]  [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]  [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_UCERF2_ERF&amp;diff=30612</id>
		<title>CyberShake UCERF2 ERF</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_UCERF2_ERF&amp;diff=30612"/>
		<updated>2026-01-09T06:30:42Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: Created page with &amp;quot;All CyberShake studies through Study 24.8 utilize an earthquake rupture forecast (ERF) derived from UCERF2.  Some details about the construction of this ERF for CyberShake are...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;All CyberShake studies through Study 24.8 utilize an earthquake rupture forecast (ERF) derived from UCERF2.  Some details about the construction of this ERF for CyberShake are described below.&lt;br /&gt;
&lt;br /&gt;
== Event Types ==&lt;br /&gt;
&lt;br /&gt;
There are three types of events given by the UCERF2 ERF: regular, median, and aleatory.&lt;br /&gt;
&lt;br /&gt;
=== Regular ===&lt;br /&gt;
Regular events represent floating sources in UCERF2 and make up about 85% of all events in CyberShake. &lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt; Compute the target Magnitude Frequency Distribution (MFD) for each source as a Gutenberg-Richter (G-R) distribution up to a maximum magnitude determined by two Mw-Area scaling relationships (Ellsworth-B; Hanks and Bakun, 2007). The G-R b-value depends on the fault type: b=0 is used for type A faults (e.g., the San Andreas), and the average of G-R distributions with b=0 and b=0.8 is used for type B faults. Then average the MFDs from the two scaling relationships and discretize into 0.1-magnitude bins (i.e., UCERF2 magnitudes) to determine the final source MFD.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt; Extend the down-dip width (DDW) of the fault to match the Somerville (2006) area for a full-fault rupture at the maximum magnitude. (CyberShake-only, not done for standard UCERF2)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt; For each magnitude bin, compute the rupture area with Somerville 2006 (or, for standard UCERF2, Hanks and Bakun, 2007).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt; Build N floating ruptures with that area.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt; Set rate of each rupture to G-R rate for magnitude divided by N (equal weight to each rupture for that magnitude bin).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
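The G-R discretization in step 1 can be sketched as follows.  This is a minimal illustration of discretizing a truncated G-R distribution into 0.1-magnitude bins; the function name and interface are made up, and this is not the UCERF2/OpenSHA implementation.

```python
def gr_rates(m_min, m_max, b, total_rate, dm=0.1):
    """Discretize a truncated Gutenberg-Richter distribution into dm-wide
    magnitude bins, returning (magnitude, annual rate) pairs whose rates
    sum to total_rate.  Hypothetical helper, not the UCERF2 code."""
    mags = []
    m = m_min
    while m_max - m > -1e-9:   # inclusive of m_max, tolerant of fp drift
        mags.append(round(m, 1))
        m += dm
    # Relative G-R weights: N(m) proportional to 10^(-b*m)
    weights = [10 ** (-b * m) for m in mags]
    total = sum(weights)
    return [(m, total_rate * w / total) for m, w in zip(mags, weights)]

# b=0 (type A fault): every magnitude bin gets an equal share of the rate
rates = gr_rates(6.5, 8.0, 0.0, 0.01)
```

With b=0 each of the 16 bins from M6.5 to M8.0 receives an equal fraction of the total rate; with b=0.8 the rates decay with magnitude, and averaging the two gives the type B behavior described above.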
&lt;br /&gt;
=== Median ===&lt;br /&gt;
Median events represent the characteristic sources in UCERF2, with rupture areas modified to match the Somerville (2006) scaling relationship; they make up ~2% of events in CyberShake. &lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;For each branch on UCERF2 logic tree (Ellsworth-B; Hanks and Bakun, 2007), compute median magnitude from area with given scaling relationship, discretized into 0.1 magnitude bins (i.e., UCERF2 magnitude)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;For each source:&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;compute implied area from Somerville 2006 from UCERF2 magnitude&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;compute DDW correction factor as ratio of implied area to original UCERF2 (U2) area, intended to be: ddwCorrFactor = som06Area / origU2Area&amp;lt;br&amp;gt;&lt;br /&gt;
Note that there is a bug in the UCERF2 implementation such that ddwCorrFactor = som06Area / ellBArea (rather than origU2Area). This resulted in a constant ddwCorrFactor = 1.65959 for characteristic sources (this bug does not affect floating regular sources).&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;extend DDW such that the new rupture area equals the implied Somerville 2006 area: newDDW = origDDW * ddwCorrFactor (to the nearest integer)&amp;lt;br&amp;gt;&lt;br /&gt;
Note: what this does, holding magnitude constant, is extend the area such that if you were to use Somerville 2006 to compute magnitude it would equal the input UCERF2 magnitude.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
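The DDW correction, including the buggy denominator, can be sketched as below.  This is a hedged illustration with made-up function names, not the actual OpenSHA/UCERF2 code; areas are treated as inputs since the Somerville (2006) coefficients are not reproduced here.

```python
def ddw_corr_factor(som06_area, denom_area):
    """DDW correction factor as an area ratio (km^2).  Intended:
    denom_area = origU2Area.  The UCERF2 bug instead used
    denom_area = ellBArea, yielding a constant factor of 1.65959
    for all characteristic sources."""
    return som06_area / denom_area

def new_ddw(orig_ddw_km, corr_factor):
    # Extend the down-dip width so the new rupture area equals the
    # Somerville (2006) implied area, rounded to the nearest integer.
    return round(orig_ddw_km * corr_factor)
```

For example, applying the constant buggy factor to a 12 km DDW gives `new_ddw(12, 1.65959)`, i.e. 20 km.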
&lt;br /&gt;
=== Aleatory ===&lt;br /&gt;
Aleatory events have the same rupture areas as Median events, but with varying magnitudes to capture variability in the magnitude-area relationship; they make up ~13% of events in CyberShake.&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;For each Median source, compute the total moment rate&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;compute moment balanced Gaussian aleatory magnitude distribution with sigma=0.12, two sided truncation at 2 sigma&amp;lt;br&amp;gt;&lt;br /&gt;
Note: this means that the resultant moment rate from this distribution will equal the moment rate from Median (1)&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Compute magnitude discretized in 0.1 magnitude bins&amp;lt;br&amp;gt;&lt;br /&gt;
Note: the largest magnitudes in CyberShake are aleatory events that have very large stress drops, which can lead to very high seismic hazard at long return periods.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Summary: Both Regular and Median types follow the Somerville 2006 scaling relationship, hence have the same estimated stress drop values. Aleatory events, on the other hand, have a large range of stress drop values.&amp;lt;/b&amp;gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Rupture_Files&amp;diff=30609</id>
		<title>CyberShake Rupture Files</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Rupture_Files&amp;diff=30609"/>
		<updated>2025-12-22T20:49:50Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake uses rupture files to provide descriptions of rupture geometry, and, in some cases, slip information about individual events to be simulated.  A description of these files is below.&lt;br /&gt;
&lt;br /&gt;
== UCERF2 files ==&lt;br /&gt;
&lt;br /&gt;
To date, all CyberShake simulations have been performed using UCERF 2 events.  These events can include both a rupture geometry file and a Standard Rupture Format (SRF) description of the event itself.&lt;br /&gt;
&lt;br /&gt;
=== UCERF2 Rupture Geometry Files ===&lt;br /&gt;
&lt;br /&gt;
Rupture geometry files contain a list of all the points which appear on the surface of the rupture.  This information is needed to determine which SGTs should be saved (so that we have an SGT for each point on every rupture surface), and to generate a rupture variation when passed into the RupGen-api code.  We can also use the rupture geometry to generate the list of points we need SGTs for when doing post-processing.  Since 2014, we have tried to replace the SRF with the rupture geometry whenever possible, as it is a much smaller file and therefore reduces I/O.&lt;br /&gt;
&lt;br /&gt;
The format of the rupture geometry files is:&lt;br /&gt;
&lt;br /&gt;
 Probability = &amp;lt;prob&amp;gt;&lt;br /&gt;
 Magnitude = &amp;lt;mag&amp;gt;&lt;br /&gt;
 GridSpacing = &amp;lt;spacing between elements, in km&amp;gt;&lt;br /&gt;
 NumRows = &amp;lt;# rows&amp;gt;&lt;br /&gt;
 NumCols = &amp;lt;# cols&amp;gt;&lt;br /&gt;
 #   Lat         Lon         Depth      Rake    Dip     Strike&lt;br /&gt;
 &amp;lt;data for (row 0,col 0)&amp;gt;&lt;br /&gt;
 &amp;lt;data for (row 0,col 1)&amp;gt;&lt;br /&gt;
 ...&lt;br /&gt;
 &amp;lt;data for (row 0,col c)&amp;gt;&lt;br /&gt;
 &amp;lt;data for (row 1,col 0)&amp;gt;&lt;br /&gt;
 ...&lt;br /&gt;
 &amp;lt;data for (row r,col c)&amp;gt;&lt;br /&gt;
&lt;br /&gt;
A 'row' is defined as a set of rupture points all at the same depth, and rows are listed in the file from shallowest to deepest.&lt;br /&gt;
&lt;br /&gt;
For ERF 35, we used 1 km spacing; this should be used with 0.5 Hz simulations.  For ERF 36, we used 200 m spacing; this should be used with 1 Hz simulations.&lt;br /&gt;
&lt;br /&gt;
These files are stored in &amp;lt;CyberShake root&amp;gt;/ruptures/Ruptures_erf&amp;lt;ERF ID&amp;gt;/&amp;lt;src ID&amp;gt;/&amp;lt;rup ID&amp;gt;/&amp;lt;src&amp;gt;_&amp;lt;rup&amp;gt;.txt.  The path to the top-level ruptures directory is specified in [[CyberShake_Code_Base#Configuration_file | cybershake.cfg]], and the substructure is assumed in the [[CyberShake_Code_Base#PreSGT | PreSgt]] code.&lt;br /&gt;
&lt;br /&gt;
The geometry files for ERF 36 are available here: [https://g-3a9041.a78b8.36fe.data.globus.org/cybershake/ruptures_erf36.tgz ERF 36](4.1 GB)&lt;br /&gt;
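A file in the format above can be read with a short parser like the following.  This is a sketch based solely on the format description here (header of key = value lines, a comment line, then one point per line); it is not part of the CyberShake codebase, and the function name is made up.

```python
def parse_rupture_geometry(lines):
    """Parse a UCERF2-style rupture geometry file: header entries
    (Probability, Magnitude, GridSpacing, NumRows, NumCols), then one
    point per line as lat lon depth rake dip strike."""
    header = {}
    points = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue                      # skip blanks and the column-header comment
        if '=' in line:
            key, _, val = line.partition('=')
            header[key.strip()] = float(val)
        else:
            lat, lon, depth, rake, dip, strike = map(float, line.split())
            points.append((lat, lon, depth, rake, dip, strike))
    return header, points
```

Since rows are listed shallowest to deepest, the first NumCols points form the top row of the rupture surface, and a quick sanity check is that NumRows * NumCols equals the number of data lines.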
&lt;br /&gt;
=== Rupture Variation files ===&lt;br /&gt;
&lt;br /&gt;
Initially, we precomputed rupture variations for all events and saved these to disk.  This was done for the Graves &amp;amp; Pitarka (2006) rupture generator (Rupture Variation Scenario ID 3) and Graves &amp;amp; Pitarka (2010) (Rup Var Scenario ID 4).  These SRFs can be found on SCEC disks at /home/rcf-104/CyberShake2007/ruptures/RuptureVariations_35_V2_3 (Rup Var Scenario ID 3) and /home/rcf-104/CyberShake2007/ruptures/RuptureVariations_35_V3_2 (Rup Var Scenario ID 4), using the same &amp;lt;src id&amp;gt;/&amp;lt;rup id&amp;gt; directory hierarchy as the rupture geometry files.&lt;br /&gt;
&lt;br /&gt;
Starting in 2012, we stopped writing out all the SRFs.  Instead, they are generated on-demand in memory during the post-processing.  However, these files can still be generated by calling the _write_srf() function in genslip.  They follow the SRF format; version 1 is documented [http://hypocenter.usc.edu/research/cybershake/srf4.pdf here] and version 2 [http://hypocenter.usc.edu/research/cybershake/SRF-Description-Graves_2.0.pdf here].&lt;br /&gt;
&lt;br /&gt;
== RSQSim files ==&lt;br /&gt;
&lt;br /&gt;
As part of a new effort for 2018, we are proposing to generate CyberShake curves using ruptures generated from RSQSim.&lt;br /&gt;
&lt;br /&gt;
=== RSQSim Rupture Geometry files ===&lt;br /&gt;
&lt;br /&gt;
Rupture geometry files generated from RSQSim ruptures have a different header, since the rupture points lie on a tetrahedral grid instead of an evenly spaced rectangular one, so GridSpacing, NumRows, and NumCols don't really translate.  Instead, the file format is:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Probability = 5.325298E-6&lt;br /&gt;
Magnitude = 7.305054&lt;br /&gt;
AveArea = &amp;lt;average area of fault element&amp;gt;&lt;br /&gt;
NumPoints = &amp;lt;total number of points&amp;gt;&lt;br /&gt;
#   Lat         Lon         Depth      Rake    Dip     Strike&lt;br /&gt;
&amp;lt;data for point 0&amp;gt;&lt;br /&gt;
&amp;lt;data for point 1&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;data for point P&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
No order should be assumed for the points.&lt;br /&gt;
&lt;br /&gt;
These files will use the same directory structure (&amp;lt;CyberShake root&amp;gt;/ruptures/Ruptures_erf&amp;lt;ERF ID&amp;gt;/&amp;lt;src ID&amp;gt;/&amp;lt;rup ID&amp;gt;/&amp;lt;src&amp;gt;_&amp;lt;rup&amp;gt;.txt).&lt;br /&gt;
&lt;br /&gt;
=== RSQSim SRFs ===&lt;br /&gt;
&lt;br /&gt;
RSQSim SRFs are in version 1 format.  The optional &amp;quot;PLANE&amp;quot; section is omitted, since planes don't really map well to RSQSim faults.&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_Database_Archiving&amp;diff=30605</id>
		<title>CyberShake Study Database Archiving</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_Database_Archiving&amp;diff=30605"/>
		<updated>2025-12-13T00:17:20Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== SQLite Archiving ==&lt;br /&gt;
&lt;br /&gt;
The following procedure should be used when we want to archive a study, currently in either the production or data access database, to SQLite files on disk in order to free up room in the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;First, dump the old database contents into a new directory by running the following commands.&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that if you don't want to lock the tables, you can replace '--lock-all-tables' with '--single-transaction=TRUE' in the following commands.&lt;br /&gt;
&lt;br /&gt;
You can add --no-create-info if you don't want the create statement for the table at the beginning of the dump file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine the Study ID you want to archive, from looking at the Studies table in the DB:&amp;lt;/li&amp;gt;&lt;br /&gt;
  select * from Studies;&lt;br /&gt;
&amp;lt;li&amp;gt;Dump the PeakAmplitudes, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Run_ID in (select R.Run_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake PeakAmplitudes &amp;gt; peak_amps.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump Hazard_Datasets, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Study_ID=&amp;lt;study id to archive&amp;gt;' CyberShake Hazard_Datasets &amp;gt; hazard_datasets.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump Hazard_Curves, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Hazard_Dataset_ID in (select D.Hazard_Dataset_ID from Hazard_Datasets D where D.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Hazard_Curves &amp;gt; hazard_curves.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump Hazard_Curve_Points, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Hazard_Curve_ID in (select C.Hazard_Curve_ID from Hazard_Curves C, Hazard_Datasets D where D.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Hazard_Curve_Points &amp;gt; hazard_curve_points.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump CyberShake_Runs, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Study_ID=&amp;lt;study id to archive&amp;gt;' CyberShake CyberShake_Runs &amp;gt; runs.sql&lt;br /&gt;
&amp;lt;li&amp;gt;You now need the rest of the input tables.  They used to be pretty small, but now they're up to ~70GB with indices.  As of the completion of Study 22.12, here are their approximate sizes (in the DB, not as dump files):&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Table !! Size (GB)&lt;br /&gt;
|-&lt;br /&gt;
| AR_Hazard_Curve_Points || 44.6&lt;br /&gt;
|-&lt;br /&gt;
| CyberShake_Site_Ruptures || 10.3&lt;br /&gt;
|-&lt;br /&gt;
| Ruptures || 2.3&lt;br /&gt;
|-&lt;br /&gt;
| Rupture_Variations || 2.6&lt;br /&gt;
|-&lt;br /&gt;
| AR_Hazard_Curves || 1.4&lt;br /&gt;
|-&lt;br /&gt;
| Other tables || 0.5&lt;br /&gt;
|}&lt;br /&gt;
You can either:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Dump all of the input tables -- they are small enough that we don't mind capturing data which wasn't directly used in this study.&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --skip-add-drop-table -u cybershk -p CyberShake AR_Hazard_Curve_Points AR_Hazard_Curves AR_Hazard_Datasets Atten_Rel_Metadata Atten_Rels CyberShake_Site_Regions CyberShake_Site_Ruptures CyberShake_Site_Types CyberShake_Sites ERF_IDs ERF_Metadata ERF_Probability_Models IM_Types Points Rupture_Variation_Probability_Modifier Rupture_Variation_Scenario_IDs Rupture_Variation_Scenario_Metadata Rupture_Variations Ruptures Rup_Var_Seeds SGT_Variation_IDs SGT_Variation_Metadata Studies Time_Spans Velocity_Models &amp;gt; input_tables.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Dump only the sections of the large input tables that you need.  Note that the Points table isn't used anymore, so don't bother to dump it.&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'AR_Hazard_Curve_ID in (select distinct C.AR_Hazard_Curve_ID from AR_Hazard_Curves C where C.AR_Hazard_Dataset_ID in (select distinct D.AR_Hazard_Dataset_ID from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;gt;=(select distinct D.Min_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;lt;=(select distinct D.Max_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;gt;=(select distinct D.Min_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;lt;=(select distinct D.Max_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) )' CyberShake AR_Hazard_Curve_Points &amp;gt; ar_hazard_curve_points.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'CS_Site_ID in (select R.Site_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;) and ERF_ID in (select R.ERF_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake CyberShake_Site_Ruptures &amp;gt; cs_site_ruptures.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'ERF_ID in (select R.ERF_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;) and (Source_ID, Rupture_ID) in (select distinct SR.Source_ID, SR.Rupture_ID from CyberShake_Site_Ruptures SR, CyberShake_Runs R where SR.CS_Site_ID=R.Site_ID and SR.ERF_ID=R.ERF_ID and R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Ruptures &amp;gt; ruptures.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where '(Rup_Var_Scenario_ID, ERF_ID) in (select R.Rup_Var_Scenario_ID, R.ERF_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;) and (Source_ID, Rupture_ID) in (select distinct SR.Source_ID, SR.Rupture_ID from CyberShake_Site_Ruptures SR, CyberShake_Runs R where SR.CS_Site_ID=R.Site_ID and SR.ERF_ID=R.ERF_ID and R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Rupture_Variations &amp;gt; rupture_variations.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'AR_Hazard_Dataset_ID in (select distinct D.AR_Hazard_Dataset_ID from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;gt;=(select distinct D.Min_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;lt;=(select distinct D.Max_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;gt;=(select distinct D.Min_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;lt;=(select distinct D.Max_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID)' CyberShake AR_Hazard_Curves &amp;gt; ar_hazard_curves.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p CyberShake AR_Hazard_Datasets Atten_Rel_Metadata Atten_Rels CyberShake_Site_Regions CyberShake_Site_Types CyberShake_Sites ERF_IDs ERF_Metadata ERF_Probability_Models IM_Types Rupture_Variation_Probability_Modifier Rupture_Variation_Scenario_IDs Rupture_Variation_Scenario_Metadata Rup_Var_Seeds SGT_Variation_IDs SGT_Variation_Metadata Studies Time_Spans Velocity_Models &amp;gt; input_tables.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Next, convert the SQL dumps into SQLite format using mysql2sqlite:&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
#For each of the dump files, run:&lt;br /&gt;
  mysql2sqlite &amp;lt;SQL dump file&amp;gt; &amp;gt; &amp;lt;SQLite dump file&amp;gt;&lt;br /&gt;
  Example:  ./mysql2sqlite peak_amps.sql &amp;gt; peak_amps.sqlite&lt;br /&gt;
For large tables this may take an hour or two.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Create a SQLite database and import the tables.  Use sqlite3 3.7.11 or later, or there will be an error reading the dump files.  If you get errors about too many entries, upgrade to a more recent version and try again.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a database for this study:&amp;lt;/li&amp;gt;&lt;br /&gt;
  sqlite3 &amp;lt;study name&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;For each dump file, run the following command:&amp;lt;/li&amp;gt;&lt;br /&gt;
  .read &amp;lt;path/to/dump/file.sqlite&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
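The database-creation and import steps above can be sketched in Python with the standard sqlite3 module (a minimal illustration, not the actual workflow; an in-memory database and a tiny inline script stand in for the study database and a real mysql2sqlite dump):

```python
import sqlite3

# Stand-in for `sqlite3 <study name>`; an in-memory DB is used for illustration.
conn = sqlite3.connect(":memory:")

# Stand-in for `.read <path/to/dump/file.sqlite>`: execute a converted dump.
# A real dump comes from mysql2sqlite; this tiny script is illustrative only.
dump_sql = """
CREATE TABLE PeakAmplitudes (
    Run_ID INTEGER,
    IM_Type_ID INTEGER,
    IM_Value REAL
);
INSERT INTO PeakAmplitudes VALUES (1, 21, 0.35);
"""
conn.executescript(dump_sql)
```

The `.read` dot-command in the sqlite3 CLI behaves like `executescript` here: it runs every statement in the dump file in order.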
&lt;br /&gt;
&amp;lt;b&amp;gt;Run a few queries against the original tables in the production database and against the SQLite files, and check that the counts match.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Check the number of PeakAmplitudes:&amp;lt;/li&amp;gt;&lt;br /&gt;
  select count(*) from PeakAmplitudes P, CyberShake_Runs R where R.Study_ID=&amp;lt;study_id&amp;gt; and P.Run_ID=R.Run_ID;&lt;br /&gt;
&amp;lt;li&amp;gt;Check the number of rupture variations for each site:&amp;lt;/li&amp;gt;&lt;br /&gt;
  select count(*) from CyberShake_Runs R, CyberShake_Site_Ruptures SR, Rupture_Variations V where R.Site_ID=SR.CS_Site_ID and SR.Source_ID=V.Source_ID and SR.Rupture_ID=V.Rupture_ID and R.ERF_ID=V.ERF_ID and R.ERF_ID=SR.ERF_ID and V.Rup_Var_Scenario_ID=R.Rup_Var_Scenario_ID and R.Study_ID=&amp;lt;study id&amp;gt;;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
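The count verification can be sketched as follows (a simplified illustration: two in-memory SQLite databases stand in for the production MySQL database and the SQLite archive, and only a small sample table is checked):

```python
import sqlite3

def table_count(conn, table):
    """Return the row count for a table; used to compare source vs. archive."""
    return conn.execute("SELECT COUNT(*) FROM %s" % table).fetchone()[0]

# Two in-memory databases stand in for the production DB and the archive;
# table and column names follow the queries above.
source = sqlite3.connect(":memory:")
archive = sqlite3.connect(":memory:")
for db in (source, archive):
    db.execute("CREATE TABLE PeakAmplitudes (Run_ID INTEGER, IM_Type_ID INTEGER)")
    db.executemany("INSERT INTO PeakAmplitudes VALUES (?, ?)",
                   [(1, 21), (1, 22), (2, 21)])

# The archive passes the check only if every table's count matches.
match = table_count(source, "PeakAmplitudes") == table_count(archive, "PeakAmplitudes")
```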
&lt;br /&gt;
&amp;lt;b&amp;gt;Create an index on Run_ID and IM_Type_ID in the PeakAmplitudes table to speed up access.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy over the original version to a new file.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Set the SQLITE_TMPDIR variable to point to a filesystem with lots of free space.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Using a current version of sqlite3, open the new database file and run:&amp;lt;/li&amp;gt;&lt;br /&gt;
  CREATE INDEX &amp;quot;idx_PeakAmplitudes_Run_ID_IM_Type_ID&amp;quot; ON &amp;quot;PeakAmplitudes&amp;quot; (`Run_ID`, `IM_Type_ID`);&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
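The same index-creation steps can be sketched in Python (illustrative only; in practice you would run the CREATE INDEX statement in the sqlite3 CLI against the copied database file, and SQLITE_TMPDIR must be set before SQLite first needs temporary space):

```python
import os
import sqlite3
import tempfile

# Point SQLite's temporary space at a filesystem with plenty of room;
# index builds on large tables can need substantial temp storage.
os.environ["SQLITE_TMPDIR"] = tempfile.gettempdir()

# An in-memory DB stands in for the copied study database file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PeakAmplitudes "
             "(Run_ID INTEGER, IM_Type_ID INTEGER, IM_Value REAL)")
conn.execute('CREATE INDEX "idx_PeakAmplitudes_Run_ID_IM_Type_ID" '
             'ON "PeakAmplitudes" ("Run_ID", "IM_Type_ID")')

# Confirm the index was created by inspecting sqlite_master.
rows = conn.execute("SELECT name FROM sqlite_master WHERE type='index'").fetchall()
```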
&lt;br /&gt;
&amp;lt;b&amp;gt;Move the files to the archive location on project.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Tar up the sqlite files:&amp;lt;/li&amp;gt;&lt;br /&gt;
  tar czvf &amp;lt;study_name&amp;gt;.tgz *.sqlite&lt;br /&gt;
&amp;lt;li&amp;gt;SFTP the tgz file to the study sqlite archive location on project at CARC (/project/scec_608/cybershake/results/sqlite_studies/&amp;lt;study name&amp;gt;), using hpc-transfer1.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Extract the database to /project/scec_608/cybershake/results/sqlite_studies/&amp;lt;study_name&amp;gt; .&amp;lt;/li&amp;gt;&lt;br /&gt;
  tar xzvf sqlite_dumps/&amp;lt;study_name&amp;gt;.tgz &amp;lt;study_name&amp;gt;.sqlite&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Move study from production DB to data access DB ==&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Code_Base&amp;diff=30590</id>
		<title>CyberShake Code Base</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Code_Base&amp;diff=30590"/>
		<updated>2025-11-26T19:57:56Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* DirectSynth */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page details all the pieces of code which make up the CyberShake code base, as of November 2017.  Note that this does not include the workflow middleware, or the workflow generators; that code is detailed at [[CyberShake Workflow Framework]].&lt;br /&gt;
&lt;br /&gt;
Conceptually, we can divide up the CyberShake codes into three categories:&lt;br /&gt;
&lt;br /&gt;
#Strain Green Tensor-related codes: These codes produce the input files needed to generate SGTs, actually calculate the SGTs, and do some reformatting and sanity checks on the results.&lt;br /&gt;
#Synthesis-related codes: These codes take the SGTs and perform seismogram synthesis and intensity measure calculations.&lt;br /&gt;
#Data product codes: These codes insert the results into the database, and use the database to generate a variety of output data products.&lt;br /&gt;
&lt;br /&gt;
Below is a description of each piece of software we use, organized by these categories.  For each piece of software, we include a description of where it is located, how to compile and use it, and what its inputs and outputs are.  At the end, we provide a description of input and output files and formats.&lt;br /&gt;
&lt;br /&gt;
== Code Installation ==&lt;br /&gt;
&lt;br /&gt;
Historically, we have selected a root directory for CyberShake, then created the subdirectories 'software' for all the code, 'ruptures' for the rupture files, 'logs' for log files, and 'utils' for workflow tools.  This is typically set up in unpurged storage space, so the installation is not at risk of being purged.  Each code listed below, along with the configuration file, should be checked out into the 'software' subdirectory.&lt;br /&gt;
&lt;br /&gt;
In terms of compilers, you should use the GNU compilers unless specifically directed otherwise.&lt;br /&gt;
&lt;br /&gt;
Most of the codes below have a main directory containing wrapper scripts, a bin subdirectory with binaries, and a src subdirectory with code requiring compilation.&lt;br /&gt;
&lt;br /&gt;
If you are looking for compilation instructions, a general guide is available [[CyberShake compilation guide | here]].&lt;br /&gt;
&lt;br /&gt;
=== Configuration file ===&lt;br /&gt;
&lt;br /&gt;
Many CyberShake codes use a configuration file, which specifies the root directory for the CyberShake installation, the command used to start an MPI executable, paths to a tmp and scratch space (which can be the same), and the path to the CyberShake rupture directory.  We have done this instead of using environment variables because it's more transparent and easier for multiple users.  Both of these files should be stored in the 'software' subdirectory.&lt;br /&gt;
&lt;br /&gt;
The configuration file is available at:&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/cybershake.cfg&lt;br /&gt;
Obviously, this file must be edited to be correct for the install.&lt;br /&gt;
&lt;br /&gt;
The keys that CyberShake currently expects to find are:&lt;br /&gt;
*CS_PATH = /path/to/CyberShake/software/directory&lt;br /&gt;
*SCRATCH_PATH = /path/to/shared/scratch&lt;br /&gt;
*TMP_PATH = /path/to/tmp (can be node-local, or shared with scratch)&lt;br /&gt;
*RUPTURE_PATH = /path/to/CyberShake/rupture/directory&lt;br /&gt;
*MPI_CMD = ibrun or aprun or mpiexec&lt;br /&gt;
*LOG_PATH = /path/to/CyberShake/logs/directory&lt;br /&gt;
&lt;br /&gt;
To interact with cybershake.cfg, the CyberShake codes use a Python script to deliver cybershake.cfg entries as key-value pairs, located here:&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/config.py&lt;br /&gt;
Several CyberShake codes import config, then use it to read out the cybershake.cfg file.&lt;br /&gt;
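A minimal sketch of how such a key-value reader might work (the real config.py lives in cybershake-core and may differ; this simplified parser is illustrative only):

```python
def parse_cfg(text):
    """Parse 'KEY = value' lines into a dict, skipping blanks and comments."""
    entries = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        entries[key.strip()] = value.strip()
    return entries

# Keys follow the cybershake.cfg entries listed above; paths are placeholders.
cfg = parse_cfg("""
CS_PATH = /path/to/CyberShake/software/directory
MPI_CMD = ibrun
""")
```

A caller would then look up entries such as `cfg["MPI_CMD"]` to build its launch command.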
&lt;br /&gt;
=== Compiler file ===&lt;br /&gt;
&lt;br /&gt;
A long time ago, Gideon Juve created a compiler file, Compilers.mk, which contains information about which compilers should be used for which system.  This file should also be checked out into the software directory, from&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/Compilers.mk&lt;br /&gt;
&lt;br /&gt;
Some of the makefiles reference this file.  This can - and should - be updated to reflect new systems.&lt;br /&gt;
&lt;br /&gt;
== SGT-related codes ==&lt;br /&gt;
&lt;br /&gt;
[[File:SGT_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the SGT part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_SGT_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== PreCVM ===&lt;br /&gt;
&lt;br /&gt;
This code stands for &amp;quot;Pre-Community-Velocity-Model&amp;quot;.  It has to be run before the UCVM codes, since it generates input files required by UCVM.  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To determine the simulation volume for a particular CyberShake site.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreCVM queries the CyberShake database to determine all of the ruptures which fall within a given cutoff distance of a certain site.  From that information, padding is added around the edges to construct the CyberShake simulation volume for this site.  Additional padding may also be applied, depending on the input parameters, so that the X and Y dimensions are multiples of 10, 20, or 40.  Using this volume, the X/Y offset of each grid point is determined, then the latitude and longitude of each point are computed using a great circle projection, and both are written to output files.&lt;br /&gt;
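The dimension-adjustment step can be sketched as follows (the padding values and function name here are illustrative, not the actual CyberShake defaults from get_modelbox.py):

```python
import math

def pad_dimension(npts, bound_pad, multiple):
    """Add boundary padding on both edges, then round up so the dimension
    is evenly divisible by the requested multiple (e.g. 10, 20, or 40)."""
    padded = npts + 2 * bound_pad  # pad both edges of the volume
    return math.ceil(padded / multiple) * multiple

# A volume 983 points across, padded by 30 points per edge, rounded to 40:
nx = pad_dimension(983, 30, 40)
```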
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The CyberShake volume depth needs to be changed, so as to have the right number of grid points. That is set in the genGrid() function in GenGrid_py/gen_grid.py, in km.&lt;br /&gt;
#X and Y padding needs to be altered.  That is set using 'bound_pad' in Modelbox/get_modelbox.py, around line 70.&lt;br /&gt;
#The rotation of the simulation volume needs to be changed.  That is set using 'model_rot' in Modelbox/get_modelbox.py, around line 70.&lt;br /&gt;
#The database access parameters have changed.  That's in Modelbox/get_modelbox.py, around line 80.&lt;br /&gt;
#The divisibility needs for GPU simulations change (currently, we need the dimensions to be evenly divisible by the number of GPUs used in that dimension).  That is in Modelbox/get_modelbox.py, around line 250.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/PreCVM/&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#MySQLdb|MySQLdb for Python]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  pre_cvm.py&lt;br /&gt;
    Modelbox/get_modelbox.py&lt;br /&gt;
      Modelbox/bin/gcproj&lt;br /&gt;
    GenGrid_py/gen_grid.py&lt;br /&gt;
      GenGrid_py/bin/gen_model_cords&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the Modelbox/src and the GenGrid_py/src directories.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: pre_cvm.py [options]&lt;br /&gt;
  Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --erf_id=ERF_ID       ERF ID&lt;br /&gt;
  --modelbox=MODELBOX   Path to modelbox file (output)&lt;br /&gt;
  --gridfile=GRIDFILE   Path to gridfile (output)&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout (output)&lt;br /&gt;
  --coordfile=COORDSFILE&lt;br /&gt;
                        Path to coorfile (output)&lt;br /&gt;
  --paramsfile=PARAMSFILE&lt;br /&gt;
                        Path to paramsfile (output)&lt;br /&gt;
  --boundsfile=BOUNDSFILE&lt;br /&gt;
                        Path to boundsfile (output)&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency&lt;br /&gt;
  --gpu                 Use GPU box settings.&lt;br /&gt;
  --spacing=SPACING     Override default spacing with this value.&lt;br /&gt;
  --server=SERVER       Address of server to query in creating modelbox,&lt;br /&gt;
                        default is focal.usc.edu.&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; requires 6 minutes for 100m spacing, 10 billion point volume&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; None; inputs are retrieved from the database&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#Gridfile|gridfile]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Params|params]], [[CyberShake_Code_Base#Coord|coord]], [[CyberShake_Code_Base#Bounds|bounds]]&lt;br /&gt;
&lt;br /&gt;
=== UCVM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a populated velocity mesh for a CyberShake simulation volume.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; UCVM takes the volume defined by PreCVM and queries the [[UCVM]] software, using the C API, to populate the volume.  The resulting mesh is then checked for acceptable Vp/Vs ratios, for minimum Vp, Vs, and density values, and for the absence of Infs and NaNs.  The data is written in either Graves (RWG) format or AWP format.  This code also produces log files, which are written to &amp;lt;CyberShake logs directory&amp;gt;/GenLog/&amp;lt;site&amp;gt;/v_mpi-&amp;lt;processor number&amp;gt;.log.  These can be useful if there's an error and you aren't sure why.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New velocity models are added.  Velocity models are specified in the DAX and passed through the wrapper scripts into the C code and then ultimately to UCVM, so an if statement must be added around line 250 (and around line 450 if it's applicable for no GTL).&lt;br /&gt;
#The backend UCVM substantially changes (for example, if we move to the Python implementation).&lt;br /&gt;
#If additional models are added, new libraries may need to be added to the makefile.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/UCVM&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#UCVM|UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  single_exe.py&lt;br /&gt;
    single_exe.csh&lt;br /&gt;
      bin/ucvm-single-mpi&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; The makefile needs to be edited so that &amp;quot;UCVM_HOME&amp;quot; points to the UCVM home directory.  Then run 'make' in the UCVM/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;All of site, gridout, modelcords, models, and format must be specified.&lt;br /&gt;
Usage: single_exe.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout (output)&lt;br /&gt;
  --coordfile=COORDSFILE&lt;br /&gt;
                        Path to coordfile (output)&lt;br /&gt;
  --models=MODELS       Comma-separated string on velocity models to use.&lt;br /&gt;
  --format=FORMAT       Specify awp or rwg format for output.&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency&lt;br /&gt;
  --spacing=SPACING     Override default spacing with this value (km)&lt;br /&gt;
  --min_vs=MIN_VS       Override minimum Vs value.  Minimum Vp and minimum&lt;br /&gt;
                        density will be 3.4 times this value.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on ~4000 cores; for 10 billion points and the C version of UCVM, takes about 20 minutes.  Typically only half the cores per node are used to get more memory per process.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#coords|coords]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; either [[CyberShake_Code_Base#RWG_format|RWG format]] or [[CyberShake_Code_Base#AWP_format|AWP format]], depending on the option selected.&lt;br /&gt;
&lt;br /&gt;
=== Smoothing ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To smooth a velocity file along model interfaces.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The smoothing code takes in a velocity mesh, determines the surface coordinates of the interfaces between velocity models, gets a list of all the points which need to be smoothed, and then performs the smoothing by averaging in both the X and Y direction for a user-specified number of points (default of 10km in each direction).&lt;br /&gt;
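A minimal sketch of this averaging (pure Python on a 1-D profile; the real smooth_mpi.c operates on the full 3-D mesh in parallel and only smooths the flagged interface points):

```python
def smooth_profile(values, npts):
    """Replace each point with the average of values within +/- npts
    grid points, clamping the window at the profile edges."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - npts)
        hi = min(len(values), i + npts + 1)
        window = values[lo:hi]
        out.append(sum(window) / len(window))
    return out

# A sharp model interface (e.g. a Vs jump between two submodels):
profile = [1000.0] * 4 + [2000.0] * 4
smoothed = smooth_profile(profile, 2)
```

The sharp step is spread over several grid points, which is the intent of smoothing across submodel boundaries.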
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change our version of UCVM.  The LD_LIBRARY_PATH needs to be modified, in run_smoothing.py around line 98.&lt;br /&gt;
#The smoothing algorithm is modified.  Currently that is specified in the average_point() function in smooth_mpi.c.&lt;br /&gt;
#We start using velocity models whose boundaries aren't perpendicular to the earth's surface.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/UCVM/smoothing&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#UCVM|UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  smoothing/run_smoothing.py&lt;br /&gt;
    bin/determine_surface_model&lt;br /&gt;
    smoothing/determine_smoothing_points.py&lt;br /&gt;
    smoothing/smooth_mpi&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the smoothing directory, and make sure that determine_surface_model has been compiled in the UCVM/src directory.  You may need to change the compiler; currently it uses 'cc'.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: run_smoothing.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --gridout=GRIDOUT     gridout file&lt;br /&gt;
  --coords=COORDS       coords file&lt;br /&gt;
  --models=MODELSTRING  comma-separated list of velocity models&lt;br /&gt;
  --smoothing-dist=SMOOTHING_DIST&lt;br /&gt;
                        Number of grid points to smooth over.  About 10km of&lt;br /&gt;
                        grid points is a good starting place.&lt;br /&gt;
  --mesh=MESH           AWP-format velocity mesh to smooth&lt;br /&gt;
  --mesh-out=MESH_OUT   Output smoothed mesh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on ~1500 cores; for 5 billion points and the C version of UCVM, takes about 16 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP_format|AWP format velocity file]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Coord|coord]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP_format|AWP format]] smoothed velocity file.&lt;br /&gt;
&lt;br /&gt;
=== PreSGT ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a series of input files which are used by the wave propagation codes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreSGT determines the X and Y coordinates of the site location (where the impulse will be inserted for the wave propagation simulation), and determines which mesh point (X and Y) maps most closely to each point on a fault surface within the cutoff.  That information is combined with an adaptive mesh approach to create a list of all the points for which SGTs should be saved.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change our approach for saving adaptive mesh points.&lt;br /&gt;
#We change the location of the rupture geometry files, currently assumed to be &amp;lt;rupture root&amp;gt;/Ruptures_erf&amp;lt;erf ID&amp;gt; .  This is specified in presgt.py, line 167.&lt;br /&gt;
#The directory hierarchy and naming scheme for rupture geometry files, currently &amp;lt;src id&amp;gt;/&amp;lt;rup id&amp;gt;/&amp;lt;src id&amp;gt;_&amp;lt;rup_id&amp;gt;.txt, changes.  This is specified in faultlist_py/CreateFaultList.py, line 36.&lt;br /&gt;
#The number of header lines in the rupture geometry file changes.  This would require changing the nheader value, currently 6, specified in faultlist_py/CreateFaultList.py, line 36.&lt;br /&gt;
#We switch to RSQSim ruptures, or other ruptures in which the geometry isn't planar.  Modifications would be required to gen_sgtgrid.c.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/PreSgt&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, heavily modified by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#libcfu|libcfu]], [[CyberShake_Code_Base#MySQLdb|MySQLdb for Python]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  presgt.py&lt;br /&gt;
    faultlist_py/CreateFaultList.py&lt;br /&gt;
    bin/gen_sgtgrid&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./presgt.py &amp;lt;site&amp;gt; &amp;lt;erf_id&amp;gt; &amp;lt;modelbox&amp;gt; &amp;lt;gridout&amp;gt; &amp;lt;model_coords&amp;gt; &amp;lt;fdloc&amp;gt; &amp;lt;faultlist&amp;gt; &amp;lt;radiusfile&amp;gt; &amp;lt;sgtcords&amp;gt; &amp;lt;spacing&amp;gt; [frequency]&lt;br /&gt;
Example: ./presgt.py USC 33 USC.modelbox gridout_USC model_coords_GC_USC USC.fdloc USC.faultlist USC.radiusfile USC.cordfile 200.0 0.1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on 8 nodes, 32 cores (gen_sgtgrid is a parallel code); for 200m spacing UCERF2, takes about 8 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Coord|coord]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#Faultlist|faultlist]], [[CyberShake_Code_Base#Radiusfile|radiusfile]], [[CyberShake_Code_Base#SgtCoords|sgtcoords]].&lt;br /&gt;
&lt;br /&gt;
=== PreAWP ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate input files in a format that AWP-ODC expects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreAWP performs a number of steps:&lt;br /&gt;
#An IN3D parameter file is produced, needed for AWP-ODC.&lt;br /&gt;
#A file with the SGT coordinates to save in AWP format is produced.  Since RWG and AWP use different coordinate systems, a coordinate transformation (X-&amp;gt;Y, Y-&amp;gt;X, zero-indexing-&amp;gt;one-indexing) is performed on the SGT coordinates file.&lt;br /&gt;
#The velocity file is translated to AWP format, if it isn't in AWP format already.&lt;br /&gt;
#The correct source, based on the dt and nt, is selected.  The source must be generated manually ahead of time.  Details about source generation are given [[CyberShake Code Base#Impulse source descriptions | here]].&lt;br /&gt;
#Striping for the output file is also set up here.&lt;br /&gt;
#Files are symlinked into the directory structure that AWP expects.  Note that slightly different versions of this exist for the CPU and GPU implementations of AWP-ODC-SGT.&lt;br /&gt;
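The RWG-to-AWP coordinate transformation in step 2 can be sketched as follows (a simplified illustration; this assumes the one-indexing conversion applies to all three axes, which the text above does not state explicitly):

```python
def rwg_to_awp(x, y, z):
    """Swap the X and Y axes and convert from zero-based to one-based
    indexing, per the RWG -> AWP coordinate conventions described above."""
    return (y + 1, x + 1, z + 1)

# A hypothetical SGT save point at RWG (x=10, y=250, z=0):
awp_point = rwg_to_awp(10, 250, 0)
```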
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The path to the Lustre striping command (lfs) changes.  This path is hard-coded in build_awp_inputs.py, line 14.  Note that this is the path to lfs on the compute node, NOT the login node.&lt;br /&gt;
#The AWP code changes its input format.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-HIP-SGT/utils/ (HIP GPU) or https://github.com/SCECcode/cybershake-core/AWP-GPU-SGT/utils/ (CUDA GPU) or https://github.com/SCECcode/cybershake-core/AWP-ODC-SGT/utils/ (CPU), AND also https://github.com/SCECcode/cybershake-core/SgtHead &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; SgtHead&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  build_awp_inputs.py&lt;br /&gt;
    build_IN3D.py&lt;br /&gt;
    build_src.py&lt;br /&gt;
    build_cordfile.py&lt;br /&gt;
      SgtHead/gen_awp_cordfile.py&lt;br /&gt;
    build_media.py&lt;br /&gt;
      SgtHead/bin/reformat_velocity&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the SgtHead/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: build_awp_inputs.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout input file&lt;br /&gt;
  --fdloc=FDLOC         Path to fdloc input file&lt;br /&gt;
  --cordfile=CORDFILE   Path to cordfile input file&lt;br /&gt;
  --velocity-prefix=VEL_PREFIX&lt;br /&gt;
                        RWG velocity prefix.  If omitted, will not reformat&lt;br /&gt;
                        velocity file, just symlink.&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency of SGT run, 0.5 Hz by default.&lt;br /&gt;
  --px=PX               Number of processors in X-direction.&lt;br /&gt;
  --py=PY               Number of processors in Y-direction.&lt;br /&gt;
  --pz=PZ               Number of processors in Z-direction.&lt;br /&gt;
  --source-frequency=SOURCE_FREQ&lt;br /&gt;
                        Low-pass filter frequency to use on the source,&lt;br /&gt;
                        default is same frequency as the frequency of the run.&lt;br /&gt;
  --spacing=SPACING     Override default spacing, derived from frequency.&lt;br /&gt;
  --velocity-mesh=VEL_MESH&lt;br /&gt;
                        Provide path to velocity mesh.  If omitted, will&lt;br /&gt;
                        assume mesh is named awp.&amp;lt;site&amp;gt;.media.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for 1 Hz run, takes about 11 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#SgtCoords|cordfile]], velocity mesh (if in [[CyberShake_Code_Base#RWG_format|RWG format]], will be converted to [[CyberShake_Code_Base#AWP_format|AWP]]), [[CyberShake_Code_Base#RWG_source|RWG source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP source|AWP source]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]].&lt;br /&gt;
&lt;br /&gt;
=== AWP-ODC-SGT, CPU version ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform SGT synthesis&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; AWP-ODC-SGT is the CPU version. It uses the IN3D file for its parameters.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New science or features are added to the AWP code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-ODC-SGT&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kim Olsen, Steve Day, Yifeng Cui, various students and post-docs, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; iobuf module&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  awp_odc_wrapper.sh&lt;br /&gt;
    bin/pmcl3d&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Using the GNU compilers, run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt; pmcl3d &amp;lt;IN3D parameter file&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel; for 0.5 Hz run (2 billion points, 20k timesteps), takes about 45 minutes on 10,000 cores.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP source|AWP source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]].&lt;br /&gt;
&lt;br /&gt;
=== AWP-ODC-SGT, CUDA GPU version ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform SGT synthesis&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; AWP-ODC-SGT is the GPU version. It takes parameters on the command-line, so the wrapper converts the IN3D file into command-line arguments and invokes it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New science or features are added to the AWP code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-GPU-SGT&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kim Olsen, Steve Day, Yifeng Cui, various students and post-docs, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; CUDA toolkit module&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  gpu_wrapper.py&lt;br /&gt;
    bin/pmcl3d&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; The modules PrgEnv-gnu and cudatoolkit must be loaded first.  Then, run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./pmcl3d &lt;br /&gt;
Options:&lt;br /&gt;
	[(-T | --TMAX) &amp;lt;TMAX&amp;gt;]&lt;br /&gt;
	[(-H | --DH) &amp;lt;DH&amp;gt;]&lt;br /&gt;
	[(-t | --DT) &amp;lt;DT&amp;gt;]&lt;br /&gt;
	[(-A | --ARBC) &amp;lt;ARBC&amp;gt;]&lt;br /&gt;
	[(-P | --PHT) &amp;lt;PHT&amp;gt;]&lt;br /&gt;
	[(-M | --NPC) &amp;lt;NPC&amp;gt;]&lt;br /&gt;
	[(-D | --ND) &amp;lt;ND&amp;gt;]&lt;br /&gt;
	[(-S | --NSRC) &amp;lt;NSRC&amp;gt;]&lt;br /&gt;
	[(-N | --NST) &amp;lt;NST&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-V | --NVE) &amp;lt;NVE&amp;gt;]&lt;br /&gt;
	[(-B | --MEDIASTART) &amp;lt;MEDIASTART&amp;gt;]&lt;br /&gt;
	[(-n | --NVAR) &amp;lt;NVAR&amp;gt;]&lt;br /&gt;
	[(-I | --IFAULT) &amp;lt;IFAULT&amp;gt;]&lt;br /&gt;
	[(-R | --READ_STEP) &amp;lt;x READ_STEP]&lt;br /&gt;
&lt;br /&gt;
	[(-X | --NX) &amp;lt;x length]&lt;br /&gt;
	[(-Y | --NY) &amp;lt;y length&amp;gt;]&lt;br /&gt;
	[(-Z | --NZ) &amp;lt;z length]&lt;br /&gt;
	[(-x | --NPX) &amp;lt;x processors]&lt;br /&gt;
	[(-y | --NPY) &amp;lt;y processors&amp;gt;]&lt;br /&gt;
	[(-z | --NPZ) &amp;lt;z processors&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-1 | --NBGX) &amp;lt;starting point to record in X&amp;gt;]&lt;br /&gt;
	[(-2 | --NEDX) &amp;lt;ending point to record in X&amp;gt;]&lt;br /&gt;
	[(-3 | --NSKPX) &amp;lt;skipping points to record in X&amp;gt;]&lt;br /&gt;
	[(-11 | --NBGY) &amp;lt;starting point to record in Y&amp;gt;]&lt;br /&gt;
	[(-12 | --NEDY) &amp;lt;ending point to record in Y&amp;gt;]&lt;br /&gt;
	[(-13 | --NSKPY) &amp;lt;skipping points to record in Y&amp;gt;]&lt;br /&gt;
	[(-21 | --NBGZ) &amp;lt;starting point to record in Z&amp;gt;]&lt;br /&gt;
	[(-22 | --NEDZ) &amp;lt;ending point to record in Z&amp;gt;]&lt;br /&gt;
	[(-23 | --NSKPZ) &amp;lt;skipping points to record in Z&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-i | --IDYNA) &amp;lt;i IDYNA&amp;gt;]&lt;br /&gt;
	[(-s | --SoCalQ) &amp;lt;s SoCalQ&amp;gt;]&lt;br /&gt;
	[(-l | --FL) &amp;lt;l FL&amp;gt;]&lt;br /&gt;
	[(-h | --FH) &amp;lt;i FH&amp;gt;]&lt;br /&gt;
	[(-p | --FP) &amp;lt;p FP&amp;gt;]&lt;br /&gt;
	[(-r | --NTISKP) &amp;lt;time skipping in writing&amp;gt;]&lt;br /&gt;
	[(-W | --WRITE_STEP) &amp;lt;time aggregation in writing&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-100 | --INSRC) &amp;lt;source file&amp;gt;]&lt;br /&gt;
	[(-101 | --INVEL) &amp;lt;mesh file&amp;gt;]&lt;br /&gt;
	[(-o | --OUT) &amp;lt;output file&amp;gt;]&lt;br /&gt;
	[(-c | --CHKFILE) &amp;lt;checkpoint file to write statistics&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
	[(-G | --IGREEN) &amp;lt;IGREEN for SGT&amp;gt;]&lt;br /&gt;
	[(-200 | --NTISKP_SGT) &amp;lt;NTISKP for SGT&amp;gt;]&lt;br /&gt;
	[(-201 | --INSGT) &amp;lt;SGT input file&amp;gt;]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel; for a 1 Hz run (10 billion points, 40k timesteps), takes about 55 minutes on 800 GPUs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP source|AWP source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]].&lt;br /&gt;
&lt;br /&gt;
=== PostAWP ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To prepare the AWP results for use in post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PostAWP prepares the outputs of AWP so that they can be used with the RWG-authored post-processing code.  Specifically, it undoes the AWP coordinate transformation and reformats the AWP output files into the SGT component order expected by RWG (XX-&amp;gt;YY, YY-&amp;gt;XX, XZ-&amp;gt;-YZ, YZ-&amp;gt;-XZ, and all SGTs are doubled if we are calculating the Z-component), creates separate SGT header files, and calculates MD5 sums on the SGT files.  Calculating the header information requires a number of input files, since lambda, mu, and the location of the impulse must all be included.  The MD5 sums can be calculated separately, using the MD5 wrapper RunMD5sum. &lt;br /&gt;
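The component reordering described above can be sketched as follows.  This is a hypothetical illustration only: the real reformat_awp_mpi is a parallel C code operating on large binary SGT volumes, and the function and key names here are invented.

```python
def remap_awp_to_rwg(sgt, with_z=False):
    """Swap AWP SGT component order into RWG order:
    XX->YY, YY->XX, XZ->-YZ, YZ->-XZ; double all values
    when the Z-component is being calculated."""
    out = dict(sgt)
    # Swap the XX and YY components.
    out["xx"], out["yy"] = sgt["yy"], sgt["xx"]
    # Swap and negate the XZ and YZ components.
    out["xz"] = [-v for v in sgt["yz"]]
    out["yz"] = [-v for v in sgt["xz"]]
    if with_z:
        # All SGTs are doubled for Z-component runs.
        out = {k: [2 * x for x in v] for k, v in out.items()}
    return out
```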
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The AWP code is modified to produce outputs in exactly RWG order&lt;br /&gt;
#The header format for the post-processing code changes&lt;br /&gt;
#We decide not to calculate MD5 sums&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-HIP-SGT/utils/prepare_for_pp.py (this will work for the CPU version of AWP also, despite the path); https://github.com/SCECcode/cybershake-core/SgtHead&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  AWP-GPU-SGT/utils/prepare_for_pp.py&lt;br /&gt;
    SgtHead/bin/reformat_awp_mpi&lt;br /&gt;
    SgtHead/bin/write_head&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make write_head' and 'make reformat_awp_mpi' in the SgtHead/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./prepare_for_pp.py &amp;lt;site&amp;gt; &amp;lt;AWP SGT&amp;gt; &amp;lt;reformatted SGT filename&amp;gt; &amp;lt;modelbox file&amp;gt; &amp;lt;rwg cordfile&amp;gt; &amp;lt;fdloc file&amp;gt; &amp;lt;gridout file&amp;gt; &amp;lt;IN3D file&amp;gt; &amp;lt;AWP media file&amp;gt; &amp;lt;component&amp;gt; &amp;lt;run_id&amp;gt; &amp;lt;header&amp;gt; [frequency]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel, 4 processors on 2 nodes; for a 750 GB SGT, takes about 100 minutes &amp;lt;b&amp;gt;without&amp;lt;/b&amp;gt; the MD5 sums.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]], [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#SgtCoords|RWG cordfile]], [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT file]], [[CyberShake_Code_Base#SGT header file|SGT header file]]&lt;br /&gt;
&lt;br /&gt;
=== RunMD5sum ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Wrapper for performing MD5sums.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; On Titan, we ran into wallclock issues when bundling the MD5sums along with PostAWP.  This wrapper supports performing the MD5 sums separately.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change hash algorithms&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/SgtHead/run_md5sum.sh&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  run_md5sum.sh&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./run_md5sum.sh &amp;lt;file&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 70 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; MD5sum, with filename &amp;lt;RWG SGT filename&amp;gt;.md5&lt;br /&gt;
&lt;br /&gt;
=== NanCheck ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Check the SGTs for anomalies before the post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code checks to be sure the SGTs are the expected size, then checks for NaNs or too many consecutive zeros in the SGT files.&lt;br /&gt;
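The checks described above can be sketched as follows.  This is a simplified illustration with invented names and an invented zero-run threshold; the real check_for_nans is a C code that scans binary SGT files.

```python
import math

def check_sgt(values, expected_len, max_consec_zeros=10):
    """Return a list of problems found in a sequence of SGT samples:
    wrong size, NaNs, or too many consecutive zeros."""
    problems = []
    if len(values) != expected_len:
        problems.append("unexpected size")
    run = 0
    for v in values:
        if math.isnan(v):
            problems.append("NaN found")
            break
        # Track the current run of consecutive zeros.
        run = run + 1 if v == 0.0 else 0
        if run > max_consec_zeros:
            problems.append("too many consecutive zeros")
            break
    return problems
```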
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the number of timesteps in the SGT file.  Currently this is hardcoded, but it should be a command-line parameter.&lt;br /&gt;
#We want to add additional checks.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/SgtTest/&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  perform_checks.py&lt;br /&gt;
    bin/check_for_nans&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the SgtTest/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./perform_checks.py &amp;lt;SGT file&amp;gt; &amp;lt;SGT header file&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 45 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]], [[CyberShake_Code_Base#Sgt Coords|RWG coordinate file]], [[CyberShake_Code_Base#IN3D | IN3D file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
== PP-related codes ==&lt;br /&gt;
&lt;br /&gt;
The following codes are related to the post-processing part of the workflow.&lt;br /&gt;
&lt;br /&gt;
[[File:PP_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the PP part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_PP_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== CheckSgt ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To check the MD5 sums of the SGT files to be sure they match.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; CheckSgt takes the SGT files and their corresponding MD5 sums and checks for agreement.&lt;br /&gt;
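The verification step can be sketched as below.  This is an assumed illustration (the actual CheckSgt.py may differ in detail); it presumes the .md5 files use the standard md5sum output format of '&amp;lt;hex digest&amp;gt;  &amp;lt;filename&amp;gt;'.

```python
import hashlib

def verify_md5(sgt_path, md5_path):
    """Compare an SGT file's MD5 digest against its stored .md5 file."""
    h = hashlib.md5()
    # Hash the (potentially very large) SGT file in 1 MB chunks.
    with open(sgt_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    with open(md5_path) as f:
        expected = f.read().split()[0]
    return h.hexdigest() == expected
```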
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change hashing algorithms.&lt;br /&gt;
#We decide to add additional sanity checks to the beginning of the post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/CheckSgt&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  CheckSgt.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./CheckSgt.py &amp;lt;sgt file&amp;gt; &amp;lt;md5 file&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 90 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT]], SGT MD5 sums&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
=== DirectSynth ===&lt;br /&gt;
&lt;br /&gt;
DirectSynth is the code we currently use to perform the post-processing.  For historical reasons, all of the codes used for CyberShake post-processing are documented here: [https://scec.usc.edu/it/Post-processing_options CyberShake post-processing options] (login required).  The current version uses the [[Rupture_Variation_Generator_v5.5.2|Graves and Pitarka (2022) rupture generator]], RupGen-v5.5.2.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform reciprocity calculations and produce seismograms, intensity measures, and duration measures.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; DirectSynth reads in the SGTs across a group of processes, and hands out tasks (synthesis jobs) to worker processes.  These worker processes read in rupture geometry information from disk and call the RupGen-api to generate full slip histories in memory.  The workers request SGTs from the reader processes over MPI. X and Y component PSA calculations are performed from the resultant seismograms, and RotD and duration calculations are also performed, if requested.  More details about the approach used are available at [[DirectSynth]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We have new intensity measures or other calculations per seismogram to perform.&lt;br /&gt;
#We decide to change the post-processing algorithm.&lt;br /&gt;
#The wrapper needs to be modified if we want to set different custom environment variables.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/DirectSynth&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, original seismogram synthesis code by Rob Graves, X and Y component PSA code by David Okaya, RotD code by Christine Goulet&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake Code Base#Getpar | Getpar]], [[CyberShake Code Base#libcfu | libcfu]], [[CyberShake Code Base#RupGen-api-v5.5.2 | RupGen-api-v5.5.2]], [[CyberShake Code Base#FFTW | FFTW]], [[CyberShake Code Base#libmemcached | libmemcached]] (optional), and [[CyberShake Code Base#memcached | memcached]] (optional)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  direct_synth.py&lt;br /&gt;
    utils/pegasus_wrappers/invoke_memcached.sh&lt;br /&gt;
      memcached&lt;br /&gt;
    bin/direct_synth  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; &lt;br /&gt;
#Compile RupGen-api first.&lt;br /&gt;
#Edit the makefile in DirectSynth/src .  Check the following variables:&lt;br /&gt;
##BASE_DIR should point to the top-level CyberShake install directory&lt;br /&gt;
##LIBCFU should point to the libcfu install directory&lt;br /&gt;
##CUR_RG_LIB should point to the RupGen-api-&amp;lt;version&amp;gt;/lib directory&lt;br /&gt;
##LDLIBS should have the correct paths to the libcfu and libmemcached lib directories&lt;br /&gt;
##CUR_RG_INC should point to the RupGen-api-&amp;lt;version&amp;gt;/include directory&lt;br /&gt;
##IFLAGS should have the correct paths to the libcfu and libmemcached include directories&lt;br /&gt;
#Run 'make direct_synth' in DirectSynth/src.&lt;br /&gt;
&lt;br /&gt;
direct_synth.py contains an if statement that sets up the execution string based on the MPI launch wrapper on the target system (such as 'mpiexec', 'ibrun', or 'jsrun').  You may need to add a new branch to this if statement if a new MPI wrapper is used.  This is also the place to change the number of cores used per node.  You may also need to add hard-coded paths to memcached here.&lt;br /&gt;
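The launcher dispatch described above can be sketched as follows.  The function name, flags, and core counts here are invented for illustration; the real direct_synth.py differs in detail.

```python
def build_launch_cmd(mpi_wrapper, nodes, cores_per_node, exe, args):
    """Build an MPI launch command for a given system's launch wrapper."""
    total = nodes * cores_per_node
    if mpi_wrapper == "mpiexec":
        cmd = ["mpiexec", "-n", str(total)]
    elif mpi_wrapper == "ibrun":
        # ibrun reads the task layout from the batch scheduler.
        cmd = ["ibrun"]
    elif mpi_wrapper == "jsrun":
        cmd = ["jsrun", "-n", str(total), "-a", "1"]
    else:
        raise ValueError("unknown MPI wrapper: %s" % mpi_wrapper)
    return cmd + [exe] + list(args)
```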
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
direct_synth.py&lt;br /&gt;
 stat=&amp;lt;site short name&amp;gt;&lt;br /&gt;
 slat=&amp;lt;site lat&amp;gt;&lt;br /&gt;
 slon=&amp;lt;site lon&amp;gt;&lt;br /&gt;
 run_id=&amp;lt;run id&amp;gt;&lt;br /&gt;
 sgt_handlers=&amp;lt;number of SGT handler processes; must be enough for the SGTs to be read into memory&amp;gt;&lt;br /&gt;
 debug=&amp;lt;print logs for each process; 1 is yes, 0 no&amp;gt;&lt;br /&gt;
 max_buf_mb=&amp;lt;buffer size in MB for each worker to use for storing SGT information; 512 recommended&amp;gt;&lt;br /&gt;
 rupture_spacing=&amp;lt;'uniform' or 'random' hypocenter spacing&amp;gt;&lt;br /&gt;
 ntout=&amp;lt;nt for seismograms&amp;gt;&lt;br /&gt;
 dtout=&amp;lt;dt for seismograms&amp;gt;&lt;br /&gt;
 rup_list_file=&amp;lt;input file containing ruptures to process&amp;gt;&lt;br /&gt;
 rv_info_file=&amp;lt;input file containing rvfrac and random seed for each rupture variation&amp;gt;&lt;br /&gt;
 sgt_xfile=&amp;lt;input SGT X file&amp;gt;&lt;br /&gt;
 sgt_yfile=&amp;lt;input SGT Y file&amp;gt;&lt;br /&gt;
 sgt_zfile=&amp;lt;optional, input SGT Z file&amp;gt;&lt;br /&gt;
 x_header=&amp;lt;input SGT X header&amp;gt;&lt;br /&gt;
 y_header=&amp;lt;input SGT Y header&amp;gt;&lt;br /&gt;
 z_header=&amp;lt;optional, input SGT Z header&amp;gt;&lt;br /&gt;
 det_max_freq=&amp;lt;maximum frequency of deterministic part&amp;gt;&lt;br /&gt;
 stoch_max_freq=&amp;lt;maximum frequency of stochastic part&amp;gt;&lt;br /&gt;
 run_psa=&amp;lt;'1' to run X and Y component PSA, '0' to not&amp;gt;&lt;br /&gt;
 run_rotd=&amp;lt;'1' to run RotD calculations, '0' to not&amp;gt;&lt;br /&gt;
 run_durations=&amp;lt;'1' to run duration calculation, '0' to not&amp;gt;&lt;br /&gt;
 run_period_durations=&amp;lt;'1' to run period duration calculations, '0' to not&amp;gt;&lt;br /&gt;
 simulation_out_pointsX=&amp;lt;the number of components: 2 for just horizontal, 3 to include vertical&amp;gt;&lt;br /&gt;
 simulation_out_pointsY=1&lt;br /&gt;
 simulation_out_timesamples=&amp;lt;same as ntout&amp;gt;&lt;br /&gt;
 simulation_out_timeskip=&amp;lt;same as dtout&amp;gt;&lt;br /&gt;
 surfseis_rspectra_seismogram_units=cmpersec&lt;br /&gt;
 surfseis_rspectra_output_units=cmpersec2&lt;br /&gt;
 surfseis_rspectra_output_type=aa&lt;br /&gt;
 surfseis_rspectra_period=all&lt;br /&gt;
 surfseis_rspectra_apply_filter_highHZ=&amp;lt;high filter, 5.0 for 1 Hz runs, 20.0 or higher for 10 Hz runs&amp;gt;&lt;br /&gt;
 surfseis_rspectra_apply_byteswap=no&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel, typically on 3840 processors; for 750 GB SGTs with ~7000 ruptures, takes about 12 hours.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT]], [[CyberShake_Code_Base#SGT header file|SGT headers]], [[CyberShake_Code_Base#rupture list file|rupture list file]], [[CyberShake_Code_Base#rupture variation info file|rupture variation info file]], [[CyberShake_Rupture_Files|rupture geometry files]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | Seismograms]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
== Data Product Codes ==&lt;br /&gt;
&lt;br /&gt;
The software in this section takes the data products produced by the SGT and post-processing stages, adds some of it to the database, and creates final data products.  Note that all these codes should be installed on a server close to the database, to reduce insertion and query time.  Currently these are all installed on SCEC disks and accessed from shock.usc.edu.&lt;br /&gt;
&lt;br /&gt;
[[File:Data_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the data product of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_data_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== Load Amps ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Load data from output files into the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code loads either PSA, RotD, or Duration data into the database, depending on command-line options.  It also performs sanity checks on the PSA data being inserted: values must be between 0.008 and 8400 cm/s2, although some values below 0.008 are still passed through for small-magnitude events at large distances.  If this constraint is violated, the insert aborts.  Note that if LoadAmps needs to be rerun, sometimes the database must be cleaned out first, as data from the previous attempt may have been inserted successfully and will cause duplicate key errors if you try to insert the same data again.&lt;br /&gt;
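The range check described above can be sketched as follows.  The bounds come from the text; the small-magnitude/large-distance exemption is simplified here into an invented flag, and the function name is hypothetical.

```python
# PSA sanity bounds, in cm/s^2.
PSA_MIN, PSA_MAX = 0.008, 8400.0

def check_psa_value(value, allow_small=False):
    """Return True if the value may be inserted, False if the load should abort."""
    if value > PSA_MAX:
        return False
    if value < PSA_MIN:
        # Small-magnitude events at large distances may still pass.
        return allow_small
    return True
```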
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the sanity checks on data inserts.&lt;br /&gt;
#We modify the format of the PSA, RotD, or Duration files.&lt;br /&gt;
#We add new types of data to insert.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#We add a new server.  To add a new server, in addition to providing a command-line option for it, you will need to create a Hibernate config file.  You can start with moment.cfg.xml or focal.cfg.xml and edit lines 7-16 appropriately.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/CyberCommands&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Joshua Garcia, Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; LoadAmps calls CyberCommands, a Java code with a long list of dependencies (all of these are checked into the Java project):&lt;br /&gt;
&lt;br /&gt;
*Ant&lt;br /&gt;
*Apache Commons&lt;br /&gt;
*Hibernate&lt;br /&gt;
*MySQL bindings&lt;br /&gt;
*Xerces&lt;br /&gt;
*DOM4J&lt;br /&gt;
*Log4J&lt;br /&gt;
*Java 1.6+&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  insert_dir.sh&lt;br /&gt;
    CyberLoadAmps_SC&lt;br /&gt;
      cybercommands_SC.jar&lt;br /&gt;
        CyberLoadamps.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Check out CyberCommands into Eclipse.  Create the cybercommands_SC.jar file using Eclipse's JAR build framework and the cybercommands_SC.jardesc description file.  Install cybercommands_SC.jar and the required JAR files on the server.  Point insert_dir.sh to CyberLoadAmps_SC, and CyberLoadAmps_SC to cybercommands_SC.jar.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: CyberLoadAmps [-r | -d | -z | -u][-c] [-d] [-periods periods] [-run RunID] [-p directory] [-server name] [-z] [-help] [-i insertion_values]&lt;br /&gt;
       [-u] [-f]&lt;br /&gt;
 -i &amp;lt;insertion_values&amp;gt;   Which values to insert -&lt;br /&gt;
                         gm:	geometric mean PSA data (default)&lt;br /&gt;
                         xy:	X and Y component PSA data&lt;br /&gt;
                         gmxy:  Geometric mean and X and Y components&lt;br /&gt;
 -run &amp;lt;RunID&amp;gt;            Run ID - this option is required&lt;br /&gt;
 -p &amp;lt;directory&amp;gt;          file path with spectral acceleration files,&lt;br /&gt;
                         either top-level directory or zip file - this option is required&lt;br /&gt;
 -server &amp;lt;name&amp;gt;          server name (focal, surface, intensity, moment,&lt;br /&gt;
                         or csep-x) - this option is required&lt;br /&gt;
 -periods &amp;lt;periods&amp;gt;      Comma-delimited periods to insert&lt;br /&gt;
 -c                      Convert values from g to cm/sec^2&lt;br /&gt;
 -d                      Assume one BSA file per rupture, with embedded&lt;br /&gt;
                         header information.&lt;br /&gt;
 -f                      Don't apply value checks to insertion values; use&lt;br /&gt;
                         with care!.&lt;br /&gt;
 -help                   print this message&lt;br /&gt;
 -r                      Read rotd files (instead of bsa.)&lt;br /&gt;
 -u                      Read duration files (instead of bsa.)&lt;br /&gt;
 -z                      Read zip files instead of bsa.&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for 5 periods, takes about 10 minutes, though runtime is highly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
=== Check DB Site ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Verify that data was correctly loaded into the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a list of components (or type IDs) to check for a run ID, and verifies that there is one entry for every rupture variation.  If some rupture variations are missing, a file is produced which lists the missing source, rupture, rupture variation tuples.&lt;br /&gt;
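The completeness check described above amounts to a set difference between the expected and inserted rupture variations; a minimal sketch, with invented names (the real code queries the database directly):

```python
def find_missing_variations(expected, inserted):
    """Given iterables of (source_id, rupture_id, rup_var_id) tuples,
    return the sorted list of tuples present in expected but not inserted."""
    return sorted(set(expected) - set(inserted))
```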
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/db/CheckDBDataForSite.java and DBConnect.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan (CheckDBDataForSite.java), Nitin Gupta, Vipin Gupta, Phil Maechling (DBConnect.java)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Both are checked into the CyberShake project:&lt;br /&gt;
&lt;br /&gt;
*Apache Commons&lt;br /&gt;
*MySQL bindings&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  check_db.sh&lt;br /&gt;
    CheckDBDataForSite.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Check out CheckDBDataForSite.java and DBConnect.java.  Compile them by running 'javac -classpath mysql-connector-java-5.0.5-bin.jar:commons-cli-1.0.jar DBConnect.java CheckDBDataForSite.java'.  The paths to the MySQL bindings jar and the Apache Commons jar may be different depending on your installation.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;usage: CheckDBDataForSite&lt;br /&gt;
 -p &amp;lt;periods&amp;gt;     Comma-separated list of periods to check, for geometric&lt;br /&gt;
                  and rotd.&lt;br /&gt;
 -t &amp;lt;type_ids&amp;gt;    Comma-separated list of type IDs to check, for duration.&lt;br /&gt;
 -c &amp;lt;component&amp;gt;   Component type (geometric, rotd, duration) to check.&lt;br /&gt;
 -h,--help        Print help for CheckDBDataForSite&lt;br /&gt;
 -o &amp;lt;output&amp;gt;      Path to output file, if something is missing (required).&lt;br /&gt;
 -r &amp;lt;run_id&amp;gt;      Run ID to check (required).&lt;br /&gt;
 -s &amp;lt;server&amp;gt;      DB server to query against.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; typically takes just a few seconds, though runtime is highly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Missing variations file | Missing variations file]]&lt;br /&gt;
&lt;br /&gt;
=== DB Report ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Produce a database report, a data product which Rob Graves used for a time.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, queries the database for PSA values for all components, and writes the output to a text file.  The list of periods and the DB config parameters are specified in an XML config file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed: the DB connection parameters in default.xml would need to be edited.&lt;br /&gt;
#We want results for different periods: edit default.xml.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/reports/db_report_gen.py .  The file default.xml in the same directory is also needed; it can be generated by editing and running conf_get.py, also in the same directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; MySQLdb&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  db_report_gen.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None, all code is Python.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: db_report_gen.py [options] SITE_SHORT_NAME&lt;br /&gt;
&lt;br /&gt;
NOTE: defaults are loaded from defaults.xml and can be edited manually&lt;br /&gt;
	  or overridden with conf_gen.py&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  -e ERF_ID, --erfID=ERF_ID&lt;br /&gt;
                        ERF ID for Report (default = none)&lt;br /&gt;
  -f FILENAME, --file=FILENAME&lt;br /&gt;
                        Store Results to a file instead of STDOUT. If a&lt;br /&gt;
                        directory is given, a name will be auto generated.&lt;br /&gt;
  -i, --id              Flag for specifying site ID instead of Short Name&lt;br /&gt;
                        (default uses Short Name)&lt;br /&gt;
  --hypo, --hpyocenter  Flag for appending hypocenter locations to result&lt;br /&gt;
  -l LIMIT, --limit=LIMIT&lt;br /&gt;
                        Limit the total number of rusults, or 0 for no limit&lt;br /&gt;
                        (default = 0)&lt;br /&gt;
  -o, --sort            SLOW: Force SQL Order By statement for sorting. It&lt;br /&gt;
                        will probably come out sorted, but if it doesn't, you&lt;br /&gt;
                        can use this. (default will not sort)&lt;br /&gt;
  -p PERIODS, --periods=PERIODS&lt;br /&gt;
                        Comma separated period values (default = 3.0,5.0,10.0)&lt;br /&gt;
  --pr, --print-runs    Print run IDs for site and optionally ERF/Rup Var&lt;br /&gt;
                        Scen/SGT Var IDs&lt;br /&gt;
  -r RUP_VAR_SCENARIO_ID, --rupVarID=RUP_VAR_SCENARIO_ID&lt;br /&gt;
                        Rupture Variation Scenario ID for Report (default =&lt;br /&gt;
                        none)&lt;br /&gt;
  --ri=RUN_ID, --runID=RUN_ID&lt;br /&gt;
                        Allows you to specify a run ID to use (default uses&lt;br /&gt;
                        latest compatible run ID)&lt;br /&gt;
  -R RUPTURE, --rupture=RUPTURE&lt;br /&gt;
                        Only give information on specified rupture. Must be&lt;br /&gt;
                        acompanied by -S/--source flag (default shows all&lt;br /&gt;
                        ruptures)&lt;br /&gt;
  -s SGT_VAR_ID, --sgtVarID=SGT_VAR_ID&lt;br /&gt;
                        SGT Variation ID for Report (default = none)&lt;br /&gt;
  -S SOURCE, --source=SOURCE&lt;br /&gt;
                        Only give information on specified source. To specify&lt;br /&gt;
                        rupture, see -R option (default shows all sources)&lt;br /&gt;
  --s_im, --sort-ims    Sort output by IM value (increasing)...may be slow!&lt;br /&gt;
  -v, --verbose         Verbosity Flag (default = False)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; about 1 minute, though runtime is highly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; default.xml&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#DB Report file | DB Report file]]&lt;br /&gt;
&lt;br /&gt;
=== Curve Calc ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Calculate CyberShake hazard curves alongside comparison GMPEs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, component, and period, queries the database for the appropriate IM values, and calculates a hazard curve in the desired format. Comparison GMPE curves can also be plotted.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#New IM types need to be supported.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; The CyberShake curve calculator is part of the OpenSHA codebase.  The specific Java class is org.opensha.sha.cybershake.plot.HazardCurvePlotter (available via https://github.com/opensha/opensha-cybershake/tree/master/src/main/java/org/opensha/sha/cybershake/plot), but it has a complex set of Java dependencies.  To compile and run, you should follow the instructions on http://www.opensha.org/trac/wiki/SettingUpEclipse to access the source.  The curve calculator is also wrapped by curve_plot_wrapper.sh, in https://github.com/SCECcode/cybershake-tools/blob/master/HazardCurveGeneration/curve_plot_wrapper.sh .&lt;br /&gt;
&lt;br /&gt;
The OpenSHA project also has configuration files for various GMPEs, for UCERF2, and for the output formats preferred by Tom and Rob, in src/org/opensha/sha/cybershake/conf.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Standard OpenSHA dependencies&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  curve_plot_wrapper.sh&lt;br /&gt;
    HazardCurvePlotter.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Use the OpenSHA build process if building from source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
usage: HazardCurvePlotter [-?] [-af &amp;lt;arg&amp;gt;] [-benchmark] [-c] [-cmp &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-comp &amp;lt;arg&amp;gt;] [-cvmvs] [-e &amp;lt;arg&amp;gt;] [-ef &amp;lt;arg&amp;gt;] [-f] [-fvs &amp;lt;arg&amp;gt;] [-h&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-imid &amp;lt;arg&amp;gt;] [-imt &amp;lt;arg&amp;gt;] [-n] [-novm] [-o &amp;lt;arg&amp;gt;] [-p&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-pf &amp;lt;arg&amp;gt;] [-pl &amp;lt;arg&amp;gt;] [-R &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;] [-s &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-sgt &amp;lt;arg&amp;gt;] [-sgtsym] [-t &amp;lt;arg&amp;gt;] [-v &amp;lt;arg&amp;gt;] [-vel &amp;lt;arg&amp;gt;] [-w&lt;br /&gt;
       &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                            Display this message&lt;br /&gt;
 -af,--atten-rel-file &amp;lt;arg&amp;gt;           XML Attenuation Relationship&lt;br /&gt;
                                      description file(s) for comparison.&lt;br /&gt;
                                      Multiple files should be comma&lt;br /&gt;
                                      separated&lt;br /&gt;
 -benchmark,--benchmark-test-recalc   Forces recalculation of hazard&lt;br /&gt;
                                      curves to test calculation speed.&lt;br /&gt;
                                      Newly recalculated curves are not&lt;br /&gt;
                                      kept and the original curves are&lt;br /&gt;
                                      plotted.&lt;br /&gt;
 -c,--calc-only                       Only calculate and insert the&lt;br /&gt;
                                      CyberShake curves, don't make plots.&lt;br /&gt;
                                      If a curve already exists, it will&lt;br /&gt;
                                      be skipped.&lt;br /&gt;
 -cmp,--component &amp;lt;arg&amp;gt;               Intensity measure component.&lt;br /&gt;
                                      Options: GEOM,X,Y,RotD100,RotD50,&lt;br /&gt;
                                      Default: GEOM&lt;br /&gt;
 -comp,--compare-to &amp;lt;arg&amp;gt;             Compare to a specific Run ID (or&lt;br /&gt;
                                      multiple IDs, comma separated)&lt;br /&gt;
 -cvmvs,--cvm-vs30                    Option to use Vs30 value from the&lt;br /&gt;
                                      velocity model itself in GMPE&lt;br /&gt;
                                      calculations rather than, for&lt;br /&gt;
                                      example, the Wills 2006 value.&lt;br /&gt;
 -e,--erf-id &amp;lt;arg&amp;gt;                    ERF ID&lt;br /&gt;
 -ef,--erf-file &amp;lt;arg&amp;gt;                 XML ERF description file for&lt;br /&gt;
                                      comparison&lt;br /&gt;
 -f,--force-add                       Flag to add curves to db without&lt;br /&gt;
                                      prompt&lt;br /&gt;
 -fvs,--force-vs30 &amp;lt;arg&amp;gt;              Option to force the given Vs30 value&lt;br /&gt;
                                      to be used in GMPE calculations.&lt;br /&gt;
 -h,--height &amp;lt;arg&amp;gt;                    Plot height (default = 500)&lt;br /&gt;
 -imid,--im-type-id &amp;lt;arg&amp;gt;             Intensity measure type ID. If not&lt;br /&gt;
                                      supplied, will be detected from im&lt;br /&gt;
                                      type/component/period parameters&lt;br /&gt;
 -imt,--im-type &amp;lt;arg&amp;gt;                 Intensity measure type. Options: SA,&lt;br /&gt;
                                      Default: SA&lt;br /&gt;
 -n,--no-add                          Flag to not automatically calculate&lt;br /&gt;
                                      curves not in the database&lt;br /&gt;
 -novm,--no-vm-colors                 Disables Velocity Model coloring&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;                Output directory&lt;br /&gt;
 -p,--period &amp;lt;arg&amp;gt;                    Period(s) to calculate. Multiple&lt;br /&gt;
                                      periods should be comma separated&lt;br /&gt;
                                      (default: 3)&lt;br /&gt;
 -pf,--password-file &amp;lt;arg&amp;gt;            Path to a file that contains the&lt;br /&gt;
                                      username and password for inserting&lt;br /&gt;
                                      curves into the database. Format&lt;br /&gt;
                                      should be &amp;quot;user:pass&amp;quot;&lt;br /&gt;
 -pl,--plot-chars-file &amp;lt;arg&amp;gt;          Specify the path to a plot&lt;br /&gt;
                                      characteristics XML file&lt;br /&gt;
 -R,--run-id &amp;lt;arg&amp;gt;                    Run ID&lt;br /&gt;
 -r,--rv-id &amp;lt;arg&amp;gt;                     Rupture Variation ID&lt;br /&gt;
 -s,--site &amp;lt;arg&amp;gt;                      Site short name&lt;br /&gt;
 -sgt,--sgt-var-id &amp;lt;arg&amp;gt;              SGT Variation ID&lt;br /&gt;
 -sgtsym,--sgt-colors                 Enables SGT specific symbols&lt;br /&gt;
 -t,--type &amp;lt;arg&amp;gt;                      Plot save type. Options are png,&lt;br /&gt;
                                      pdf, jpg, and txt. Multiple types&lt;br /&gt;
                                      can be comma separated (default is&lt;br /&gt;
                                      pdf)&lt;br /&gt;
 -v,--vs30 &amp;lt;arg&amp;gt;                      Specify default Vs30 for sites with&lt;br /&gt;
                                      no Vs30 data, or leave blank for&lt;br /&gt;
                                      default value. Otherwise, you will&lt;br /&gt;
                                      be prompted to enter vs30&lt;br /&gt;
                                      interactively if needed.&lt;br /&gt;
 -vel,--vel-model-id &amp;lt;arg&amp;gt;            Velocity Model ID&lt;br /&gt;
 -w,--width &amp;lt;arg&amp;gt;                     Plot width (default = 600)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; about 30 seconds per curve, though runtime depends heavily on database load and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; ERF config file, GMPE config files&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Hazard Curve | Hazard Curve]]&lt;br /&gt;
&lt;br /&gt;
=== Disaggregate ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Disaggregate the curve results to determine the largest contributing sources.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, a probability or IM level, and a period to disaggregate at.  It produces disaggregation distance-magnitude plots and also a list of the % contribution of each source.&lt;br /&gt;
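As an illustration of the source-contribution output, a sketch (hypothetical rates and source names, not the OpenSHA DisaggregationPlotter):&lt;br /&gt;

```python
# Sketch of source disaggregation: each source's percent contribution to the
# total annual exceedance rate at a single IM level (IML).

def disaggregate(source_rates, iml):
    """source_rates: {source_name: [(annual_rate, IM value), ...]}."""
    contrib = {src: sum(rate for rate, im in events if im > iml)
               for src, events in source_rates.items()}
    total = sum(contrib.values())
    return {src: 100.0 * r / total for src, r in contrib.items()}

rates = {"Source A": [(1e-3, 0.6), (5e-4, 0.3)],
         "Source B": [(2e-4, 0.7)]}
print(disaggregate(rates, 0.5))  # percent contribution per source
```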
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#We want to support different kinds of disaggregation, or for a different kind of ERF.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; The Disaggregator is part of the OpenSHA codebase.  The specific Java class is org.opensha.sha.cybershake.plot.DisaggregationPlotter (available via https://github.com/opensha/opensha-cybershake/tree/master/src/main/java/org/opensha/sha/cybershake/plot), but it has a complex set of Java dependencies.  To compile and run, you should follow the instructions on http://www.opensha.org/trac/wiki/SettingUpEclipse to access the source.  The disaggregation plotter is also wrapped by disagg_plot_wrapper.sh, in https://github.com/SCECcode/cybershake-tools/blob/master/HazardCurveGeneration/disagg_plot_wrapper.sh .&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner, Nitin Gupta, Vipin Gupta&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Standard OpenSHA dependencies&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  disagg_plot_wrapper.sh&lt;br /&gt;
    DisaggregationPlotter.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Use the standard OpenSHA building process if building from source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;usage: DisaggregationPlotter [-?] [-af &amp;lt;arg&amp;gt;] [-cmp &amp;lt;arg&amp;gt;] [-e &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-fvs &amp;lt;arg&amp;gt;] [-i &amp;lt;arg&amp;gt;] [-imid &amp;lt;arg&amp;gt;] [-imt &amp;lt;arg&amp;gt;] [-o &amp;lt;arg&amp;gt;] [-p&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-pr &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;] [-R &amp;lt;arg&amp;gt;] [-s &amp;lt;arg&amp;gt;] [-sgt &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-t &amp;lt;arg&amp;gt;] [-vel &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                    Display this message&lt;br /&gt;
 -af,--atten-rel-file &amp;lt;arg&amp;gt;   XML Attenuation Relationship description&lt;br /&gt;
                              file(s) for comparison. Multiple files&lt;br /&gt;
                              should be comma separated&lt;br /&gt;
 -cmp,--component &amp;lt;arg&amp;gt;       Intensity measure component. Options:&lt;br /&gt;
                              GEOM,X,Y,RotD100,RotD50, Default: GEOM&lt;br /&gt;
 -e,--erf-id &amp;lt;arg&amp;gt;            ERF ID&lt;br /&gt;
 -fvs,--force-vs30 &amp;lt;arg&amp;gt;      Option to force the given Vs30 value to be&lt;br /&gt;
                              used in GMPE calculations.&lt;br /&gt;
 -i,--imls &amp;lt;arg&amp;gt;              Intensity Measure Levels (IMLs) to&lt;br /&gt;
                              disaggregate at. Multiple IMLs should be&lt;br /&gt;
                              comma separated.&lt;br /&gt;
 -imid,--im-type-id &amp;lt;arg&amp;gt;     Intensity measure type ID. If not supplied,&lt;br /&gt;
                              will be detected from im&lt;br /&gt;
                              type/component/period parameters&lt;br /&gt;
 -imt,--im-type &amp;lt;arg&amp;gt;         Intensity measure type. Options: SA,&lt;br /&gt;
                              Default: SA&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;        Output directory&lt;br /&gt;
 -p,--period &amp;lt;arg&amp;gt;            Period(s) to calculate. Multiple periods&lt;br /&gt;
                              should be comma separated (default: 3)&lt;br /&gt;
 -pr,--probs &amp;lt;arg&amp;gt;            Probabilities (1 year) to disaggregate at.&lt;br /&gt;
                              Multiple probabilities should be comma&lt;br /&gt;
                              separated.&lt;br /&gt;
 -r,--rv-id &amp;lt;arg&amp;gt;             Rupture Variation ID&lt;br /&gt;
 -R,--run-id &amp;lt;arg&amp;gt;            Run ID&lt;br /&gt;
 -s,--site &amp;lt;arg&amp;gt;              Site short name&lt;br /&gt;
 -sgt,--sgt-var-id &amp;lt;arg&amp;gt;      SGT Variation ID&lt;br /&gt;
 -t,--type &amp;lt;arg&amp;gt;              Plot save type. Options are png, pdf, and&lt;br /&gt;
                              txt. Multiple types can be comma separated&lt;br /&gt;
                              (default is pdf)&lt;br /&gt;
 -vel,--vel-model-id &amp;lt;arg&amp;gt;    Velocity Model ID&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; typically about 30 seconds, though runtime depends heavily on database load and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Disaggregation file | Disaggregation file]]&lt;br /&gt;
&lt;br /&gt;
== Stochastic codes ==&lt;br /&gt;
&lt;br /&gt;
With CyberShake, we also have the option to augment a completed run with stochastic seismograms.  The following codes are used to add stochastic high-frequency content to an already-completed low-frequency deterministic run.&lt;br /&gt;
&lt;br /&gt;
[[File:stochastic workflow overview.png|thumb|right|300px|Overview of the codes involved in the stochastic part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/stochastic_workflow_overview.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== Velocity Info ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To determine slowness-averaged VsX values for a CyberShake site, from UCVM.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Velocity Info takes a location, a velocity model, and grid spacing information and queries UCVM to generate three VsX values needed by the site response:&lt;br /&gt;
#Vs30, calculated as: 30 / sum( 1 / (Vs sampled from [0.5, 29.5] at 1 meter increments, for 30 values) )&lt;br /&gt;
#Vs5H, like Vs30 but calculated over the shallowest 5*gridspacing meters.  So if gridspacing=100m, Vs5H = 500 / sum( 1 / (Vs sampled from [0.5, 499.5] at 1 meter increments, for 500 values) )&lt;br /&gt;
#VsD5H, like Vs30, but calculated over gridspacing increments instead of 1 meter, with the first and last samples weighted half as much.  So if gridspacing=100m, VsD5H = 5 / sum( w / (Vs sampled from [0, 500] at 100 meter increments, for 6 values, with w=0.5 for the first and last samples and w=1 otherwise) )&lt;br /&gt;
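The three definitions can be sketched as slowness (harmonic) averages.  This is a reading of the descriptions above (including the "start and end are weighted half as much" rule for VsD5H), not the retrieve_vs.c source; vs() stands in for a UCVM Vs query.&lt;br /&gt;

```python
# Slowness-averaged VsX values, following the three definitions above.
# vs(depth_m) stands in for a UCVM Vs query at a depth in meters.

def vs_slowness_avg(vs, start, stop, step, half_weight_ends=False):
    n = int(round((stop - start) / step)) + 1
    depths = [start + i * step for i in range(n)]
    weights = [1.0] * n
    if half_weight_ends:
        weights[0] = weights[-1] = 0.5
    # harmonic (slowness) average: total weight over summed weighted slowness
    return sum(weights) / sum(w / vs(d) for w, d in zip(weights, depths))

def vs30(vs):
    # 30 samples at 1 m increments, depths 0.5 .. 29.5 m
    return vs_slowness_avg(vs, 0.5, 29.5, 1.0)

def vs5h(vs, gridspacing):
    # like Vs30, but over the shallowest 5*gridspacing meters
    return vs_slowness_avg(vs, 0.5, 5 * gridspacing - 0.5, 1.0)

def vsd5h(vs, gridspacing):
    # sampled at gridspacing increments from 0 to 5*gridspacing,
    # with the first and last samples weighted half
    return vs_slowness_avg(vs, 0.0, 5 * gridspacing, gridspacing,
                           half_weight_ends=True)

print(vs30(lambda d: 500.0))  # a uniform profile returns the uniform value
```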
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We need to support more than one model - for instance, if the CyberShake site box (not simulation volume) spans multiple models.  The code to parse the model string and load models in initialize_ucvm() would need to be changed. &lt;br /&gt;
#We want to support new kinds of velocity values.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/blob/master/HFSim_mem/src/retrieve_vs.c&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#UCVM | UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  retrieve_vs&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make retrieve_vs'&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./retrieve_vs &amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;model&amp;gt; &amp;lt;gridspacing&amp;gt; &amp;lt;out filename&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes about 15 seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Velocity_Info file|Velocity Info file]]&lt;br /&gt;
&lt;br /&gt;
=== Local VM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a &amp;quot;local&amp;quot; 1D velocity file, required for the high-frequency codes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Local VM takes in an input file containing a 1D velocity model.  It then calculates Qs from these values and writes all the velocity data to a new file.  For all Study 15.12 runs, we used the LA Basin 1D model from the BBP, v14.3.0.  It's registered in the RLS, and is located at /home/scec-02/cybershk/runs/genslip_nr_generic1d-gp01.vmod.&lt;br /&gt;
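A minimal sketch of this step, assuming the common convention Qs = 50*Vs (Vs in km/s) and Qp = 2*Qs; that rule is an assumption here and is not confirmed to be what gen_local_vel.py uses.&lt;br /&gt;

```python
# Append Q values to a 1D velocity model.
# ASSUMPTION: Qs = 50 * Vs (Vs in km/s) and Qp = 2 * Qs -- a convention often
# used with these codes, but not verified against gen_local_vel.py.

def add_q(layers):
    """layers: list of (thickness_km, vp_kms, vs_kms, density)."""
    out = []
    for th, vp, vs, rho in layers:
        qs = 50.0 * vs
        qp = 2.0 * qs
        out.append((th, vp, vs, rho, qp, qs))
    return out

model = [(0.1, 1.8, 0.5, 2.0), (1.0, 3.2, 1.8, 2.5)]
for layer in add_q(model):
    print(" ".join(f"{v:.2f}" for v in layer))
```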
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the algorithm for calculating Qs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/blob/master/HFSim_mem/gen_local_vel.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, modified from Rob Graves' code&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  gen_local_vel.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./gen_local_vel.py &amp;lt;1D velocity model&amp;gt; &amp;lt;output&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes less than a second.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#BBP velocity file|BBP 1D velocity file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Local VM file|Local VM file]]&lt;br /&gt;
&lt;br /&gt;
=== Create Dirs ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To create a directory for each source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The high-frequency codes produce many intermediate files.  To avoid overloading the filesystem, Create Dirs creates a separate directory for every source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; This code is basically just a wrapper around mkdir, and is unlikely to need changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/blob/master/HFSim_mem/create_dirs.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  create_dirs.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt; &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Usage: ./create_dirs.py &amp;lt;file with list of dirs&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial, takes just a few seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; File with a directory to create (a source ID) on each line.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; None.&lt;br /&gt;
&lt;br /&gt;
=== HF Synth ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; HF Synth generates a high-frequency stochastic seismogram for one or more rupture variations.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code wraps multiple Broadband Platform (BBP) codes to reduce the number of invocations required.  Specifically, it calls:&lt;br /&gt;
#srf2stoch_lite(), a reduced-memory version of srf2stoch.  We have modified it to call rupgen_genslip() to generate the SRF, rather than reading it in from disk.&lt;br /&gt;
#hfsim(), a wrapper for:&lt;br /&gt;
##hb_high(), Rob Graves's original BBP code to produce the seismograms&lt;br /&gt;
##wcc_getpeak(), which calculates PGA for the seismogram&lt;br /&gt;
##wcc_siteamp14(), which performs site amplification.&lt;br /&gt;
&lt;br /&gt;
Vs30 is required, so if it is not passed as a command-line argument, UCVM is called to determine it.&lt;br /&gt;
&lt;br /&gt;
Additionally, hf_synth_lite is able to handle processing on multiple rupture variations, to further reduce the number of invocations.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#A new version of one of Rob's codes - the high-frequency generator or the site amplification - is needed.  We have tried to use whatever the most recent version is on the BBP, for consistency.&lt;br /&gt;
#New velocity parameters are needed for the site amplification.&lt;br /&gt;
#The format of the rupture geometry files changes.&lt;br /&gt;
&lt;br /&gt;
The makefile needs to be changed if the path to libmemcached, UCVM, Getpar, or the rupture generator changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/tree/master/HFSim_mem&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; wrapper by Scott Callaghan, hb_high(), wcc_getpeak(), and wcc_siteamp14() by Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]], [[CyberShake_Code_Base#UCVM | UCVM]], [[CyberShake_Code_Base#RupGen-api-v3.3.1 | rupture generator]], [[CyberShake_Code_Base#libmemcached | libmemcached]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  hf_synth_lite&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in src.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt; There is no 'help' usage string, but here's a sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/projects/sciteam/jmz/CyberShake/software/HFSim_mem/bin/hf_synth_lite&lt;br /&gt;
   stat=OSI slat=34.6145 slon=-118.7235&lt;br /&gt;
   rup_geom_file=e36_rv6_121_0.txt source_id=121 rupture_id=0&lt;br /&gt;
   num_rup_vars=5 rup_vars=(0,0,0);(1,1,0);(2,2,0);(3,3,0);(4,4,0)&lt;br /&gt;
   outfile=121/Seismogram_OSI_4331_121_0_hf_t0.grm&lt;br /&gt;
   dx=2.0 dy=2.0 tlen=300.0 dt=0.025&lt;br /&gt;
   do_site_response=1 vs30=359.1 debug=0 vmod=LA_Basin_BBP_14.3.0.local&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes from a few seconds up to a minute per rupture variation, depending on the size of the fault surface.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Local_VM_file | Local velocity file]], [[CyberShake_Rupture_Files#UCERF2_Rupture_Geometry_Files | rupture geometry file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; High-frequency seismograms, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== Combine HF Synth ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code combines the seismograms produced by HF Synth so that there is a single seismogram file per source/rupture combination.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Since we split up work so that each HF Synth job takes a chunk of rupture variations, we may end up with multiple seismogram files per rupture, each containing some of the rupture variations.  This script concatenates the files, using cat, into a single file, ready to be worked on later in the workflow.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; I can't think of a circumstance where we would need to change this.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/blob/master/HFSim_mem/combine_seis.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  combine_seis.py&lt;br /&gt;
    cat&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Usage: ./combine_seis.py &amp;lt;seis 0&amp;gt; &amp;lt;seis 1&amp;gt; ... &amp;lt;seis N&amp;gt; &amp;lt;output seis name&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes a few seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; High-frequency seismograms, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; A single high-frequency seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== LF Site Response ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code performs site response modifications to the CyberShake low-frequency seismograms.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The LF Site Response code takes a low-frequency seismogram and some velocity parameters, and outputs a seismogram with site response applied.  In Study 15.12, this was a necessary step before combining the low and high frequency seismograms together.  Since Vs30 is required, if it's not passed as a command-line argument, then UCVM is called to determine it.&lt;br /&gt;
&lt;br /&gt;
The reason we calculate site response for the low-frequency deterministic seismograms is that we want both the low- and high-frequency results to reflect the same site-response condition.  For the HF, we used Vs30 directly for the site-response adjustment, but for the LF we had to use an adjusted VsX value, since at a grid spacing of 100 m Vs30 is not meaningful.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#We change the site response algorithm.&lt;br /&gt;
#We decide to use different velocity parameters for setting site response.&lt;br /&gt;
#The format of the seismogram files changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/tree/master/LF_Site_Response&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; wrapper by Scott Callaghan, site response by Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]], [[CyberShake_Code_Base#UCVM | UCVM]], [[CyberShake_Code_Base#RupGen-api-v3.3.1 | rupture generator]], [[CyberShake_Code_Base#libmemcached | libmemcached]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  lf_site_response&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Edit the makefile to point to RupGen, libmemcached, and Getpar, then run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
Sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./lf_site_response&lt;br /&gt;
seis_in=Seismogram_OSI_3923_263_3.grm seis_out=263/Seismogram_OSI_3923_263_3_site_response.grm&lt;br /&gt;
slat=34.6145 slon=-118.7235&lt;br /&gt;
module=cb2014&lt;br /&gt;
vs30=359.1 vref=344.7&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes less than a second.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; A low-frequency deterministic seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; A low-frequency deterministic seismogram with site response, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== Merge IM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code combines low-frequency deterministic and high-frequency stochastic seismograms, then processes them to obtain intensity measures.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The Merge IM code takes an LF and HF seismogram and performs the following processing:&lt;br /&gt;
#A high-pass filter is applied to the HF seismogram.&lt;br /&gt;
#The LF seismogram is resampled to the same dt as the HF seismogram.&lt;br /&gt;
#The two seismograms are combined into a single broadband (BB) seismogram.&lt;br /&gt;
#The PSA code is run on the resulting seismogram.&lt;br /&gt;
#If desired, the RotD and duration codes are also run on the seismogram.&lt;br /&gt;
&lt;br /&gt;
Merge IM works on a seismogram file at the rupture level, so it assumes that the input files contain multiple rupture variations.&lt;br /&gt;
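The merge steps can be sketched as follows.  The one-pole high-pass filter and linear resampling here are deliberate simplifications standing in for the actual filtering in merge_psa; only the overall flow (high-pass the HF, resample the LF to the HF dt, sum) follows the description above.&lt;br /&gt;

```python
import math

def highpass(x, dt, fc):
    # simple one-pole high-pass filter (illustrative, not the merge_psa filter)
    rc = 1.0 / (2.0 * math.pi * fc)
    alpha = rc / (rc + dt)
    y = [x[0]]
    for i in range(1, len(x)):
        y.append(alpha * (y[-1] + x[i] - x[i - 1]))
    return y

def resample_linear(x, dt_in, dt_out, n_out):
    # linearly interpolate x (sampled at dt_in) onto n_out samples at dt_out
    y = []
    for i in range(n_out):
        t = i * dt_out / dt_in
        j = min(int(t), len(x) - 2)
        frac = t - j
        y.append(x[j] * (1.0 - frac) + x[j + 1] * frac)
    return y

def merge(lf, dt_lf, hf, dt_hf, fc=1.0):
    # the first three steps above: filter the HF, resample the LF, then sum
    lf_resampled = resample_linear(lf, dt_lf, dt_hf, len(hf))
    hf_filtered = highpass(hf, dt_hf, fc)
    return [a + b for a, b in zip(lf_resampled, hf_filtered)]

bb = merge([1.0] * 11, 0.1, [0.0] * 40, 0.025)
print(len(bb))  # number of broadband samples
```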
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#We change the filter-and-combine algorithm.&lt;br /&gt;
#We decide to modify the post-processing and IM types we want to capture.&lt;br /&gt;
#The format of the seismogram files changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/tree/master/MergeIM&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  merge_psa&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Edit the makefile to point to Getpar, then run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
Sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./merge_psa&lt;br /&gt;
lf_seis=182/Seismogram_OSI_3923_182_23_site_response.grm hf_seis=182/Seismogram_OSI_4331_182_23_hf.grm seis_out=182/Seismogram_OSI_4331_182_23_bb.grm&lt;br /&gt;
freq=1.0 comps=2 num_rup_vars=16&lt;br /&gt;
simulation_out_pointsX=2 simulation_out_pointsY=1&lt;br /&gt;
simulation_out_timesamples=12000 simulation_out_timeskip=0.025&lt;br /&gt;
surfseis_rspectra_seismogram_units=cmpersec surfseis_rspectra_output_units=cmpersec2&lt;br /&gt;
surfseis_rspectra_output_type=aa surfseis_rspectra_period=all&lt;br /&gt;
surfseis_rspectra_apply_filter_highHZ=20.0 surfseis_rspectra_apply_byteswap=no&lt;br /&gt;
out=182/PeakVals_OSI_4331_182_23_bb.bsa&lt;br /&gt;
run_rotd=1 rotd_out=182/RotD_OSI_4331_182_23_bb.rotd&lt;br /&gt;
run_duration=1 duration_out=182/Duration_OSI_4331_182_23_bb.dur&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes 5-30 seconds, depending on the number of rupture variations in the files.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; LF deterministic seismogram and HF stochastic seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; BB seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]]; also [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], and [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
== File types ==&lt;br /&gt;
&lt;br /&gt;
=== Modelbox ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a description of the simulation box at the surface.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.modelbox&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;site name&amp;gt;&lt;br /&gt;
APPROXIMATE CENTROID:&lt;br /&gt;
  clon= &amp;lt;centroid lon&amp;gt; clat =&amp;lt;centroid lat&amp;gt;&lt;br /&gt;
MODEL PARAMETERS:&lt;br /&gt;
  mlon= &amp;lt;model lon&amp;gt; mlat =&amp;lt;model lat&amp;gt; mrot=&amp;lt;model rot, default -55&amp;gt; xlen= &amp;lt;x-length in km&amp;gt; ylen= &amp;lt;y-length in km&amp;gt;&lt;br /&gt;
MODEL CORNERS:&lt;br /&gt;
  &amp;lt;lon 1&amp;gt; &amp;lt;lat 1&amp;gt; (x= 0.000 y= 0.000)&lt;br /&gt;
  &amp;lt;lon 2&amp;gt; &amp;lt;lat 2&amp;gt; (x= &amp;lt;max x&amp;gt; y= 0.000)&lt;br /&gt;
  &amp;lt;lon 3&amp;gt; &amp;lt;lat 3&amp;gt; (x= &amp;lt;max x&amp;gt; y= &amp;lt;max y&amp;gt;)&lt;br /&gt;
  &amp;lt;lon 4&amp;gt; &amp;lt;lat 4&amp;gt; (x= 0.000 y= &amp;lt;max y&amp;gt;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT, PostAWP&lt;br /&gt;
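As an illustration, the key=value pairs above can be pulled out with a short script.  This is a minimal sketch only: the sample values are hypothetical, and the regex's tolerance for varying spacing around '=' is an assumption based on the layout shown above.&lt;br /&gt;

```python
import re

def parse_modelbox(text):
    """Extract key=value pairs from a modelbox file.

    Sketch only: assumes the layout shown above, where spacing around
    '=' can vary (e.g. 'clon= -118.286' vs. 'clat =34.019')."""
    return {key: float(val)
            for key, val in re.findall(r'(\w+)\s*=\s*(-?\d+\.?\d*)', text)}

# Hypothetical sample in the format above
sample = """USC
APPROXIMATE CENTROID:
  clon= -118.286 clat =34.019
MODEL PARAMETERS:
  mlon= -118.286 mlat =34.019 mrot=-55.0 xlen= 180.0 ylen= 135.0
"""
box = parse_modelbox(sample)
```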
&lt;br /&gt;
=== Gridfile ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the three dimensions, and gridspacing in each dimension, of the volume.&lt;br /&gt;
&lt;br /&gt;
Filename convention: gridfile_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
xlen=&amp;lt;x-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;x-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
ylen=&amp;lt;y-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;y-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
zlen=&amp;lt;z-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;z-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gridout ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the km offsets for each grid index, in X, Y, and Z, from the upper southwest corner.&lt;br /&gt;
&lt;br /&gt;
Filename convention: gridout_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
xlen=&amp;lt;x-length in km&amp;gt;&lt;br /&gt;
nx=&amp;lt;number of gridpoints in X direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  2   &amp;lt;2*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  3   &amp;lt;3*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  nx-1 &amp;lt;(nx-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
ylen=&amp;lt;y-length in km&amp;gt;&lt;br /&gt;
ny=&amp;lt;number of gridpoints in Y direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  ny-1 &amp;lt;(ny-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
zlen=&amp;lt;z-length in km&amp;gt;&lt;br /&gt;
nz=&amp;lt;number of gridpoints in Z direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  nz-1 &amp;lt;(nz-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: UCVM, smoothing, PreSGT, PreAWP&lt;br /&gt;
&lt;br /&gt;
=== Params ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Succinctly specify the parameters for the CyberShake volume.  Similar information to the modelbox file, but in a different format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_params_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Model origin coordinates:&lt;br /&gt;
 lon= &amp;lt;model lon&amp;gt; lat=   &amp;lt;model lat&amp;gt; rotate=  &amp;lt;model rotation, default -55&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model origin shift (cartesian vs. geographic):&lt;br /&gt;
 xshift(km)=   &amp;lt;x shift, usually half the x-length minus 1 grid spacing&amp;gt; yshift(km)=   &amp;lt;y-shift, usually half the y-length minus 1 grid spacing&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model corners:&lt;br /&gt;
 c1= &amp;lt;nw lon&amp;gt;   &amp;lt;nw lat&amp;gt;&lt;br /&gt;
 c2= &amp;lt;ne lon&amp;gt;   &amp;lt;ne lat&amp;gt;&lt;br /&gt;
 c3= &amp;lt;se lon&amp;gt;   &amp;lt;se lat&amp;gt;&lt;br /&gt;
 c4= &amp;lt;sw lon&amp;gt;   &amp;lt;sw lat&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model Dimensions:&lt;br /&gt;
 xlen=   &amp;lt;x-length&amp;gt; km&lt;br /&gt;
 ylen=   &amp;lt;y-length&amp;gt; km&lt;br /&gt;
 zlen=   &amp;lt;z-length&amp;gt; km&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: &lt;br /&gt;
&lt;br /&gt;
=== Coord ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the mapping of latitude and longitude to X and Y offsets, for each point on the surface.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_coords_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 2 0&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 1&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 1&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: UCVM, smoothing, PreSGT&lt;br /&gt;
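Since X varies fastest in the listing above, the surface point (x, y) sits on a predictable line of the coord file.  A minimal sketch of that index arithmetic:&lt;br /&gt;

```python
def coord_line_index(x, y, nx):
    """0-based line number of surface point (x, y) in a model_coords_GC
    file.  X varies fastest in the listing above, so each Y row spans
    nx lines (nx comes from the gridout file)."""
    return y * nx + x
```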
&lt;br /&gt;
=== Bounds ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the mapping of latitude and longitude to X and Y offsets, but only for the points along the boundary.  A subset of the coord file.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_bounds_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 2 0&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 1&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 1&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 2&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 2&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 &amp;lt;ny-1&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by:&lt;br /&gt;
&lt;br /&gt;
=== Velocity files ===&lt;br /&gt;
&lt;br /&gt;
==== RWG format ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Input velocity files for the RWG wave propagation code, emod3d.&lt;br /&gt;
&lt;br /&gt;
Filename convention: v_sgt-&amp;lt;site&amp;gt;.&amp;lt;p, s, or d&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: 3 files, one each for Vp (*.p), Vs (*.s), and rho (*.d).  Each is binary, with 4-byte floats, in fast X, Z (surface-&amp;gt;down), slow Y order.&lt;br /&gt;
&lt;br /&gt;
Generated by: UCVM&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP&lt;br /&gt;
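The fast X, Z, slow Y ordering means the byte offset of any gridpoint in one of these files can be computed directly.  A minimal sketch (nx and nz are the gridpoint counts from the gridout file):&lt;br /&gt;

```python
def rwg_velocity_offset(x, y, z, nx, nz):
    """Byte offset of the 4-byte float for gridpoint (x, y, z) in an RWG
    velocity file: X fastest, then Z (surface->down), Y slowest, per the
    format description above."""
    return ((y * nz + z) * nx + x) * 4
```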
&lt;br /&gt;
==== AWP format ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Input velocity file for the AWP-ODC wave propagation code.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp.&amp;lt;site&amp;gt;.media&lt;br /&gt;
&lt;br /&gt;
Format: Binary, with 4-byte floats, in fast Y, X, slow Z (surface down) order.&lt;br /&gt;
&lt;br /&gt;
Generated by: UCVM&lt;br /&gt;
&lt;br /&gt;
Used by: Smoothing, PreAWP, PostAWP&lt;br /&gt;
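For the AWP media file, the same idea applies with the fast Y, X, slow Z ordering described above.  This sketch computes the 0-based point index; multiply by the per-point record size in bytes to seek (the record size itself is not specified here):&lt;br /&gt;

```python
def awp_point_index(x, y, z, nx, ny):
    """0-based index of gridpoint (x, y, z) in AWP's fast Y, then X,
    slow Z ordering, per the format description above."""
    return (z * nx + x) * ny + y
```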
&lt;br /&gt;
=== Fdloc ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Coordinates of the site, as X and Y grid indices; these are also the coordinates where the SGT impulse is placed.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.fdloc&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;X grid index of site&amp;gt; &amp;lt;Y grid index of site&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== Faultlist ===&lt;br /&gt;
&lt;br /&gt;
Purpose: List of paths to all the rupture geometry files for all ruptures which are within the cutoff for this site. Used to produce a list of points to save SGTs for.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.faultlist&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;path to rupture file&amp;gt; nheader=&amp;lt;number of header lines, usually 6&amp;gt; latfirst=&amp;lt;1, to signify that latitude comes first in the rupture files&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT&lt;br /&gt;
&lt;br /&gt;
=== Radiusfile ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Describe the adaptive mesh for which SGTs will be saved.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.radiusfile&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of gradations in X and Y&amp;gt;&lt;br /&gt;
&amp;lt;radius 1&amp;gt; &amp;lt;radius 2&amp;gt; &amp;lt;radius 3&amp;gt; &amp;lt;radius 4&amp;gt;&lt;br /&gt;
&amp;lt;decimation less than radius 1&amp;gt; &amp;lt;decimation between radius 1 and 2&amp;gt; &amp;lt;between 2 and 3&amp;gt; &amp;lt;between 3 and 4&amp;gt;&lt;br /&gt;
&amp;lt;number of gradations in Z&amp;gt;&lt;br /&gt;
&amp;lt;depth 1&amp;gt; &amp;lt;depth 2&amp;gt; &amp;lt;depth 3&amp;gt; &amp;lt;depth 4&amp;gt;&lt;br /&gt;
&amp;lt;decimation less than depth 1&amp;gt; &amp;lt;decimation between depth 1 and 2&amp;gt; &amp;lt;between 2 and 3&amp;gt; &amp;lt;between 3 and 4&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT&lt;br /&gt;
&lt;br /&gt;
=== SGT Coordinate files ===&lt;br /&gt;
&lt;br /&gt;
There are two formats for the list of points to save SGTs for: one for Rob's codes and one for AWP-ODC.  As with other coordinate transformations between the two systems, to convert X and Y offsets from RWG to AWP you swap X and Y and add 1 to each, since RWG is 0-indexed and AWP is 1-indexed.&lt;br /&gt;
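The conversion rule can be captured in a pair of trivial helpers (sketch):&lt;br /&gt;

```python
def rwg_to_awp(x_rwg, y_rwg):
    """Convert 0-indexed RWG (x, y) offsets to 1-indexed AWP coordinates:
    swap X and Y and add 1 to each, per the rule above."""
    return (y_rwg + 1, x_rwg + 1)

def awp_to_rwg(x_awp, y_awp):
    """Inverse conversion, back to 0-indexed RWG offsets."""
    return (y_awp - 1, x_awp - 1)
```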
&lt;br /&gt;
==== SgtCoords ====&lt;br /&gt;
&lt;br /&gt;
Purpose: List of all the points to save SGTs for.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.cordfile&lt;br /&gt;
&lt;br /&gt;
Format: Z changes fastest, then Y, then X slowest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# geoproj= &amp;lt;projection; we usually use 1 for great circle&amp;gt;&lt;br /&gt;
# modellon= &amp;lt;model lon&amp;gt; modellat= &amp;lt;model lat&amp;gt; modelrot= &amp;lt;model rot, usually -55&amp;gt;&lt;br /&gt;
# xlen= &amp;lt;x-length&amp;gt; ylen= &amp;lt;y-length&amp;gt;&lt;br /&gt;
#&lt;br /&gt;
&amp;lt;total number of points&amp;gt;&lt;br /&gt;
&amp;lt;X index&amp;gt; &amp;lt;Y index&amp;gt; &amp;lt;Z index&amp;gt; &amp;lt;Single long long capturing the index, packed as X*10^12 + Y*10^6 + Z (6 digits per component)&amp;gt; &amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;depth in km&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT, PreAWP, PostAWP&lt;br /&gt;
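The packed long long in the cordfile uses 6 decimal digits per component, matching the indx field of the sgtindex struct described in the SGT header file section.  A sketch of packing and unpacking:&lt;br /&gt;

```python
def pack_sgt_index(x, y, z):
    """Pack (x, y, z) grid indices into the single long long used in the
    cordfile: indx = x*10^12 + y*10^6 + z (6 decimal digits each)."""
    return x * 10**12 + y * 10**6 + z

def unpack_sgt_index(indx):
    """Recover (x, y, z) grid indices from a packed index."""
    return indx // 10**12, (indx // 10**6) % 10**6, indx % 10**6
```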
&lt;br /&gt;
==== AWP cordfile ====&lt;br /&gt;
&lt;br /&gt;
Purpose: List of SGT points to save in a format usable by AWP-ODC-SGT.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp.&amp;lt;site&amp;gt;.cordfile&lt;br /&gt;
&lt;br /&gt;
Format: Remember that, relative to RWG, X and Y are swapped and each index has 1 added.  The points are sorted by Y, then X, then Z, so Y changes slowest and Z changes fastest.  This ordering is flipped from the RWG cordfile because the X and Y components are swapped.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of points&amp;gt;&lt;br /&gt;
&amp;lt;X coordinate&amp;gt; &amp;lt;Y coordinate&amp;gt; &amp;lt;Z coordinate&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU&lt;br /&gt;
&lt;br /&gt;
=== Impulse source descriptions ===&lt;br /&gt;
&lt;br /&gt;
We generate the initial source description for CyberShake, with the required dt, nt, and filtering, using gen_source, in https://github.com/SCECcode/cybershake-core/SimSgt_V3.0.3/src/ (run 'make get_source').  gen_source hard-codes its parameters; you should only change 'nt', 'dt', and 'flo'.  We have been setting flo to twice the CyberShake maximum frequency, to reduce filtering effects at the frequencies of interest.  gen_source wraps Rob Graves's source generator, which we use for consistency.&lt;br /&gt;
&lt;br /&gt;
To generate a source for a component, run&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$&amp;gt;./gen_source xsrc=0 ysrc=0 zsrc=0 &amp;lt;fxsrc|fysrc|fzsrc&amp;gt;=1 moment=1e20&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once this RWG source is generated, we then use AWP-GPU-SGT/utils/data/format_source.py to reprocess the RWG source into an AWP-source friendly format.  This involves reformatting the file and multiplying all values by 1e15 for unit conversion.  Different files must be produced for X and Y coordinates, since in the AWP format different columns are used for different components.&lt;br /&gt;
&lt;br /&gt;
Finally, AWP-GPU-SGT/utils/build_src.py takes the correct AWP-friendly source (nt and dt) for a run and adds the impulse location coordinates, producing a complete AWP format source description.&lt;br /&gt;
&lt;br /&gt;
==== RWG source ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Source description for the SGT impulse.&lt;br /&gt;
&lt;br /&gt;
Filename convention: source_cos0.10_&amp;lt;frequency&amp;gt;hz&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source cos&lt;br /&gt;
&amp;lt;nt&amp;gt; &amp;lt;dt&amp;gt; 0 0 0.0 0.0 0.0 0.0&lt;br /&gt;
&amp;lt;value at ts0&amp;gt; &amp;lt;value at ts1&amp;gt; &amp;lt;value at ts2&amp;gt; &amp;lt;value at ts3&amp;gt; &amp;lt;value at ts4&amp;gt; &amp;lt;value at ts5&amp;gt;&lt;br /&gt;
&amp;lt;value at ts6&amp;gt; &amp;lt;value at ts7&amp;gt; &amp;lt;value at ts8&amp;gt; &amp;lt;value at ts9&amp;gt; &amp;lt;value at ts10&amp;gt; &amp;lt;value at ts11&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: gen_source (see above)&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP&lt;br /&gt;
&lt;br /&gt;
==== AWP source ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Source description which can be used by AWP-ODC.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_src&lt;br /&gt;
&lt;br /&gt;
Format: Note that X and Y coordinates are swapped between RWG and AWP format, because of how the box is defined.  Additionally, RWG is 0-indexed, and AWP is 1-indexed, and the RWG values must be multiplied by 1e15 for unit conversion.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;X index of source, same as site X index&amp;gt; &amp;lt;Y index of source, same as site Y index&amp;gt;&lt;br /&gt;
&amp;lt;XX impulse at ts0&amp;gt; &amp;lt;YY at ts0&amp;gt; &amp;lt;ZZ at ts0&amp;gt; &amp;lt;XY at ts0&amp;gt; &amp;lt;XZ at ts0&amp;gt; &amp;lt;YZ at ts0&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU&lt;br /&gt;
&lt;br /&gt;
=== IN3D ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Input file for AWP-ODC.&lt;br /&gt;
&lt;br /&gt;
Filename convention: IN3D.&amp;lt;site&amp;gt;.&amp;lt;x or y&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: Specified [https://scec.usc.edu/it/AWP-ODC-SGT#IN3D here (login required)].&lt;br /&gt;
&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== AWP SGT ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT file, created by AWP-ODC-SGT.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp-strain-&amp;lt;site&amp;gt;-f&amp;lt;x or y&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: binary, 4-byte floats.  Points are in the same order as in the AWP SGT coordinate file, which is fast Z, X, Y.  For each point, the SGT components are stored in XX, YY, ZZ, XY, XZ, YZ order, with time fastest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YY component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 2nd z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (2nd x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 2nd y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, last y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: AWP-ODC-SGT CPU and GPU&lt;br /&gt;
&lt;br /&gt;
Used by: PostAWP, NanCheck &lt;br /&gt;
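Given the layout above (6 components per point, nt 4-byte floats per component, time fastest, points in AWP cordfile order), the byte offset of any sample follows directly.  A minimal sketch:&lt;br /&gt;

```python
# Component order within each point, per the format description above
COMPONENTS = ("XX", "YY", "ZZ", "XY", "XZ", "YZ")

def awp_sgt_offset(point_idx, component, t, nt):
    """Byte offset of timestep t of one SGT component for the point_idx-th
    point (0-based, in AWP cordfile order): 6 components per point, each a
    run of nt 4-byte floats, time fastest."""
    comp = COMPONENTS.index(component)
    return ((point_idx * 6 + comp) * nt + t) * 4
```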
&lt;br /&gt;
=== RWG SGT ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT file, created by PostAWP for use in post-processing.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_&amp;lt;run id&amp;gt;.sgt&lt;br /&gt;
&lt;br /&gt;
Format: binary, 4-byte floats.  Points are in the same order as in the RWG coordinate file, which is fast Z, Y, X.  For each point, the SGT components are stored in XX, YY, ZZ, XY, XZ, YZ order, with time fastest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YY component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 2nd z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (2nd x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 2nd y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, last y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PostAWP&lt;br /&gt;
&lt;br /&gt;
Used by: CheckSgt, DirectSynth&lt;br /&gt;
&lt;br /&gt;
=== SGT header file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT header information, used to parse and understand SGT files.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_&amp;lt;run id&amp;gt;.sgthead&lt;br /&gt;
&lt;br /&gt;
Format: binary.  It consists of three sections:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtmaster structure, described below in C.  Its information can be used to set up data structures to read the rest of the SGTs.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtmaster&lt;br /&gt;
   {&lt;br /&gt;
   int geoproj;     /* =0: RWG local flat earth; =1: RWG great circle arcs; =2: UTM */&lt;br /&gt;
   float modellon;  /* longitude of geographic origin */&lt;br /&gt;
   float modellat;  /* latitude of geographic origin */&lt;br /&gt;
   float modelrot;  /* rotation of y-axis from south (clockwise positive)   */&lt;br /&gt;
   float xshift;    /* xshift of cartesian origin from geographic origin */&lt;br /&gt;
   float yshift;    /* yshift of cartesian origin from geographic origin */&lt;br /&gt;
   int globnp;      /* total number of SGT locations (entire model) */&lt;br /&gt;
   int localnp;     /* local number of SGT locations (this file only) */&lt;br /&gt;
   int nt;          /* number of time points                                */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtindex structures, described below in C.  There is one of these for each point in the SGTs, and they're used to determine the X/Y/Z indices of all the SGT points.  Note that the current way of packing the X,Y,Z coordinates into the long allows for 6 digits (so maximum 1M grid points) for each component.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtindex   /* indices for all 'globnp' SGT locations */&lt;br /&gt;
   {&lt;br /&gt;
   long long indx; /* indx= xsgt*1000000000000 + ysgt*1000000 + zsgt */&lt;br /&gt;
   int xsgt;     /* x grid location */&lt;br /&gt;
   int ysgt;     /* y grid location */&lt;br /&gt;
   int zsgt;     /* z grid location */&lt;br /&gt;
   float h;         /* grid spacing                                         */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtheader structures, described below in C.  There is one of these for each point in the SGTs.  They're used when we perform reciprocity.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtheader&lt;br /&gt;
   {&lt;br /&gt;
   long long indx;  /* index of this SGT */&lt;br /&gt;
   int geoproj;     /* =0: RWG local flat earth; =1: RWG great circle arcs; =2: UTM */&lt;br /&gt;
   float modellon;  /* longitude of geographic origin */&lt;br /&gt;
   float modellat;  /* latitude of geographic origin */&lt;br /&gt;
   float modelrot;  /* rotation of y-axis from south (clockwise positive)   */&lt;br /&gt;
   float xshift;    /* xshift of cartesian origin from geographic origin */&lt;br /&gt;
   float yshift;    /* yshift of cartesian origin from geographic origin */&lt;br /&gt;
   int nt;          /* number of time points                                */&lt;br /&gt;
   float xazim;     /* azimuth of X-axis in FD model (clockwise from north) */&lt;br /&gt;
   float dt;        /* time sampling                                        */&lt;br /&gt;
   float tst;       /* start time of 1st point in GF                        */&lt;br /&gt;
   float h;         /* grid spacing                                         */&lt;br /&gt;
   float src_lat;   /* site latitude */&lt;br /&gt;
   float src_lon;   /* site longitude */&lt;br /&gt;
   float src_dep;   /* site depth */&lt;br /&gt;
   int xsrc;        /* x grid location for source (station in recip. exp.)  */&lt;br /&gt;
   int ysrc;        /* y grid location for source (station in recip. exp.)  */&lt;br /&gt;
   int zsrc;        /* z grid location for source (station in recip. exp.)  */&lt;br /&gt;
   float sgt_lat;   /* SGT location latitude */&lt;br /&gt;
   float sgt_lon;   /* SGT location longitude */&lt;br /&gt;
   float sgt_dep;   /* SGT location depth */&lt;br /&gt;
   int xsgt;        /* x grid location for output (source in recip. exp.)   */&lt;br /&gt;
   int ysgt;        /* y grid location for output (source in recip. exp.)   */&lt;br /&gt;
   int zsgt;        /* z grid location for output (source in recip. exp.)   */&lt;br /&gt;
   float cdist;     /* straight-line distance btw site and SGT location */&lt;br /&gt;
   float lam;       /* lambda [in dyne/(cm*cm)] at output point             */&lt;br /&gt;
   float mu;        /* rigidity [in dyne/(cm*cm)] at output point           */&lt;br /&gt;
   float rho;       /* density [in gm/(cm*cm*cm)] at output point           */&lt;br /&gt;
   float xmom;      /* moment strength of x-oriented force in this run      */&lt;br /&gt;
   float ymom;      /* moment strength of y-oriented force in this run      */&lt;br /&gt;
   float zmom;      /* moment strength of z-oriented force in this run      */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Overall, then, the format for the file is:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;sgtmaster&amp;gt;&lt;br /&gt;
&amp;lt;sgtindex for point 1&amp;gt;&lt;br /&gt;
&amp;lt;sgtindex for point 2&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;sgtindex for point globnp&amp;gt;&lt;br /&gt;
&amp;lt;sgtheader for point 1&amp;gt;&lt;br /&gt;
&amp;lt;sgtheader for point 2&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;sgtheader for point globnp&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PostAWP&lt;br /&gt;
&lt;br /&gt;
Used by: DirectSynth&lt;br /&gt;
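The sgtmaster struct above maps onto a 36-byte record that can be read with Python's struct module.  This is a sketch; little-endian byte order and tight packing are assumptions about the machines that write these files, so adjust the '&lt;' prefix if needed:&lt;br /&gt;

```python
import struct

# Field layout of struct sgtmaster above: int, 5 floats, 3 ints.
# '<' assumes little-endian, no padding (an assumption, see lead-in).
SGTMASTER_FMT = "<i5f3i"
SGTMASTER_SIZE = struct.calcsize(SGTMASTER_FMT)  # 36 bytes

def read_sgtmaster(f):
    """Read the sgtmaster record from the start of a .sgthead file and
    return its fields by name."""
    fields = struct.unpack(SGTMASTER_FMT, f.read(SGTMASTER_SIZE))
    keys = ("geoproj", "modellon", "modellat", "modelrot",
            "xshift", "yshift", "globnp", "localnp", "nt")
    return dict(zip(keys, fields))
```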
&lt;br /&gt;
=== Velocity Info file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains the 3D velocity information needed for stochastic jobs&lt;br /&gt;
&lt;br /&gt;
Filename convention: velocity_info_&amp;lt;site&amp;gt;.txt&lt;br /&gt;
&lt;br /&gt;
Format: Text format, three lines:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Vs30 = &amp;lt;Vs30 value&amp;gt;&lt;br /&gt;
Vs500 = &amp;lt;Vs500 value&amp;gt;&lt;br /&gt;
VsD500 = &amp;lt;VsD500 value&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: Velocity Info job&lt;br /&gt;
&lt;br /&gt;
Used by: Sub Stoch DAX generator, to add these values as command-line arguments to HF Synth and LF Site Response jobs.&lt;br /&gt;
&lt;br /&gt;
=== rupture list file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a list of all the ruptures for which low-frequency seismograms should be synthesized.  This file is used to construct the tasks in DirectSynth.  The number of rows, columns, and magnitude are included because DirectSynth uses this information to determine how much memory the tasks will use.  This file is constructed at abstract workflow creation time.&lt;br /&gt;
&lt;br /&gt;
Filename convention: rupture_file_list_&amp;lt;site&amp;gt;_&amp;lt;run_id&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: Text format, with the number of ruptures and then 1 line per rupture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of ruptures N&amp;gt;&lt;br /&gt;
&amp;lt;rupture geometry filename 1&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
&amp;lt;rupture geometry filename 2&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;rupture geometry filename N&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Used by: DirectSynth&lt;br /&gt;
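A minimal sketch of parsing this file into records (the filenames in the test example are hypothetical):&lt;br /&gt;

```python
def parse_rupture_list(text):
    """Parse a rupture_file_list file per the format above: a count line,
    then one line per rupture with the geometry filename, hypocenter and
    slip counts, geometry rows and columns, and magnitude."""
    lines = text.strip().splitlines()
    n = int(lines[0])
    ruptures = []
    for line in lines[1:n + 1]:
        fname, nhypo, nslip, nrows, ncols, mag = line.split()
        ruptures.append({
            "file": fname,
            "hypocenters": int(nhypo),
            "slips": int(nslip),
            "rows": int(nrows),
            "cols": int(ncols),
            "magnitude": float(mag),
        })
    return ruptures
```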
&lt;br /&gt;
=== rupture variation info file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  This file provides the rvfrac (rupture velocity as a fraction of shear wave velocity) and random seed for each rupture variation.  It is constructed at abstract workflow creation time, using values from the database.  It is highly recommended that these values be stored somewhere for reproducibility, and the same values should be used for all sites so that the rupture variations match across sites.&lt;br /&gt;
&lt;br /&gt;
Filename convention: rvfrac_seed_values_&amp;lt;site&amp;gt;_&amp;lt;run_id&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:  Text format, with the number of rupture variations and then 1 line per rupture variation:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of rupture variations N&amp;gt;&lt;br /&gt;
&amp;lt;source ID 1&amp;gt; &amp;lt;rupture ID 1&amp;gt; &amp;lt;rupture variation ID 1&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
&amp;lt;source ID 2&amp;gt; &amp;lt;rupture ID 2&amp;gt; &amp;lt;rupture variation ID 2&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;source ID N&amp;gt; &amp;lt;rupture ID N&amp;gt; &amp;lt;rupture variation ID N&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Used by: DirectSynth, when linked with RupGen-v5.5.2 or later&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== BBP 1D Velocity file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains 1D velocity profile information&lt;br /&gt;
&lt;br /&gt;
Filename convention: The only one currently in use in CyberShake is /home/scec-02/cybershk/runs/genslip_nr_generic1d-gp01.vmod .&lt;br /&gt;
&lt;br /&gt;
Format: Text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of thickness layers L&amp;gt;&lt;br /&gt;
&amp;lt;layer 1 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
&amp;lt;layer 2 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;layer L thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that the last layer has thickness 999.0.&lt;br /&gt;
&lt;br /&gt;
Generated by: Rob Graves&lt;br /&gt;
&lt;br /&gt;
Used by: Local VM job&lt;br /&gt;
&lt;br /&gt;
=== Local VM file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains 1D velocity profile information for use with stochastic codes&lt;br /&gt;
&lt;br /&gt;
Filename convention: The only one currently in use in CyberShake is LA_Basin_BBP_14.3.0.local .&lt;br /&gt;
&lt;br /&gt;
Format: Text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of thickness layers L&amp;gt;&lt;br /&gt;
&amp;lt;layer 1 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
&amp;lt;layer 2 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;layer L thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that the last layer has thickness '0.0', indicating it has no bottom.&lt;br /&gt;
&lt;br /&gt;
Generated by: Local VM Job&lt;br /&gt;
&lt;br /&gt;
Used by: &lt;br /&gt;
&lt;br /&gt;
=== Missing variations file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Lists the variations which the Check DB stage has found are missing.&lt;br /&gt;
&lt;br /&gt;
Filename convention: DB_Check_Out_&amp;lt;PSA or RotD or Duration&amp;gt;_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: For each source and rupture pair with missing variations, the following record is output in text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;source ID&amp;gt; &amp;lt;rupture ID&amp;gt; &amp;lt;number N of missing rupture variations&amp;gt;&lt;br /&gt;
&amp;lt;ID of first missing rupture variation&amp;gt;&lt;br /&gt;
&amp;lt;ID of second missing rupture variation&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;ID of Nth missing rupture variation&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Originally, a file in this format could be fed directly back into the DAX generator, but that capability has not been used for many years and may no longer be functional.&lt;br /&gt;
&lt;br /&gt;
Generated by: Check DB Site&lt;br /&gt;
&lt;br /&gt;
Used by: none&lt;br /&gt;
&lt;br /&gt;
=== DB Report file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Provides PSA data for a run in a text format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_report_&amp;lt;date&amp;gt;.txt&lt;br /&gt;
&lt;br /&gt;
Format: It's a text file with the following header:&lt;br /&gt;
 Site_Name       ERF_ID  Source_ID       Rupture_ID      Rup_Var_ID      Rup_Var_Scenario_ID     Mag     Prob    Grid_Spacing    Num_Rows        Num_Columns     Period  Component       SA&lt;br /&gt;
The file is sorted with Rup_Var_ID varying fastest, then Rupture_ID, Source_ID, and Period, with Component varying slowest.&lt;br /&gt;
&lt;br /&gt;
Generated by: DB Report&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
=== Hazard Curve ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a hazard curve, either in text, PNG, or PDF format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_Run&amp;lt;run id&amp;gt;_&amp;lt;IM type&amp;gt;_&amp;lt;period&amp;gt;sec_&amp;lt;IM component&amp;gt;_&amp;lt;date run completed&amp;gt;.&amp;lt;pdf|txt|png&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:  The PNG and PDF formats contain an image of the curve.  The PDF format also has an extended legend.  The TXT file contains a list of (X,Y) points which describe the curve.&lt;br /&gt;
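&lt;br /&gt;
Since the TXT file is just a list of (X,Y) points, downstream analysis typically interpolates between them.  A sketch of log-log interpolation to estimate the probability of exceedance at an arbitrary IM level (an illustrative helper, not part of the CyberShake code base):&lt;br /&gt;
```python
import bisect
import math

def poe_at_level(curve, x):
    # curve: list of (IM level, annual probability of exceedance) points,
    # with IM levels in ascending order.  Clamps outside the curve range.
    xs = [p[0] for p in curve]
    ys = [p[1] for p in curve]
    i = bisect.bisect_left(xs, x)
    if i == 0:
        return ys[0]
    if i == len(xs):
        return ys[-1]
    # log-log interpolation between the bracketing points
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    t = (math.log(x) - math.log(x0)) / (math.log(x1) - math.log(x0))
    return math.exp(math.log(y0) + t * (math.log(y1) - math.log(y0)))
```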
&lt;br /&gt;
Generated by: Curve Calc&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
=== Disaggregation file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains disaggregation results for a single run, in either text, PNG, or PDF format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_Run&amp;lt;run_id&amp;gt;_Disagg&amp;lt;POE|IM&amp;gt;_&amp;lt;disagg level&amp;gt;_&amp;lt;IM type&amp;gt;_&amp;lt;period&amp;gt;sec_&amp;lt;run date&amp;gt;.&amp;lt;txt|png|pdf&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: The PNG and PDF formats contain a plot of the disaggregation results, showing magnitude vs distance and color-coding based on epsilon.  The PDF and TXT formats contain additional information about individual source contributions, in the following format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;Summary data&lt;br /&gt;
Parameters used to create disaggregation&lt;br /&gt;
Disaggregation bin data:&lt;br /&gt;
Dist Mag &amp;lt;breakout by epsilon values&amp;gt;&lt;br /&gt;
&amp;lt;Breakdown of contribution by distance, magnitude, and epsilon range&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Disaggregation Source List Info:&lt;br /&gt;
Source# %Contribution TotExceedRate SourceName DistRup DistX DistSeis DistJB&lt;br /&gt;
&amp;lt;list of contributing sources, in decreasing order of % contribution&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: Disaggregation&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
== Dependencies ==&lt;br /&gt;
&lt;br /&gt;
The following are external software dependencies used by CyberShake software modules.&lt;br /&gt;
&lt;br /&gt;
=== Getpar ===&lt;br /&gt;
&lt;br /&gt;
Purpose: A library written in C which enables parsing of key-value command-line parameters, and enforcement of required parameters.  Rob Graves uses it in his codes.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Rob supplied a copy; it is in the CyberShake repository at https://github.com/SCECcode/cybershake-core/tree/main/Getpar .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Run 'make' in Getpar/getpar/src; this will make the library, libget.a, and install it in the lib directory, where CyberShake codes will expect it.&lt;br /&gt;
&lt;br /&gt;
=== MySQLdb ===&lt;br /&gt;
&lt;br /&gt;
This library has been deprecated in favor of pymysql.&lt;br /&gt;
&lt;br /&gt;
=== pymysql ===&lt;br /&gt;
&lt;br /&gt;
Purpose: MySQL bindings for Python.&lt;br /&gt;
&lt;br /&gt;
How to obtain: pip3 install pymysql .  Documentation is at https://pypi.org/project/PyMySQL/ .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: None; pip3 shouldn't have any issues.&lt;br /&gt;
&lt;br /&gt;
=== UCVM ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Supplies the query tools needed to populate a mesh with velocity information.&lt;br /&gt;
&lt;br /&gt;
How to obtain:  The most recent version of UCVM can be found at [[UCVM#Current_UCVM_Software_Releases|Current UCVM Software Releases]].  As of October 2017, we have only integrated the C version of UCVM into CyberShake.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Following the standard installation instructions for a cluster should work (running ./ucvm_setup.py).  You will want to install the CVM-S4, CVM-S4.26, CVM-S4.26.M01, CVM-H, CenCal, CCA-06, and CCA 1D velocity models for CyberShake.&lt;br /&gt;
&lt;br /&gt;
=== libcfu ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Provides a hash table library for a variety of CyberShake codes.&lt;br /&gt;
&lt;br /&gt;
How to obtain: https://sourceforge.net/projects/libcfu/ .  Documentation is at http://libcfu.sourceforge.net/libcfu.html .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Follow the instructions, and install into the utils directory.&lt;br /&gt;
&lt;br /&gt;
=== FFTW ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Library which provides FFTs.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Typically installed on supercomputers already, though you may need to load a module to activate it.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Doesn't need to be installed in user space.&lt;br /&gt;
&lt;br /&gt;
=== memcached ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Server library for running a memory caching system, using key-value pairs.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Download from https://memcached.org/ .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Follow instructions and install in utils directory.  It has a dependency on libevent, which you may have to install also.&lt;br /&gt;
&lt;br /&gt;
=== libmemcached ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Client library and tools for memcached server.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Download from http://libmemcached.org/libMemcached.html .  Install memcached first.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Sometimes installing can be a challenge, because the build can't find the memcached install.  You may have to pass the path to memcached as an argument to configure, or you may even need to edit the configure script and makefiles directly.  Install into the utils directory.&lt;br /&gt;
&lt;br /&gt;
=== RupGen-api ===&lt;br /&gt;
&lt;br /&gt;
Purpose: To generate rupture variations from a rupture geometry for a given hypocenter and slip using the Graves &amp;amp; Pitarka rupture generator.  The current version is 5.5.2.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Check out from https://github.com/SCECcode/cybershake-core/tree/main/RuptureCodes/RupGen-api-5.5.2 .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions:&lt;br /&gt;
&lt;br /&gt;
#This code is dependent on FFTW.  You may need to edit the makefile to point to the FFTW include files and libraries, since different clusters often use different environment variables to capture FFTW paths.&lt;br /&gt;
#If you want memcached support, edit the makefile in RuptureCodes/RupGen-api-5.5.2/src to uncomment lines 19-21 and edit line 19 to point to the libmemcached install directory.  You'll need to do the same to RuptureCodes/RupGen-api-5.5.2/src/GenRandV5.0/makefile, lines 35-37.&lt;br /&gt;
#You may need to edit CFLAGS in RuptureCodes/RupGen-api-5.5.2/src/GenRandV5.5.2/makefile (lines 21-23) to point to the FFTW path; whether or not this is needed depends on the particular system.&lt;br /&gt;
#Run 'make' in RupGen-api-5.5.2 to make the librupgen.a library.&lt;br /&gt;
&lt;br /&gt;
=== SCEC Broadband Platform ===&lt;br /&gt;
&lt;br /&gt;
Purpose: To generate high-frequency stochastic seismograms for broadband CyberShake runs.&lt;br /&gt;
&lt;br /&gt;
How to obtain:  Follow installation instructions at https://github.com/SCECcode/bbp .&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Code_Base&amp;diff=30589</id>
		<title>CyberShake Code Base</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Code_Base&amp;diff=30589"/>
		<updated>2025-11-26T19:46:28Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* DirectSynth */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page details all the pieces of code which make up the CyberShake code base, as of November 2017.  Note that this does not include the workflow middleware, or the workflow generators; that code is detailed at [[CyberShake Workflow Framework]].&lt;br /&gt;
&lt;br /&gt;
Conceptually, we can divide up the CyberShake codes into three categories:&lt;br /&gt;
&lt;br /&gt;
#Strain Green Tensor-related codes: These codes produce the input files needed to generate SGTs, actually calculate the SGTs, and do some reformatting and sanity checks on the results.&lt;br /&gt;
#Synthesis-related codes: These codes take the SGTs and perform seismogram synthesis and intensity measure calculations.&lt;br /&gt;
#Data product codes: These codes insert the results into the database, and use the database to generate a variety of output data products.&lt;br /&gt;
&lt;br /&gt;
Below is a description of each piece of software we use, organized by these categories.  For each piece of software, we include a description of where it is located, how to compile and use it, and what its inputs and outputs are.  At the end, we provide a description of input and output files and formats.&lt;br /&gt;
&lt;br /&gt;
== Code Installation ==&lt;br /&gt;
&lt;br /&gt;
Historically, we have selected a root directory for CyberShake, then created the subdirectories 'software' for all the code, 'ruptures' for the rupture files, 'logs' for log files, and 'utils' for workflow tools.  This is typically set up in unpurged storage space, so the installation is not at risk of being purged.  Each code listed below, along with the configuration file, should be checked out into the 'software' subdirectory.&lt;br /&gt;
&lt;br /&gt;
In terms of compilers, you should use the GNU compilers unless specifically directed otherwise.&lt;br /&gt;
&lt;br /&gt;
Most of the codes below follow the same layout: a main directory containing the wrapper scripts, a bin directory with binaries, and a src directory with code requiring compilation.&lt;br /&gt;
&lt;br /&gt;
If you are looking for compilation instructions, a general guide is available [[CyberShake compilation guide | here]].&lt;br /&gt;
&lt;br /&gt;
=== Configuration file ===&lt;br /&gt;
&lt;br /&gt;
Many CyberShake codes use a configuration file, which specifies the root directory for the CyberShake installation, the command used to start an MPI executable, paths to a tmp and scratch space (which can be the same), and the path to the CyberShake rupture directory.  We have done this instead of environment variables because it's more transparent and easier for multiple users.  Both the configuration file and the config.py script described below should be stored in the 'software' subdirectory.&lt;br /&gt;
&lt;br /&gt;
The configuration file is available at:&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/cybershake.cfg&lt;br /&gt;
Obviously, this file must be edited to be correct for the install.&lt;br /&gt;
&lt;br /&gt;
The keys that CyberShake currently expects to find are:&lt;br /&gt;
*CS_PATH = /path/to/CyberShake/software/directory&lt;br /&gt;
*SCRATCH_PATH = /path/to/shared/scratch&lt;br /&gt;
*TMP_PATH = /path/to/tmp (can be node-local, or shared with scratch)&lt;br /&gt;
*RUPTURE_PATH = /path/to/CyberShake/rupture/directory&lt;br /&gt;
*MPI_CMD = ibrun or aprun or mpiexec&lt;br /&gt;
*LOG_PATH = /path/to/CyberShake/logs/directory&lt;br /&gt;
&lt;br /&gt;
To interact with cybershake.cfg, the CyberShake codes use a Python script to deliver cybershake.cfg entries as key-value pairs, located here:&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/config.py&lt;br /&gt;
Several CyberShake codes import config, then use it to read out the cybershake.cfg file.&lt;br /&gt;
&lt;br /&gt;
=== Compiler file ===&lt;br /&gt;
&lt;br /&gt;
A long time ago, Gideon Juve created a compiler file, Compilers.mk, which contains information about which compilers should be used for which system.  This file should also be downloaded and installed in the software directory, from&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/Compilers.mk&lt;br /&gt;
&lt;br /&gt;
Some of the makefiles reference this file.  This can - and should - be updated to reflect new systems.&lt;br /&gt;
&lt;br /&gt;
== SGT-related codes ==&lt;br /&gt;
&lt;br /&gt;
[[File:SGT_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the SGT part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_SGT_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== PreCVM ===&lt;br /&gt;
&lt;br /&gt;
This code stands for &amp;quot;Pre-Community-Velocity-Model&amp;quot;.  It has to be run before the UCVM codes, since it generates input files required by UCVM.  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To determine the simulation volume for a particular CyberShake site.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreCVM queries the CyberShake database to determine all of the ruptures which fall within a given cutoff for a certain site.  From that information, padding is added around the edges to construct the CyberShake simulation volume for this site.  Additional padding so the X and Y dimensions are multiples of 10, 20, or 40 might also be applied, depending on the input parameters.  Using this volume, the X/Y offset of each grid point, and then its latitude and longitude via a great circle projection, are determined and written to output files.&lt;br /&gt;
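&lt;br /&gt;
The padding step that rounds a dimension up to a multiple can be expressed in one line; a sketch with a hypothetical helper name:&lt;br /&gt;
```python
def pad_to_multiple(n, m):
    # Extra grid points needed so that n becomes an even multiple of m
    # (e.g. rounding a dimension up to a multiple of 10, 20, or 40).
    return (-n) % m

# Example: a 1234-point dimension padded to a multiple of 40
nx = 1234
nx_padded = nx + pad_to_multiple(nx, 40)  # 1240
```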
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The CyberShake volume depth needs to be changed, so as to have the right number of grid points. That is set in the genGrid() function in GenGrid_py/gen_grid.py, in km.&lt;br /&gt;
#X and Y padding needs to be altered.  That is set using 'bound_pad' in Modelbox/get_modelbox.py, around line 70.&lt;br /&gt;
#The rotation of the simulation volume needs to be changed.  That is set using 'model_rot' in Modelbox/get_modelbox.py, around line 70.&lt;br /&gt;
#The database access parameters have changed.  That's in Modelbox/get_modelbox.py, around line 80.&lt;br /&gt;
#The divisibility needs for GPU simulations change (currently, we need the dimensions to be evenly divisible by the number of GPUs used in that dimension).  That is in Modelbox/get_modelbox.py, around line 250.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/PreCVM/&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#MySQLdb|MySQLdb for Python]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  pre_cvm.py&lt;br /&gt;
    Modelbox/get_modelbox.py&lt;br /&gt;
      Modelbox/bin/gcproj&lt;br /&gt;
    GenGrid_py/gen_grid.py&lt;br /&gt;
      GenGrid_py/bin/gen_model_cords&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the Modelbox/src and the GenGrid_py/src directories.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: pre_cvm.py [options]&lt;br /&gt;
  Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --erf_id=ERF_ID       ERF ID&lt;br /&gt;
  --modelbox=MODELBOX   Path to modelbox file (output)&lt;br /&gt;
  --gridfile=GRIDFILE   Path to gridfile (output)&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout (output)&lt;br /&gt;
  --coordfile=COORDSFILE&lt;br /&gt;
                        Path to coorfile (output)&lt;br /&gt;
  --paramsfile=PARAMSFILE&lt;br /&gt;
                        Path to paramsfile (output)&lt;br /&gt;
  --boundsfile=BOUNDSFILE&lt;br /&gt;
                        Path to boundsfile (output)&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency&lt;br /&gt;
  --gpu                 Use GPU box settings.&lt;br /&gt;
  --spacing=SPACING     Override default spacing with this value.&lt;br /&gt;
  --server=SERVER       Address of server to query in creating modelbox,&lt;br /&gt;
                        default is focal.usc.edu.&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; requires 6 minutes for 100m spacing, 10 billion point volume&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; None; inputs are retrieved from the database&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#Gridfile|gridfile]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Params|params]], [[CyberShake_Code_Base#Coord|coord]], [[CyberShake_Code_Base#Bounds|bounds]]&lt;br /&gt;
&lt;br /&gt;
=== UCVM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a populated velocity mesh for a CyberShake simulation volume.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; UCVM takes the volume defined by PreCVM and queries the [[UCVM]] software, using the C API, to populate the volume.  The resulting mesh is then checked for Vp/Vs ratio, minimum Vp/Vs/rho, and for no Infs or NaNs.  The data is output in either Graves (RWG) format or AWP format.  This code also produces log files, which will be written to the CyberShake logs directory/GenLog/site/v_mpi-&amp;lt;processor number&amp;gt;.log.  This can be useful if there's an error and you aren't sure why.&lt;br /&gt;
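&lt;br /&gt;
The sanity checks amount to per-point tests on the mesh values; a sketch of the idea (the ratio threshold is illustrative, not the actual CyberShake limit):&lt;br /&gt;
```python
import math

def point_ok(vp, vs, rho, max_ratio=4.0):
    # Sketch of the per-point checks described above: reject Infs/NaNs
    # and overly large Vp/Vs ratios.  max_ratio is an illustrative value.
    for v in (vp, vs, rho):
        if not math.isfinite(v):
            return False
    # If vp/vs exceeds max_ratio, max() returns the ratio and the test fails
    if max(vp / vs, max_ratio) != max_ratio:
        return False
    return True
```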
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New velocity models are added.  Velocity models are specified in the DAX and passed through the wrapper scripts into the C code and then ultimately to UCVM, so an if statement must be added around line 250 (and around line 450, if applicable for the no-GTL case).&lt;br /&gt;
#The backend UCVM substantially changes.  If we move to the Python implementation, for example.&lt;br /&gt;
#If additional models are added, new libraries may need to be added to the makefile.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/UCVM&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#UCVM|UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  single_exe.py&lt;br /&gt;
    single_exe.csh&lt;br /&gt;
      bin/ucvm-single-mpi&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; The makefile needs to be edited so that &amp;quot;UCVM_HOME&amp;quot; points to the UCVM home directory.  Then run 'make' in the UCVM/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;All of site, gridout, modelcords, models, and format must be specified.&lt;br /&gt;
Usage: single_exe.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout (output)&lt;br /&gt;
  --coordfile=COORDSFILE&lt;br /&gt;
                        Path to coordfile (output)&lt;br /&gt;
  --models=MODELS       Comma-separated string on velocity models to use.&lt;br /&gt;
  --format=FORMAT       Specify awp or rwg format for output.&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency&lt;br /&gt;
  --spacing=SPACING     Override default spacing with this value (km)&lt;br /&gt;
  --min_vs=MIN_VS       Override minimum Vs value.  Minimum Vp and minimum&lt;br /&gt;
                        density will be 3.4 times this value.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on ~4000 cores; for 10 billion points and the C version of UCVM, takes about 20 minutes.  Typically only half the cores per node are used to get more memory per process.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#coords|coords]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; either [[CyberShake_Code_Base#RWG_format|RWG format]] or [[CyberShake_Code_Base#AWP_format|AWP format]], depending on the option selected.&lt;br /&gt;
&lt;br /&gt;
=== Smoothing ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To smooth a velocity file along model interfaces.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The smoothing code takes in a velocity mesh, determines the surface coordinates of the interfaces between velocity models, gets a list of all the points which need to be smoothed, and then performs the smoothing by averaging in both the X and Y direction for a user-specified number of points (default of 10km in each direction).&lt;br /&gt;
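&lt;br /&gt;
The averaging itself is conceptually simple; a one-dimensional sketch of smoothing a single point over k grid points in each direction (the real smooth_mpi.c averages in both the X and Y directions over the full mesh):&lt;br /&gt;
```python
def smooth_value(values, i, k):
    # Average values over the window [i-k, i+k], clamped to the array
    # bounds.  A 1D illustration of the averaging described above.
    lo = max(i - k, 0)
    hi = min(i + k, len(values) - 1)
    window = values[lo:hi + 1]
    return sum(window) / len(window)
```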
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change our version of UCVM.  The LD_LIBRARY_PATH needs to be modified, in run_smoothing.py around line 98.&lt;br /&gt;
#The smoothing algorithm is modified.  Currently that is specified in the average_point() function in smooth_mpi.c.&lt;br /&gt;
#We start using velocity models with boundaries that aren't perpendicular to the earth's surface.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/UCVM/smoothing&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#UCVM|UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  smoothing/run_smoothing.py&lt;br /&gt;
    bin/determine_surface_model&lt;br /&gt;
    smoothing/determine_smoothing_points.py&lt;br /&gt;
    smoothing/smooth_mpi&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the smoothing directory, and make sure that determine_surface_model has been compiled in the UCVM/src directory.  You may need to change the compiler; currently it uses 'cc'.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: run_smoothing.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --gridout=GRIDOUT     gridout file&lt;br /&gt;
  --coords=COORDS       coords file&lt;br /&gt;
  --models=MODELSTRING  comma-separated list of velocity models&lt;br /&gt;
  --smoothing-dist=SMOOTHING_DIST&lt;br /&gt;
                        Number of grid points to smooth over.  About 10km of&lt;br /&gt;
                        grid points is a good starting place.&lt;br /&gt;
  --mesh=MESH           AWP-format velocity mesh to smooth&lt;br /&gt;
  --mesh-out=MESH_OUT   Output smoothed mesh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on ~1500 cores; for 5 billion points and the C version of UCVM, takes about 16 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP_format|AWP format velocity file]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Coord|coord]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP_format|AWP format]] smoothed velocity file.&lt;br /&gt;
&lt;br /&gt;
=== PreSGT ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a series of input files which are used by the wave propagation codes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreSGT determines the X and Y coordinates of the site location (where the impulse will go for the wave propagation simulation) and determines which mesh point (X and Y) maps most closely to each point on a fault surface within the cutoff.  That information is combined with an adaptive mesh approach to create a list of all the points for which SGTs should be saved.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change our approach for saving adaptive mesh points.&lt;br /&gt;
#We change the location of the rupture geometry files, currently assumed to be &amp;lt;rupture root&amp;gt;/Ruptures_erf&amp;lt;erf ID&amp;gt; .  This is specified in presgt.py, line 167.&lt;br /&gt;
#The directory hierarchy and naming scheme for rupture geometry files, currently &amp;lt;src id&amp;gt;/&amp;lt;rup id&amp;gt;/&amp;lt;src id&amp;gt;_&amp;lt;rup_id&amp;gt;.txt, changes.  This is specified in faultlist_py/CreateFaultList.py, line 36.&lt;br /&gt;
#The number of header lines in the rupture geometry file changes.  This would require changing the nheader value, currently 6, specified in faultlist_py/CreateFaultList.py, line 36.&lt;br /&gt;
#We switch to RSQSim ruptures, or other ruptures in which the geometry isn't planar.  Modifications would be required to gen_sgtgrid.c.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/PreSgt&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, heavily modified by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#libcfu|libcfu]], [[CyberShake_Code_Base#MySQLdb|MySQLdb for Python]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  presgt.py&lt;br /&gt;
    faultlist_py/CreateFaultList.py&lt;br /&gt;
    bin/gen_sgtgrid&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./presgt.py &amp;lt;site&amp;gt; &amp;lt;erf_id&amp;gt; &amp;lt;modelbox&amp;gt; &amp;lt;gridout&amp;gt; &amp;lt;model_coords&amp;gt; &amp;lt;fdloc&amp;gt; &amp;lt;faultlist&amp;gt; &amp;lt;radiusfile&amp;gt; &amp;lt;sgtcords&amp;gt; &amp;lt;spacing&amp;gt; [frequency]&lt;br /&gt;
Example: ./presgt.py USC 33 USC.modelbox gridout_USC model_coords_GC_USC USC.fdloc USC.faultlist USC.radiusfile USC.cordfile 200.0 0.1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on 8 nodes, 32 cores (gen_sgtgrid is a parallel code); for 200m spacing UCERF2, takes about 8 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Coord|coord]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#Faultlist|faultlist]], [[CyberShake_Code_Base#Radiusfile|radiusfile]], [[CyberShake_Code_Base#SgtCoords|sgtcoords]].&lt;br /&gt;
&lt;br /&gt;
=== PreAWP ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate input files in a format that AWP-ODC expects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreAWP performs a number of steps:&lt;br /&gt;
#An IN3D parameter file is produced, needed for AWP-ODC.&lt;br /&gt;
#A file with the SGT coordinates to save in AWP format is produced.  Since RWG and AWP use different coordinate systems, a coordinate transformation (X-&amp;gt;Y, Y-&amp;gt;X, zero-indexing-&amp;gt;one-indexing) is performed on the SGT coordinates file.&lt;br /&gt;
#The velocity file is translated to AWP format, if it isn't in AWP format already.&lt;br /&gt;
#The correct source, based on the dt and nt, is selected.  The source must be generated manually ahead of time.  Details about source generation are given [[CyberShake Code Base#Impulse source descriptions | here]].&lt;br /&gt;
#Striping for the output file is also set up here.&lt;br /&gt;
#Files are symlinked into the directory structure that AWP expects.  Note that slightly different versions of this exist for the CPU and GPU implementations of AWP-ODC-SGT.&lt;br /&gt;
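&lt;br /&gt;
The coordinate transformation in step 2 can be sketched directly (hypothetical function; whether the Z index is shifted the same way is an assumption of this sketch):&lt;br /&gt;
```python
def rwg_to_awp(x, y, z):
    # RWG uses zero-indexed (X, Y) coordinates; AWP swaps the horizontal
    # axes and is one-indexed.  Shifting Z by one as well is an assumption.
    return (y + 1, x + 1, z + 1)
```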
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The path to the Lustre striping command (lfs) changes.  This path is hard-coded in build_awp_inputs.py, line 14.  Note that this is the path to lfs on the compute node, NOT the login node.&lt;br /&gt;
#The AWP code changes its input format.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-HIP-SGT/utils/ (HIP GPU) or https://github.com/SCECcode/cybershake-core/AWP-GPU-SGT/utils/ (CUDA GPU) or https://github.com/SCECcode/cybershake-core/AWP-ODC-SGT/utils/ (CPU), AND also https://github.com/SCECcode/cybershake-core/SgtHead &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; SgtHead&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  build_awp_inputs.py&lt;br /&gt;
    build_IN3D.py&lt;br /&gt;
    build_src.py&lt;br /&gt;
    build_cordfile.py&lt;br /&gt;
      SgtHead/gen_awp_cordfile.py&lt;br /&gt;
    build_media.py&lt;br /&gt;
      SgtHead/bin/reformat_velocity&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the SgtHead/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: build_awp_inputs.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout input file&lt;br /&gt;
  --fdloc=FDLOC         Path to fdloc input file&lt;br /&gt;
  --cordfile=CORDFILE   Path to cordfile input file&lt;br /&gt;
  --velocity-prefix=VEL_PREFIX&lt;br /&gt;
                        RWG velocity prefix.  If omitted, will not reformat&lt;br /&gt;
                        velocity file, just symlink.&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency of SGT run, 0.5 Hz by default.&lt;br /&gt;
  --px=PX               Number of processors in X-direction.&lt;br /&gt;
  --py=PY               Number of processors in Y-direction.&lt;br /&gt;
  --pz=PZ               Number of processors in Z-direction.&lt;br /&gt;
  --source-frequency=SOURCE_FREQ&lt;br /&gt;
                        Low-pass filter frequency to use on the source,&lt;br /&gt;
                        default is same frequency as the frequency of the run.&lt;br /&gt;
  --spacing=SPACING     Override default spacing, derived from frequency.&lt;br /&gt;
  --velocity-mesh=VEL_MESH&lt;br /&gt;
                        Provide path to velocity mesh.  If omitted, will&lt;br /&gt;
                        assume mesh is named awp.&amp;lt;site&amp;gt;.media.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 1 Hz run, takes about 11 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#SgtCoords|cordfile]], velocity mesh (if in [[CyberShake_Code_Base#RWG_format|RWG format]], will be converted to [[CyberShake_Code_Base#AWP_format|AWP]]), [[CyberShake_Code_Base#RWG_source|RWG source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP source|AWP source]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]].&lt;br /&gt;
&lt;br /&gt;
=== AWP-ODC-SGT, CPU version ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform SGT synthesis.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; AWP-ODC-SGT is the CPU version. It uses the IN3D file for its parameters.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New science or features are added to the AWP code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-ODC-SGT&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kim Olsen, Steve Day, Yifeng Cui, various students and post-docs, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; iobuf module&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  awp_odc_wrapper.sh&lt;br /&gt;
    bin/pmcl3d&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Using the GNU compilers, run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt; pmcl3d &amp;lt;IN3D parameter file&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel; for a 0.5 Hz run (2 billion points, 20k timesteps), takes about 45 minutes on 10,000 cores.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP source|AWP source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]].&lt;br /&gt;
&lt;br /&gt;
=== AWP-ODC-SGT, CUDA GPU version ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform SGT synthesis.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; AWP-ODC-SGT is the GPU version. It takes parameters on the command-line, so the wrapper converts the IN3D file into command-line arguments and invokes it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New science or features are added to the AWP code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-GPU-SGT&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kim Olsen, Steve Day, Yifeng Cui, various students and post-docs, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; CUDA toolkit module&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  gpu_wrapper.py&lt;br /&gt;
    bin/pmcl3d&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; The PrgEnv-gnu and cudatoolkit modules must be loaded first.  Then, run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./pmcl3d &lt;br /&gt;
Options:&lt;br /&gt;
	[(-T | --TMAX) &amp;lt;TMAX&amp;gt;]&lt;br /&gt;
	[(-H | --DH) &amp;lt;DH&amp;gt;]&lt;br /&gt;
	[(-t | --DT) &amp;lt;DT&amp;gt;]&lt;br /&gt;
	[(-A | --ARBC) &amp;lt;ARBC&amp;gt;]&lt;br /&gt;
	[(-P | --PHT) &amp;lt;PHT&amp;gt;]&lt;br /&gt;
	[(-M | --NPC) &amp;lt;NPC&amp;gt;]&lt;br /&gt;
	[(-D | --ND) &amp;lt;ND&amp;gt;]&lt;br /&gt;
	[(-S | --NSRC) &amp;lt;NSRC&amp;gt;]&lt;br /&gt;
	[(-N | --NST) &amp;lt;NST&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-V | --NVE) &amp;lt;NVE&amp;gt;]&lt;br /&gt;
	[(-B | --MEDIASTART) &amp;lt;MEDIASTART&amp;gt;]&lt;br /&gt;
	[(-n | --NVAR) &amp;lt;NVAR&amp;gt;]&lt;br /&gt;
	[(-I | --IFAULT) &amp;lt;IFAULT&amp;gt;]&lt;br /&gt;
	[(-R | --READ_STEP) &amp;lt;READ_STEP&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-X | --NX) &amp;lt;x length&amp;gt;]&lt;br /&gt;
	[(-Y | --NY) &amp;lt;y length&amp;gt;]&lt;br /&gt;
	[(-Z | --NZ) &amp;lt;z length&amp;gt;]&lt;br /&gt;
	[(-x | --NPX) &amp;lt;x processors&amp;gt;]&lt;br /&gt;
	[(-y | --NPY) &amp;lt;y processors&amp;gt;]&lt;br /&gt;
	[(-z | --NPZ) &amp;lt;z processors&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-1 | --NBGX) &amp;lt;starting point to record in X&amp;gt;]&lt;br /&gt;
	[(-2 | --NEDX) &amp;lt;ending point to record in X&amp;gt;]&lt;br /&gt;
	[(-3 | --NSKPX) &amp;lt;skipping points to record in X&amp;gt;]&lt;br /&gt;
	[(-11 | --NBGY) &amp;lt;starting point to record in Y&amp;gt;]&lt;br /&gt;
	[(-12 | --NEDY) &amp;lt;ending point to record in Y&amp;gt;]&lt;br /&gt;
	[(-13 | --NSKPY) &amp;lt;skipping points to record in Y&amp;gt;]&lt;br /&gt;
	[(-21 | --NBGZ) &amp;lt;starting point to record in Z&amp;gt;]&lt;br /&gt;
	[(-22 | --NEDZ) &amp;lt;ending point to record in Z&amp;gt;]&lt;br /&gt;
	[(-23 | --NSKPZ) &amp;lt;skipping points to record in Z&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-i | --IDYNA) &amp;lt;i IDYNA&amp;gt;]&lt;br /&gt;
	[(-s | --SoCalQ) &amp;lt;s SoCalQ&amp;gt;]&lt;br /&gt;
	[(-l | --FL) &amp;lt;l FL&amp;gt;]&lt;br /&gt;
	[(-h | --FH) &amp;lt;i FH&amp;gt;]&lt;br /&gt;
	[(-p | --FP) &amp;lt;p FP&amp;gt;]&lt;br /&gt;
	[(-r | --NTISKP) &amp;lt;time skipping in writing&amp;gt;]&lt;br /&gt;
	[(-W | --WRITE_STEP) &amp;lt;time aggregation in writing&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-100 | --INSRC) &amp;lt;source file&amp;gt;]&lt;br /&gt;
	[(-101 | --INVEL) &amp;lt;mesh file&amp;gt;]&lt;br /&gt;
	[(-o | --OUT) &amp;lt;output file&amp;gt;]&lt;br /&gt;
	[(-c | --CHKFILE) &amp;lt;checkpoint file to write statistics&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
	[(-G | --IGREEN) &amp;lt;IGREEN for SGT&amp;gt;]&lt;br /&gt;
	[(-200 | --NTISKP_SGT) &amp;lt;NTISKP for SGT&amp;gt;]&lt;br /&gt;
	[(-201 | --INSGT) &amp;lt;SGT input file&amp;gt;]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel; for a 1 Hz run (10 billion points, 40k timesteps), takes about 55 minutes on 800 GPUs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP source|AWP source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]].&lt;br /&gt;
&lt;br /&gt;
=== PostAWP ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To prepare the AWP results for use in post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PostAWP prepares the outputs of AWP so that they can be used with the RWG-authored post-processing code.  Specifically, it undoes the AWP coordinate transformation and reformats the AWP output files into the SGT component order expected by RWG (XX-&amp;gt;YY, YY-&amp;gt;XX, XZ-&amp;gt;-YZ, YZ-&amp;gt;-XZ, and all SGTs are doubled if we are calculating the Z-component), creates separate SGT header files, and calculates MD5 sums on the SGT files.  Calculating the header information requires a number of input files, since lambda, mu, and the location of the impulse must all be included.  The MD5 sums can be calculated separately, using the MD5 wrapper RunMD5sum. &lt;br /&gt;
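The component reordering described above can be sketched as follows. This is a toy illustration under stated assumptions: the SGTs are represented as an in-memory dict of component name to sample list (the real files are binary), and the arrow notation "XZ-&gt;-YZ" is read as "the RWG XZ component is the negated AWP YZ component"; consult the real prepare_for_pp.py for the authoritative mapping.

```python
# Sketch of the AWP -> RWG SGT component reordering described above.
# The dict-of-lists representation and the exact direction of the
# XZ/YZ swap are assumptions for illustration.

def awp_to_rwg(sgt, z_component=False):
    """sgt: dict mapping component name ('xx', 'yy', ...) -> sample list."""
    out = {
        "xx": sgt["yy"],               # XX and YY are swapped
        "yy": sgt["xx"],
        "zz": sgt["zz"],
        "xy": sgt["xy"],
        "xz": [-v for v in sgt["yz"]], # XZ/YZ are swapped and negated
        "yz": [-v for v in sgt["xz"]],
    }
    if z_component:
        # All SGTs are doubled when calculating the Z component.
        out = {k: [2.0 * v for v in vals] for k, vals in out.items()}
    return out
```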
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The AWP code is modified to produce outputs in exactly RWG order&lt;br /&gt;
#The header format for the post-processing code changes&lt;br /&gt;
#We decide not to calculate MD5 sums&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-HIP-SGT/utils/prepare_for_pp.py (this will work for the CPU version of AWP also, despite the path); https://github.com/SCECcode/cybershake-core/SgtHead&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  AWP-GPU-SGT/utils/prepare_for_pp.py&lt;br /&gt;
    SgtHead/bin/reformat_awp_mpi&lt;br /&gt;
    SgtHead/bin/write_head&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make write_head' and 'make reformat_awp_mpi' in the SgtHead/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./prepare_for_pp.py &amp;lt;site&amp;gt; &amp;lt;AWP SGT&amp;gt; &amp;lt;reformatted SGT filename&amp;gt; &amp;lt;modelbox file&amp;gt; &amp;lt;rwg cordfile&amp;gt; &amp;lt;fdloc file&amp;gt; &amp;lt;gridout file&amp;gt; &amp;lt;IN3D file&amp;gt; &amp;lt;AWP media file&amp;gt; &amp;lt;component&amp;gt; &amp;lt;run_id&amp;gt; &amp;lt;header&amp;gt; [frequency]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel, 4 processors on 2 nodes; for a 750 GB SGT, takes about 100 minutes &amp;lt;b&amp;gt;without&amp;lt;/b&amp;gt; the MD5 sums.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]], [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#SgtCoords|RWG cordfile]], [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT file]], [[CyberShake_Code_Base#SGT header file|SGT header file]]&lt;br /&gt;
&lt;br /&gt;
=== RunMD5sum ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Wrapper for performing MD5sums.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; On Titan, we ran into wallclock issues when bundling the MD5sums along with PostAWP.  This wrapper supports performing the MD5 sums separately.&lt;br /&gt;
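The wrapper's job can be sketched in a few lines of Python. This is a minimal sketch, not the real run_md5sum.sh: it assumes the output convention stated below (a "&lt;digest&gt;  &lt;filename&gt;" line written to &lt;file&gt;.md5, matching md5sum's format) and reads the file in chunks since SGTs can be hundreds of GB.

```python
import hashlib

# Sketch of an MD5 wrapper like run_md5sum.sh: hash a large file in
# chunks and write "<file>.md5".  The exact output format of the real
# script is an assumption (md5sum's "<digest>  <name>" convention).

def write_md5(path, chunk_size=16 * 1024 * 1024):
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read in fixed-size chunks so memory use stays bounded.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    digest = h.hexdigest()
    with open(path + ".md5", "w") as out:
        out.write("%s  %s\n" % (digest, path))
    return digest
```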
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change hash algorithms&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/SgtHead/run_md5sum.sh&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  run_md5sum.sh&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./run_md5sum.sh &amp;lt;file&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 70 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; MD5sum, with filename &amp;lt;RWG SGT filename&amp;gt;.md5&lt;br /&gt;
&lt;br /&gt;
=== NanCheck ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Check the SGTs for anomalies before the post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code checks to be sure the SGTs are the expected size, then checks for NaNs or too many consecutive zeros in the SGT files.&lt;br /&gt;
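The NaN and zero-run scan can be sketched as below. The zero-run threshold and the in-memory sample representation are illustrative assumptions; the real check_for_nans operates on the binary SGT files directly.

```python
import math

# Sketch of the NanCheck scan described above: flag NaNs and runs of
# consecutive zeros longer than a threshold.  The threshold value and
# the list-of-floats input are assumptions for illustration.

def check_samples(samples, max_zero_run=10):
    """Return a description of the first anomaly found, or None if clean."""
    zero_run = 0
    for i, v in enumerate(samples):
        if math.isnan(v):
            return "NaN at sample %d" % i
        if v == 0.0:
            zero_run += 1
            if zero_run > max_zero_run:
                return "%d consecutive zeros ending at sample %d" % (zero_run, i)
        else:
            zero_run = 0
    return None
```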
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the number of timesteps in the SGT file.  Currently this is hardcoded, but it should be a command-line parameter.&lt;br /&gt;
#We want to add additional checks.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/SgtTest/&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  perform_checks.py&lt;br /&gt;
    bin/check_for_nans&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in SgtTest/src .&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./perform_checks.py &amp;lt;SGT file&amp;gt; &amp;lt;SGT header file&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 45 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]], [[CyberShake_Code_Base#SgtCoords|RWG coordinate file]], [[CyberShake_Code_Base#IN3D | IN3D file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
== PP-related codes ==&lt;br /&gt;
&lt;br /&gt;
The following codes are related to the post-processing part of the workflow.&lt;br /&gt;
&lt;br /&gt;
[[File:PP_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the PP part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_PP_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== CheckSgt ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To check the MD5 sums of the SGT files to be sure they match.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; CheckSgt takes the SGT files and their corresponding MD5 sums and checks for agreement.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change hashing algorithms.&lt;br /&gt;
#We decide to add additional sanity checks to the beginning of the post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/CheckSgt&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  CheckSgt.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./CheckSgt.py &amp;lt;sgt file&amp;gt; &amp;lt;md5 file&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 90 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT]], SGT MD5 sums&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
=== DirectSynth ===&lt;br /&gt;
&lt;br /&gt;
DirectSynth is the code we currently use to perform the post-processing.  For historical reasons, all of the codes used for CyberShake post-processing are documented here: [https://scec.usc.edu/it/Post-processing_options CyberShake post-processing options] (login required).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform reciprocity calculations and produce seismograms, intensity measures, and duration measures.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; DirectSynth reads in the SGTs across a group of processes, and hands out tasks (synthesis jobs) to worker processes.  These worker processes read in rupture geometry information from disk and call the RupGen-api to generate full slip histories in memory.  The workers request SGTs from the reader processes over MPI. X and Y component PSA calculations are performed from the resultant seismograms, and RotD and duration calculations are also performed, if requested.  More details about the approach used are available at [[DirectSynth]].&lt;br /&gt;
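The reader/worker task-handout pattern described above can be illustrated with a toy sketch. This is emphatically not the DirectSynth MPI implementation: it substitutes threads and a shared queue for MPI ranks, and the synthesize callback stands in for the real rupture-generation and SGT-request steps.

```python
import queue
import threading

# Toy sketch of the task-handout pattern described above, with threads
# and a queue standing in for MPI ranks.  "synthesize" is a placeholder
# for the real rupture-generation / SGT-request / seismogram step.

def run_tasks(ruptures, synthesize, n_workers=4):
    tasks = queue.Queue()
    for rup in ruptures:
        tasks.put(rup)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                rup = tasks.get_nowait()
            except queue.Empty:
                return  # no more tasks; worker exits
            res = synthesize(rup)  # would request SGTs and synthesize here
            with lock:
                results.append(res)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```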
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We have new intensity measures or other calculations per seismogram to perform.&lt;br /&gt;
#We decide to change the post-processing algorithm.&lt;br /&gt;
#The wrapper needs to be modified if we want to set different custom environment variables.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/DirectSynth&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, original seismogram synthesis code by Rob Graves, X and Y component PSA code by David Okaya, RotD code by Christine Goulet&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake Code Base#Getpar | Getpar]], [[CyberShake Code Base#libcfu | libcfu]], [[CyberShake Code Base#RupGen-api-v3.3.1 | RupGen-api-v3.3.1]], [[CyberShake Code Base#FFTW | FFTW]], [[CyberShake Code Base#libmemcached | libmemcached]] (optional) and [[CyberShake Code Base#memcached | memcached]] (optional)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  direct_synth_v3.3.1.py (current version, uses the Graves &amp;amp; Pitarka (2014) rupture generator)&lt;br /&gt;
    utils/pegasus_wrappers/invoke_memcached.sh&lt;br /&gt;
      memcached&lt;br /&gt;
    bin/direct_synth  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; &lt;br /&gt;
#Compile RupGen-api first.&lt;br /&gt;
#Edit the makefile in DirectSynth/src .  Check the following variables:&lt;br /&gt;
##BASE_DIR should point to the top-level CyberShake install directory&lt;br /&gt;
##LIBCFU should point to the libcfu install directory&lt;br /&gt;
##V3_3_1_RG_LIB should point to the RupGen-api-3.3.1/lib directory&lt;br /&gt;
##LDLIBS should have the correct paths to the libcfu and libmemcached lib directories&lt;br /&gt;
##V3_3_1_RG_INC should point to the RupGen-api-3.3.1/include directory&lt;br /&gt;
##IFLAGS should have the correct paths to the libcfu and libmemcached include directories&lt;br /&gt;
#Run 'make direct_synth_v3.3.1' in DirectSynth/src.&lt;br /&gt;
&lt;br /&gt;
You will also need to edit the hard-coded paths to memcached in direct_synth_v3.3.1.py, in lines 15 and 24.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
direct_synth_v3.3.1.py&lt;br /&gt;
 stat=&amp;lt;site short name&amp;gt;&lt;br /&gt;
 slat=&amp;lt;site lat&amp;gt;&lt;br /&gt;
 slon=&amp;lt;site lon&amp;gt;&lt;br /&gt;
 run_id=&amp;lt;run id&amp;gt;&lt;br /&gt;
 sgt_handlers=&amp;lt;number of SGT handler processes; must be enough for the SGTs to be read into memory&amp;gt;&lt;br /&gt;
 debug=&amp;lt;print logs for each process; 1 is yes, 0 no&amp;gt;&lt;br /&gt;
 max_buf_mb=&amp;lt;buffer size in MB for each worker to use for storing SGT information&amp;gt;&lt;br /&gt;
 rupture_spacing=&amp;lt;'uniform' or 'random' hypocenter spacing&amp;gt;&lt;br /&gt;
 ntout=&amp;lt;nt for seismograms&amp;gt;&lt;br /&gt;
 dtout=&amp;lt;dt for seismograms&amp;gt;&lt;br /&gt;
 rup_list_file=&amp;lt;input file containing ruptures to process&amp;gt;&lt;br /&gt;
 sgt_xfile=&amp;lt;input SGT X file&amp;gt;&lt;br /&gt;
 sgt_yfile=&amp;lt;input SGT Y file&amp;gt;&lt;br /&gt;
 x_header=&amp;lt;input SGT X header&amp;gt;&lt;br /&gt;
 y_header=&amp;lt;input SGT Y header&amp;gt;&lt;br /&gt;
 det_max_freq=&amp;lt;maximum frequency of deterministic part&amp;gt;&lt;br /&gt;
 stoch_max_freq=&amp;lt;maximum frequency of stochastic part&amp;gt;&lt;br /&gt;
 run_psa=&amp;lt;'1' to run X and Y component PSA, '0' to not&amp;gt;&lt;br /&gt;
 run_rotd=&amp;lt;'1' to run RotD calculations, '0' to not&amp;gt;&lt;br /&gt;
 run_durations=&amp;lt;'1' to run duration calculation, '0' to not&amp;gt;&lt;br /&gt;
 simulation_out_pointsX=&amp;lt;'2', the number of components&amp;gt;&lt;br /&gt;
 simulation_out_pointsY=1&lt;br /&gt;
 simulation_out_timesamples=&amp;lt;same as ntout&amp;gt;&lt;br /&gt;
 simulation_out_timeskip=&amp;lt;same as dtout&amp;gt;&lt;br /&gt;
 surfseis_rspectra_seismogram_units=cmpersec&lt;br /&gt;
 surfseis_rspectra_output_units=cmpersec2&lt;br /&gt;
 surfseis_rspectra_output_type=aa&lt;br /&gt;
 surfseis_rspectra_period=all&lt;br /&gt;
 surfseis_rspectra_apply_filter_highHZ=&amp;lt;high filter, 5.0 for 1 Hz runs, 20.0 or higher for 10 Hz runs&amp;gt;&lt;br /&gt;
 surfseis_rspectra_apply_byteswap=no&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel, typically on 3840 processors; for 750 GB SGTs with ~7000 ruptures, takes about 12 hours.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT]], [[CyberShake_Code_Base#SGT header file|SGT headers]], [[CyberShake_Code_Base#rupture list file|rupture list file]], [[CyberShake_Code_Base#rupture variation info file|rupture variation info file]], [[CyberShake_Rupture_Files|rupture geometry files]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | Seismograms]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
== Data Product Codes ==&lt;br /&gt;
&lt;br /&gt;
The software in this section takes the data products produced by the SGT and post-processing stages, loads some of them into the database, and creates final data products.  Note that all these codes should be installed on a server close to the database, to reduce insertion and query time.  Currently these are all installed on SCEC disks and accessed from shock.usc.edu.&lt;br /&gt;
&lt;br /&gt;
[[File:Data_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the data product of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_data_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== Load Amps ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Load data from output files into the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code loads either PSA, RotD, or Duration data into the database, depending on command-line options.  It also performs sanity checks on the PSA data being inserted: values must be between 0.008 and 8400 cm/sec^2, although values below 0.008 are still accepted for small-magnitude events at large distances.  If this constraint is violated, it will abort.  Note that if LoadAmps needs to be rerun, sometimes the database must be cleaned out first, as data from the previous attempt may have been inserted successfully and will cause duplicate key errors if you try to insert the same data again.&lt;br /&gt;
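The sanity check on PSA values can be sketched as below. The 0.008 and 8400 cm/sec^2 bounds come from the text; the magnitude and distance cutoffs for the small-event carve-out are invented for illustration, since the real thresholds live in the loader.

```python
# Sketch of the LoadAmps PSA sanity check described above.  The bounds
# (0.008 and 8400 cm/sec^2) are from the documentation; the
# small-magnitude / large-distance thresholds below are hypothetical.

def psa_value_ok(value, magnitude=None, distance_km=None,
                 small_mag=5.0, far_km=100.0):
    """Return True if a PSA value passes the insertion sanity check."""
    if value > 8400.0:
        return False
    if value < 0.008:
        # Very small values are allowed only for small, distant events.
        return (magnitude is not None and distance_km is not None
                and magnitude < small_mag and distance_km > far_km)
    return True
```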
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the sanity checks on data inserts.&lt;br /&gt;
#We modify the format of the PSA, RotD, or Duration files.&lt;br /&gt;
#We add new types of data to insert.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#We add a new server.  To add a new server, in addition to providing a command-line option for it, you will need to create a Hibernate config file.  You can start with moment.cfg.xml or focal.cfg.xml and edit lines 7-16 appropriately.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/CyberCommands&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Joshua Garcia, Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; LoadAmps calls CyberCommands, a Java code with a long list of dependencies (all of these are checked into the Java project):&lt;br /&gt;
&lt;br /&gt;
*Ant&lt;br /&gt;
*Apache Commons&lt;br /&gt;
*Hibernate&lt;br /&gt;
*MySQL bindings&lt;br /&gt;
*Xerces&lt;br /&gt;
*DOM4J&lt;br /&gt;
*Log4J&lt;br /&gt;
*Java 1.6+&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  insert_dir.sh&lt;br /&gt;
    CyberLoadAmps_SC&lt;br /&gt;
      cybercommands_SC.jar&lt;br /&gt;
        CyberLoadamps.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Check out CyberCommands into Eclipse.  Create the cybercommands_SC.jar file using Eclipse's JAR build framework and the cybercommands_SC.jardesc description file.  Install cybercommands_SC.jar and the required JAR files on the server.  Point insert_dir.sh to CyberLoadAmps_SC, and CyberLoadAmps_SC to cybercommands_SC.jar.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: CyberLoadAmps [-r | -d | -z | -u][-c] [-d] [-periods periods] [-run RunID] [-p directory] [-server name] [-z] [-help] [-i insertion_values]&lt;br /&gt;
       [-u] [-f]&lt;br /&gt;
 -i &amp;lt;insertion_values&amp;gt;   Which values to insert -&lt;br /&gt;
                         gm:	geometric mean PSA data (default)&lt;br /&gt;
                         xy:	X and Y component PSA data&lt;br /&gt;
                         gmxy:  Geometric mean and X and Y components&lt;br /&gt;
 -run &amp;lt;RunID&amp;gt;            Run ID - this option is required&lt;br /&gt;
 -p &amp;lt;directory&amp;gt;          file path with spectral acceleration files,&lt;br /&gt;
                         either top-level directory or zip file - this option is required&lt;br /&gt;
 -server &amp;lt;name&amp;gt;          server name (focal, surface, intensity, moment,&lt;br /&gt;
                         or csep-x) - this option is required&lt;br /&gt;
 -periods &amp;lt;periods&amp;gt;      Comma-delimited periods to insert&lt;br /&gt;
 -c                      Convert values from g to cm/sec^2&lt;br /&gt;
 -d                      Assume one BSA file per rupture, with embedded&lt;br /&gt;
                         header information.&lt;br /&gt;
 -f                      Don't apply value checks to insertion values; use&lt;br /&gt;
                         with care!.&lt;br /&gt;
 -help                   print this message&lt;br /&gt;
 -r                      Read rotd files (instead of bsa.)&lt;br /&gt;
 -u                      Read duration files (instead of bsa.)&lt;br /&gt;
 -z                      Read zip files instead of bsa.&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for 5 periods, takes about 10 minutes.  The runtime is highly dependent on database load and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
=== Check DB Site ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Verify that data was correctly loaded into the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a list of components (or type IDs) to check for a run ID, and verifies that there is one entry for every rupture variation.  If some rupture variations are missing, a file is produced which lists the missing source, rupture, rupture variation tuples.&lt;br /&gt;
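The completeness check reduces to set arithmetic over (source, rupture, rupture variation) tuples, which can be sketched as below. The tuple representation is illustrative; the real code builds the expected set from database queries.

```python
# Sketch of the Check DB Site completeness test described above:
# compare the expected (source_id, rupture_id, rup_var_id) tuples
# against what was actually inserted, and report anything missing.
# The tuple inputs stand in for the real database query results.

def find_missing(expected, inserted):
    """Return a sorted list of expected tuples absent from inserted."""
    return sorted(set(expected) - set(inserted))
```

In the real tool, a non-empty result would be written to the missing variations file, one source/rupture/rupture-variation tuple per line.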
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/db/CheckDBDataForSite.java and DBConnect.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan (CheckDBDataForSite.java), Nitin Gupta, Vipin Gupta, Phil Maechling (DBConnect.java)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Both are checked into the CyberShake project:&lt;br /&gt;
&lt;br /&gt;
*Apache Commons&lt;br /&gt;
*MySQL bindings&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  check_db.sh&lt;br /&gt;
    CheckDBDataForSite.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Check out CheckDBDataForSite.java and DBConnect.java.  Compile them by running 'javac -classpath mysql-connector-java-5.0.5-bin.jar:commons-cli-1.0.jar DBConnect.java CheckDBDataForSite.java'.  The paths to the MySQL bindings jar and the Apache Commons jar may be different depending on your installation.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;usage: CheckDBDataForSite&lt;br /&gt;
 -p &amp;lt;periods&amp;gt;     Comma-separated list of periods to check, for geometric&lt;br /&gt;
                  and rotd.&lt;br /&gt;
 -t &amp;lt;type_ids&amp;gt;    Comma-separated list of type IDs to check, for duration.&lt;br /&gt;
 -c &amp;lt;component&amp;gt;   Component type (geometric, rotd, duration) to check.&lt;br /&gt;
 -h,--help        Print help for CheckDBDataForSite&lt;br /&gt;
 -o &amp;lt;output&amp;gt;      Path to output file, if something is missing (required).&lt;br /&gt;
 -r &amp;lt;run_id&amp;gt;      Run ID to check (required).&lt;br /&gt;
 -s &amp;lt;server&amp;gt;      DB server to query against.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; typically takes just a few seconds, though runtime is highly dependent on database load and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Missing variations file | Missing variations file]]&lt;br /&gt;
&lt;br /&gt;
=== DB Report ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Produce a database report, a data product which Rob Graves used for a time.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, queries the database for PSA values for all components, and writes the output to a text file.  The list of periods and the DB config parameters are specified in an XML config file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed: the DB connection parameters in default.xml would need to be edited.&lt;br /&gt;
#We want results for different periods: edit default.xml.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/reports/db_report_gen.py . default.xml in the same directory is also needed; it can be generated by editing and running conf_gen.py, also in the same directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; MySQLdb&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  db_report_gen.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None, all code is Python.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: db_report_gen.py [options] SITE_SHORT_NAME&lt;br /&gt;
&lt;br /&gt;
NOTE: defaults are loaded from defaults.xml and can be edited manually&lt;br /&gt;
	  or overridden with conf_gen.py&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  -e ERF_ID, --erfID=ERF_ID&lt;br /&gt;
                        ERF ID for Report (default = none)&lt;br /&gt;
  -f FILENAME, --file=FILENAME&lt;br /&gt;
                        Store Results to a file instead of STDOUT. If a&lt;br /&gt;
                        directory is given, a name will be auto generated.&lt;br /&gt;
  -i, --id              Flag for specifying site ID instead of Short Name&lt;br /&gt;
                        (default uses Short Name)&lt;br /&gt;
  --hypo, --hpyocenter  Flag for appending hypocenter locations to result&lt;br /&gt;
  -l LIMIT, --limit=LIMIT&lt;br /&gt;
                        Limit the total number of rusults, or 0 for no limit&lt;br /&gt;
                        (default = 0)&lt;br /&gt;
  -o, --sort            SLOW: Force SQL Order By statement for sorting. It&lt;br /&gt;
                        will probably come out sorted, but if it doesn't, you&lt;br /&gt;
                        can use this. (default will not sort)&lt;br /&gt;
  -p PERIODS, --periods=PERIODS&lt;br /&gt;
                        Comma separated period values (default = 3.0,5.0,10.0)&lt;br /&gt;
  --pr, --print-runs    Print run IDs for site and optionally ERF/Rup Var&lt;br /&gt;
                        Scen/SGT Var IDs&lt;br /&gt;
  -r RUP_VAR_SCENARIO_ID, --rupVarID=RUP_VAR_SCENARIO_ID&lt;br /&gt;
                        Rupture Variation Scenario ID for Report (default =&lt;br /&gt;
                        none)&lt;br /&gt;
  --ri=RUN_ID, --runID=RUN_ID&lt;br /&gt;
                        Allows you to specify a run ID to use (default uses&lt;br /&gt;
                        latest compatible run ID)&lt;br /&gt;
  -R RUPTURE, --rupture=RUPTURE&lt;br /&gt;
                        Only give information on specified rupture. Must be&lt;br /&gt;
                        acompanied by -S/--source flag (default shows all&lt;br /&gt;
                        ruptures)&lt;br /&gt;
  -s SGT_VAR_ID, --sgtVarID=SGT_VAR_ID&lt;br /&gt;
                        SGT Variation ID for Report (default = none)&lt;br /&gt;
  -S SOURCE, --source=SOURCE&lt;br /&gt;
                        Only give information on specified source. To specify&lt;br /&gt;
                        rupture, see -R option (default shows all sources)&lt;br /&gt;
  --s_im, --sort-ims    Sort output by IM value (increasing)...may be slow!&lt;br /&gt;
  -v, --verbose         Verbosity Flag (default = False)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; about 1 minute, though runtime is highly dependent on database load and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; default.xml&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#DB Report file | DB Report file]]&lt;br /&gt;
&lt;br /&gt;
=== Curve Calc ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Calculate CyberShake hazard curves alongside comparison GMPEs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, component, and period, queries the database for the appropriate IM values, and calculates a hazard curve in the desired format. Comparison GMPE curves can also be plotted.&lt;br /&gt;
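&lt;br /&gt;
Conceptually, the hazard curve combines each rupture's annual probability with the fraction of its rupture variations whose IM value exceeds a given level, via the total-probability theorem.  A simplified Python sketch of this idea (the data layout and function names are illustrative assumptions, not the HazardCurvePlotter internals):&lt;br /&gt;

```python
# Simplified total-probability hazard-curve sketch.
# ruptures: list of (annual_prob, [IM value for each rupture variation]).

def exceedance_prob(ruptures, iml):
    """Annual probability that ground motion exceeds iml at the site."""
    p_no_exceed = 1.0
    for annual_prob, im_values in ruptures:
        # Fraction of this rupture's variations exceeding the level,
        # treating each variation as equally likely.
        frac = sum(1 for im in im_values if im >= iml) / len(im_values)
        p_no_exceed *= (1.0 - annual_prob * frac)
    return 1.0 - p_no_exceed

# Toy data: two ruptures with made-up probabilities and IM values.
ruptures = [(0.01, [0.2, 0.5, 0.9]), (0.002, [1.1, 1.4])]
curve = {iml: exceedance_prob(ruptures, iml) for iml in (0.1, 0.5, 1.0)}
```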
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#New IM types need to be supported.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; The CyberShake curve calculator is part of the OpenSHA codebase.  The specific Java class is org.opensha.sha.cybershake.plot.HazardCurvePlotter (available via https://github.com/opensha/opensha-cybershake/tree/master/src/main/java/org/opensha/sha/cybershake/plot), but it has a complex set of Java dependencies.  To compile and run, you should follow the instructions on http://www.opensha.org/trac/wiki/SettingUpEclipse to access the source.  The curve calculator is also wrapped by curve_plot_wrapper.sh, in https://github.com/SCECcode/cybershake-tools/blob/master/HazardCurveGeneration/curve_plot_wrapper.sh .&lt;br /&gt;
&lt;br /&gt;
The OpenSHA project also has configuration files for various GMPEs, config files for UCERF2, and configuration files for output formats preferred by Tom and Rob, in src/org/opensha/sha/cybershake/conf.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Standard OpenSHA dependencies&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  curve_plot_wrapper.sh&lt;br /&gt;
    HazardCurvePlotter.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Use the OpenSHA build process if building from source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
usage: HazardCurvePlotter [-?] [-af &amp;lt;arg&amp;gt;] [-benchmark] [-c] [-cmp &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-comp &amp;lt;arg&amp;gt;] [-cvmvs] [-e &amp;lt;arg&amp;gt;] [-ef &amp;lt;arg&amp;gt;] [-f] [-fvs &amp;lt;arg&amp;gt;] [-h&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-imid &amp;lt;arg&amp;gt;] [-imt &amp;lt;arg&amp;gt;] [-n] [-novm] [-o &amp;lt;arg&amp;gt;] [-p&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-pf &amp;lt;arg&amp;gt;] [-pl &amp;lt;arg&amp;gt;] [-R &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;] [-s &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-sgt &amp;lt;arg&amp;gt;] [-sgtsym] [-t &amp;lt;arg&amp;gt;] [-v &amp;lt;arg&amp;gt;] [-vel &amp;lt;arg&amp;gt;] [-w&lt;br /&gt;
       &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                            Display this message&lt;br /&gt;
 -af,--atten-rel-file &amp;lt;arg&amp;gt;           XML Attenuation Relationship&lt;br /&gt;
                                      description file(s) for comparison.&lt;br /&gt;
                                      Multiple files should be comma&lt;br /&gt;
                                      separated&lt;br /&gt;
 -benchmark,--benchmark-test-recalc   Forces recalculation of hazard&lt;br /&gt;
                                      curves to test calculation speed.&lt;br /&gt;
                                      Newly recalculated curves are not&lt;br /&gt;
                                      kept and the original curves are&lt;br /&gt;
                                      plotted.&lt;br /&gt;
 -c,--calc-only                       Only calculate and insert the&lt;br /&gt;
                                      CyberShake curves, don't make plots.&lt;br /&gt;
                                      If a curve already exists, it will&lt;br /&gt;
                                      be skipped.&lt;br /&gt;
 -cmp,--component &amp;lt;arg&amp;gt;               Intensity measure component.&lt;br /&gt;
                                      Options: GEOM,X,Y,RotD100,RotD50,&lt;br /&gt;
                                      Default: GEOM&lt;br /&gt;
 -comp,--compare-to &amp;lt;arg&amp;gt;             Compare to  aspecific Run ID (or&lt;br /&gt;
                                      multiple IDs, comma separated)&lt;br /&gt;
 -cvmvs,--cvm-vs30                    Option to use Vs30 value from the&lt;br /&gt;
                                      velocity model itself in GMPE&lt;br /&gt;
                                      calculations rather than, for&lt;br /&gt;
                                      example, the Wills 2006 value.&lt;br /&gt;
 -e,--erf-id &amp;lt;arg&amp;gt;                    ERF ID&lt;br /&gt;
 -ef,--erf-file &amp;lt;arg&amp;gt;                 XML ERF description file for&lt;br /&gt;
                                      comparison&lt;br /&gt;
 -f,--force-add                       Flag to add curves to db without&lt;br /&gt;
                                      prompt&lt;br /&gt;
 -fvs,--force-vs30 &amp;lt;arg&amp;gt;              Option to force the given Vs30 value&lt;br /&gt;
                                      to be used in GMPE calculations.&lt;br /&gt;
 -h,--height &amp;lt;arg&amp;gt;                    Plot height (default = 500)&lt;br /&gt;
 -imid,--im-type-id &amp;lt;arg&amp;gt;             Intensity measure type ID. If not&lt;br /&gt;
                                      supplied, will be detected from im&lt;br /&gt;
                                      type/component/period parameters&lt;br /&gt;
 -imt,--im-type &amp;lt;arg&amp;gt;                 Intensity measure type. Options: SA,&lt;br /&gt;
                                      Default: SA&lt;br /&gt;
 -n,--no-add                          Flag to not automatically calculate&lt;br /&gt;
                                      curves not in the database&lt;br /&gt;
 -novm,--no-vm-colors                 Disables Velocity Model coloring&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;                Output directory&lt;br /&gt;
 -p,--period &amp;lt;arg&amp;gt;                    Period(s) to calculate. Multiple&lt;br /&gt;
                                      periods should be comma separated&lt;br /&gt;
                                      (default: 3)&lt;br /&gt;
 -pf,--password-file &amp;lt;arg&amp;gt;            Path to a file that contains the&lt;br /&gt;
                                      username and password for inserting&lt;br /&gt;
                                      curves into the database. Format&lt;br /&gt;
                                      should be &amp;quot;user:pass&amp;quot;&lt;br /&gt;
 -pl,--plot-chars-file &amp;lt;arg&amp;gt;          Specify the path to a plot&lt;br /&gt;
                                      characteristics XML file&lt;br /&gt;
 -R,--run-id &amp;lt;arg&amp;gt;                    Run ID&lt;br /&gt;
 -r,--rv-id &amp;lt;arg&amp;gt;                     Rupture Variation ID&lt;br /&gt;
 -s,--site &amp;lt;arg&amp;gt;                      Site short name&lt;br /&gt;
 -sgt,--sgt-var-id &amp;lt;arg&amp;gt;              STG Variation ID&lt;br /&gt;
 -sgtsym,--sgt-colors                 Enables SGT specific symbols&lt;br /&gt;
 -t,--type &amp;lt;arg&amp;gt;                      Plot save type. Options are png,&lt;br /&gt;
                                      pdf, jpg, and txt. Multiple types&lt;br /&gt;
                                      can be comma separated (default is&lt;br /&gt;
                                      pdf)&lt;br /&gt;
 -v,--vs30 &amp;lt;arg&amp;gt;                      Specify default Vs30 for sites with&lt;br /&gt;
                                      no Vs30 data, or leave blank for&lt;br /&gt;
                                      default value. Otherwise, you will&lt;br /&gt;
                                      be prompted to enter vs30&lt;br /&gt;
                                      interactively if needed.&lt;br /&gt;
 -vel,--vel-model-id &amp;lt;arg&amp;gt;            Velocity Model ID&lt;br /&gt;
 -w,--width &amp;lt;arg&amp;gt;                     Plot width (default = 600)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; about 30 seconds per curve, though runtime is highly dependent on database load and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; ERF config file, GMPE config files&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Hazard Curve | Hazard Curve]]&lt;br /&gt;
&lt;br /&gt;
=== Disaggregate ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Disaggregate the curve results to determine the largest contributing sources.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, a probability or IM level, and a period to disaggregate at.  It produces disaggregation distance-magnitude plots and also a list of the % contribution of each source.&lt;br /&gt;
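&lt;br /&gt;
The per-source contribution list amounts to attributing each source's share of the total exceedance rate at the chosen level.  A toy Python sketch of that step (illustrative only, not the DisaggregationPlotter code):&lt;br /&gt;

```python
def source_contributions(source_rates):
    """Given {source_id: exceedance rate at the IML}, return % contributions."""
    total = sum(source_rates.values())
    return {src: 100.0 * rate / total for src, rate in source_rates.items()}

# Hypothetical rates for two sources: source 88 dominates the hazard.
contribs = source_contributions({88: 4.0e-5, 121: 1.0e-5})
```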
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#We want to support different kinds of disaggregation, or for a different kind of ERF.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; The Disaggregator is part of the OpenSHA codebase.  The specific Java class is org.opensha.sha.cybershake.plot.DisaggregationPlotter (available via https://github.com/opensha/opensha-cybershake/tree/master/src/main/java/org/opensha/sha/cybershake/plot), but it has a complex set of Java dependencies.  To compile and run, you should follow the instructions on http://www.opensha.org/trac/wiki/SettingUpEclipse to access the source.  The disaggregator is also wrapped by disagg_plot_wrapper.sh, in https://github.com/SCECcode/cybershake-tools/blob/master/HazardCurveGeneration/disagg_plot_wrapper.sh .&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner, Nitin Gupta, Vipin Gupta&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Standard OpenSHA dependencies&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  disagg_plot_wrapper.sh&lt;br /&gt;
    DisaggregationPlotter.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Use the standard OpenSHA building process if building from source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;usage: DisaggregationPlotter [-?] [-af &amp;lt;arg&amp;gt;] [-cmp &amp;lt;arg&amp;gt;] [-e &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-fvs &amp;lt;arg&amp;gt;] [-i &amp;lt;arg&amp;gt;] [-imid &amp;lt;arg&amp;gt;] [-imt &amp;lt;arg&amp;gt;] [-o &amp;lt;arg&amp;gt;] [-p&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-pr &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;] [-R &amp;lt;arg&amp;gt;] [-s &amp;lt;arg&amp;gt;] [-sgt &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-t &amp;lt;arg&amp;gt;] [-vel &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                    Display this message&lt;br /&gt;
 -af,--atten-rel-file &amp;lt;arg&amp;gt;   XML Attenuation Relationship description&lt;br /&gt;
                              file(s) for comparison. Multiple files&lt;br /&gt;
                              should be comma separated&lt;br /&gt;
 -cmp,--component &amp;lt;arg&amp;gt;       Intensity measure component. Options:&lt;br /&gt;
                              GEOM,X,Y,RotD100,RotD50, Default: GEOM&lt;br /&gt;
 -e,--erf-id &amp;lt;arg&amp;gt;            ERF ID&lt;br /&gt;
 -fvs,--force-vs30 &amp;lt;arg&amp;gt;      Option to force the given Vs30 value to be&lt;br /&gt;
                              used in GMPE calculations.&lt;br /&gt;
 -i,--imls &amp;lt;arg&amp;gt;              Intensity Measure Levels (IMLs) to&lt;br /&gt;
                              disaggregate at. Multiple IMLs should be&lt;br /&gt;
                              comma separated.&lt;br /&gt;
 -imid,--im-type-id &amp;lt;arg&amp;gt;     Intensity measure type ID. If not supplied,&lt;br /&gt;
                              will be detected from im&lt;br /&gt;
                              type/component/period parameters&lt;br /&gt;
 -imt,--im-type &amp;lt;arg&amp;gt;         Intensity measure type. Options: SA,&lt;br /&gt;
                              Default: SA&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;        Output directory&lt;br /&gt;
 -p,--period &amp;lt;arg&amp;gt;            Period(s) to calculate. Multiple periods&lt;br /&gt;
                              should be comma separated (default: 3)&lt;br /&gt;
 -pr,--probs &amp;lt;arg&amp;gt;            Probabilities (1 year) to disaggregate at.&lt;br /&gt;
                              Multiple probabilities should be comma&lt;br /&gt;
                              separated.&lt;br /&gt;
 -r,--rv-id &amp;lt;arg&amp;gt;             Rupture Variation ID&lt;br /&gt;
 -R,--run-id &amp;lt;arg&amp;gt;            Run ID&lt;br /&gt;
 -s,--site &amp;lt;arg&amp;gt;              Site short name&lt;br /&gt;
 -sgt,--sgt-var-id &amp;lt;arg&amp;gt;      STG Variation ID&lt;br /&gt;
 -t,--type &amp;lt;arg&amp;gt;              Plot save type. Options are png, pdf, and&lt;br /&gt;
                              txt. Multiple types can be comma separated&lt;br /&gt;
                              (default is pdf)&lt;br /&gt;
 -vel,--vel-model-id &amp;lt;arg&amp;gt;    Velocity Model ID&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; typically takes about 30 seconds, though runtime is highly dependent on database load and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Disaggregation file | Disaggregation file]]&lt;br /&gt;
&lt;br /&gt;
== Stochastic codes ==&lt;br /&gt;
&lt;br /&gt;
With CyberShake, we also have the option to augment a completed run with stochastic seismograms.  The following codes are used to add stochastic high-frequency content to an already-completed low-frequency deterministic run.&lt;br /&gt;
&lt;br /&gt;
[[File:stochastic workflow overview.png|thumb|right|300px|Overview of the codes involved in the stochastic part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/stochastic_workflow_overview.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== Velocity Info ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To determine slowness-averaged VsX values for a CyberShake site, from UCVM.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Velocity Info takes a location, a velocity model, and grid spacing information and queries UCVM to generate three VsX values needed by the site response:&lt;br /&gt;
#Vs30, calculated as: 30 / sum( 1 / (Vs sampled from [0.5, 29.5] at 1 meter increments, for 30 values) )&lt;br /&gt;
#Vs5H, like Vs30 but calculated over the shallowest 5*gridspacing meters.  So if gridspacing=100m, Vs5H = 500 / sum( 1 / (Vs sampled from [0.5, 499.5] at 1 meter increments, for 500 values) )&lt;br /&gt;
#VsD5H, like Vs5H, but sampled at gridspacing increments instead of 1 meter, with the first and last samples weighted half as much.  So if gridspacing=100m, VsD5H = 5 / sum( w / (Vs sampled from [0, 500] at 100 meter increments, for 6 values, with w=0.5 for the first and last samples and w=1 otherwise) )&lt;br /&gt;
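&lt;br /&gt;
All three quantities are slowness (harmonic) averages of Vs over depth.  A Python sketch of the Vs30 and VsD5H calculations as described above (edge handling and exact sample depths in the actual retrieve_vs.c may differ):&lt;br /&gt;

```python
# Slowness-averaged Vs: harmonic average of equally weighted samples.
def slowness_avg_vs(vs_at_depth, depths):
    n = len(depths)
    return n / sum(1.0 / vs_at_depth(z) for z in depths)

# Vs30: 1 m increments at depths 0.5, 1.5, ..., 29.5 (30 samples).
def vs30(vs_at_depth):
    return slowness_avg_vs(vs_at_depth, [z + 0.5 for z in range(30)])

# VsD5H: samples every gridspacing from 0 to 5*gridspacing,
# with the first and last samples weighted half as much.
def vsd5h(vs_at_depth, gridspacing=100.0):
    depths = [i * gridspacing for i in range(6)]
    weights = [0.5, 1.0, 1.0, 1.0, 1.0, 0.5]   # sums to 5
    return sum(weights) / sum(w / vs_at_depth(z)
                              for w, z in zip(weights, depths))

# Sanity check: for a uniform half-space, every slowness average
# recovers the uniform Vs value.
vs30(lambda z: 500.0)    # ~500.0
```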
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We need to support more than one model - for instance, if the CyberShake site box (not simulation volume) spans multiple models.  The code to parse the model string and load models in initialize_ucvm() would need to be changed. &lt;br /&gt;
#We want to support new kinds of velocity values.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/src/retrieve_vs.c&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#UCVM | UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  retrieve_vs&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make retrieve_vs'&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./retrieve_vs &amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;model&amp;gt; &amp;lt;gridspacing&amp;gt; &amp;lt;out filename&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes about 15 seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Velocity_Info file|Velocity Info file]]&lt;br /&gt;
&lt;br /&gt;
=== Local VM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a &amp;quot;local&amp;quot; 1D velocity file, required for the high-frequency codes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Local VM takes in an input file containing a 1D velocity model.  It then calculates Qs from these values and writes all the velocity data to a new file.  For all Study 15.12 runs, we used the LA Basin 1D model from the BBP, v14.3.0.  It's registered in the RLS, and is located at /home/scec-02/cybershk/runs/genslip_nr_generic1d-gp01.vmod.&lt;br /&gt;
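&lt;br /&gt;
A common convention in Rob Graves' broadband codes is to set Qs proportional to Vs (for example, Qs = 50 * Vs with Vs in km/s); whether gen_local_vel.py uses exactly that coefficient, and this file layout, is an assumption here.  A minimal sketch:&lt;br /&gt;

```python
def qs_from_vs(vs_kms):
    # Assumed attenuation rule (illustrative only): Qs = 50 * Vs, Vs in km/s.
    # The actual coefficient used by gen_local_vel.py may differ.
    return 50.0 * vs_kms

def gen_local_vel(lines):
    """Append a Qs column to each (thickness, Vp, Vs, rho) layer line.

    The four-column input layout is an assumed simplification of the
    BBP 1D velocity file format.
    """
    out = []
    for line in lines:
        thick, vp, vs, rho = (float(t) for t in line.split())
        out.append(f"{thick} {vp} {vs} {rho} {qs_from_vs(vs):.1f}")
    return out

gen_local_vel(["0.1 1.7 0.45 2.0"])   # ['0.1 1.7 0.45 2.0 22.5']
```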
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the algorithm for calculating Qs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/gen_local_vel.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, modified from Rob Graves' code&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  gen_local_vel.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./gen_local_vel.py &amp;lt;1D velocity model&amp;gt; &amp;lt;output&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes less than a second.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#BBP velocity file|BBP 1D velocity file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Local VM file|Local VM file]]&lt;br /&gt;
&lt;br /&gt;
=== Create Dirs ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To create a directory for each source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The high-frequency codes produce many intermediate files.  To avoid overloading the filesystem, Create Dirs creates a separate directory for every source.&lt;br /&gt;
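&lt;br /&gt;
The wrapper itself is simple; a minimal Python equivalent (assuming, per the input description, one directory name per line in the list file):&lt;br /&gt;

```python
import os

def create_dirs(list_file):
    """Create one directory per line of list_file (e.g., one per source ID)."""
    with open(list_file) as f:
        for line in f:
            name = line.strip()
            if name:
                # exist_ok makes reruns after a partial failure harmless
                os.makedirs(name, exist_ok=True)
```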
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; This code is basically just a wrapper around mkdir, and is unlikely to need changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/create_dirs.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  create_dirs.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt; &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Usage: ./create_dirs.py &amp;lt;file with list of dirs&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial, takes just a few seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; File with a directory to create (a source ID) on each line.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; None.&lt;br /&gt;
&lt;br /&gt;
=== HF Synth ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; HF Synth generates a high-frequency stochastic seismogram for one or more rupture variations.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code wraps multiple broadband platform codes to reduce the number of invocations required.  Specifically, it calls:&lt;br /&gt;
#srf2stoch_lite(), a reduced-memory version of srf2stoch.  We have modified it to call rupgen_genslip() to generate the SRF, rather than reading it in from disk.&lt;br /&gt;
#hfsim(), a wrapper for:&lt;br /&gt;
##hb_high(), Rob Graves's original BBP code to produce the seismograms&lt;br /&gt;
##wcc_getpeak(), which calculates PGA for the seismogram&lt;br /&gt;
##wcc_siteamp14(), which performs site amplification.&lt;br /&gt;
&lt;br /&gt;
Vs30 is required, so if it is not passed as a command-line argument, UCVM is called to determine it.&lt;br /&gt;
&lt;br /&gt;
Additionally, hf_synth_lite can process multiple rupture variations in a single invocation, further reducing the number of jobs.&lt;br /&gt;
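&lt;br /&gt;
The rup_vars argument shown in the sample invocation under Usage packs several variations into one string.  A hypothetical parser for that '(a,b,c);(d,e,f)' syntax (the meaning of the three fields is not specified here, so they are treated simply as integers):&lt;br /&gt;

```python
def parse_rup_vars(s):
    """Parse a '(0,0,0);(1,1,0)' style rup_vars string into integer tuples."""
    return [tuple(int(x) for x in grp.strip("()").split(","))
            for grp in s.split(";")]

parse_rup_vars("(0,0,0);(1,1,0);(2,2,0)")
# [(0, 0, 0), (1, 1, 0), (2, 2, 0)]
```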
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#A new version of one of Rob's codes - the high-frequency generator or the site amplification - is needed.  We have tried to use whatever the most recent version is on the BBP, for consistency.&lt;br /&gt;
#New velocity parameters are needed for the site amplification.&lt;br /&gt;
#The format of the rupture geometry files changes.&lt;br /&gt;
&lt;br /&gt;
The makefile needs to be changed if the path to libmemcached, UCVM, Getpar, or the rupture generator changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; wrapper by Scott Callaghan, hb_high(), wcc_getpeak(), and wcc_siteamp14() by Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]], [[CyberShake_Code_Base#UCVM | UCVM]], [[CyberShake_Code_Base#RupGen-api-v3.3.1 | rupture generator]], [[CyberShake_Code_Base#libmemcached | libmemcached]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  hf_synth_lite&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in src.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt; There is no 'help' usage string, but here's a sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/projects/sciteam/jmz/CyberShake/software/HFSim_mem/bin/hf_synth_lite&lt;br /&gt;
   stat=OSI slat=34.6145 slon=-118.7235&lt;br /&gt;
   rup_geom_file=e36_rv6_121_0.txt source_id=121 rupture_id=0&lt;br /&gt;
   num_rup_vars=5 rup_vars=(0,0,0);(1,1,0);(2,2,0);(3,3,0);(4,4,0)&lt;br /&gt;
   outfile=121/Seismogram_OSI_4331_121_0_hf_t0.grm&lt;br /&gt;
   dx=2.0 dy=2.0 tlen=300.0 dt=0.025&lt;br /&gt;
   do_site_response=1 vs30=359.1 debug=0 vmod=LA_Basin_BBP_14.3.0.local&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes from a few seconds up to a minute per rupture variation, depending on the size of the fault surface.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Local_VM_file | Local velocity file]], [[CyberShake_Rupture_Files#UCERF2_Rupture_Geometry_Files | rupture geometry file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; High-frequency seismograms, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== Combine HF Synth ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code combines the seismograms produced by HF Synth so that there is just one seismogram file per source/rupture combination.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Since we split up work so that each HF Synth job takes a chunk of rupture variations, we may end up with multiple seismogram files per rupture, each containing some of the rupture variations.  This script concatenates the files, using cat, into a single file, ready to be worked on later in the workflow.&lt;br /&gt;
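&lt;br /&gt;
Byte-for-byte concatenation is sufficient here, presumably because each chunk file already contains complete, self-contained records for its rupture variations.  A Python equivalent of the cat step (illustrative; the production script shells out to cat):&lt;br /&gt;

```python
import shutil

def combine_seis(inputs, output):
    """Byte-concatenate seismogram chunk files, like `cat in0 in1 > out`."""
    with open(output, "wb") as out:
        for path in inputs:
            with open(path, "rb") as f:
                shutil.copyfileobj(f, out)
```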
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; I can't think of a circumstance where we would need to change this.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/combine_seis.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  combine_seis.py&lt;br /&gt;
    cat&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Usage: ./combine_seis.py &amp;lt;seis 0&amp;gt; &amp;lt;seis 1&amp;gt; ... &amp;lt;seis N&amp;gt; &amp;lt;output seis name&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes a few seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; High-frequency seismograms, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; A single high-frequency seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== LF Site Response ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code performs site response modifications to the CyberShake low-frequency seismograms.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The LF Site Response code takes a low-frequency seismogram and some velocity parameters, and outputs a seismogram with site response applied.  In Study 15.12, this was a necessary step before combining the low- and high-frequency seismograms together.  Vs30 is required; if it is not passed as a command-line argument, UCVM is called to determine it.&lt;br /&gt;
&lt;br /&gt;
The reason we calculate site response for the low-frequency deterministic seismograms is that we want both the low- and high-frequency results to reflect the same site-response condition.  For the HF, we used Vs30 directly for the site-response adjustment, but for the LF we had to use an adjusted VsX value, since the grid spacing was 100 m (so Vs30 is not meaningful at that resolution).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#We change the site response algorithm.&lt;br /&gt;
#We decide to use different velocity parameters for setting site response.&lt;br /&gt;
#The format of the seismogram files changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/LF_Site_Response&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; wrapper by Scott Callaghan, site response by Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]], [[CyberShake_Code_Base#UCVM | UCVM]], [[CyberShake_Code_Base#RupGen-api-v3.3.1 | rupture generator]], [[CyberShake_Code_Base#libmemcached | libmemcached]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  lf_site_response&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Edit the makefile to point to RupGen, libmemcached, and Getpar, then run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
Sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./lf_site_response&lt;br /&gt;
seis_in=Seismogram_OSI_3923_263_3.grm seis_out=263/Seismogram_OSI_3923_263_3_site_response.grm&lt;br /&gt;
slat=34.6145 slon=-118.7235&lt;br /&gt;
module=cb2014&lt;br /&gt;
vs30=359.1 vref=344.7&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes less than a second.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; A low-frequency deterministic seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; A low-frequency deterministic seismogram with site response, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== Merge IM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code combines low-frequency deterministic and high-frequency stochastic seismograms, then processes them to obtain intensity measures.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The Merge IM code takes an LF and HF seismogram and performs the following processing:&lt;br /&gt;
#A high-pass filter is applied to the HF seismogram.&lt;br /&gt;
#The LF seismogram is resampled to the same dt as the HF seismogram.&lt;br /&gt;
#The two seismograms are combined into a single broadband (BB) seismogram.&lt;br /&gt;
#The PSA code is run on the resulting seismogram.&lt;br /&gt;
#If desired, the RotD and duration codes are also run on the seismogram.&lt;br /&gt;
&lt;br /&gt;
Merge IM works on a seismogram file at the rupture level, so it assumes that the input files contain multiple rupture variations.&lt;br /&gt;
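The filter/resample/combine steps can be illustrated with a simplified NumPy sketch.  The one-pole high-pass below is only a stand-in for the matched filters used by the production merge code, and the function name is hypothetical:&lt;br /&gt;

```python
import numpy as np

def combine_lf_hf(lf, lf_dt, hf, hf_dt, merge_freq=1.0):
    """Schematic broadband merge: resample the LF trace onto the HF time
    axis, high-pass the HF trace, and sum the two."""
    t_hf = np.arange(len(hf)) * hf_dt
    t_lf = np.arange(len(lf)) * lf_dt
    # Resample LF onto the (finer) HF time axis by linear interpolation.
    lf_resamp = np.interp(t_hf, t_lf, lf)
    # Crude one-pole high-pass at the merge frequency (a stand-in for the
    # matched filter pair used by the production code).
    rc = 1.0 / (2.0 * np.pi * merge_freq)
    alpha = rc / (rc + hf_dt)
    hp = np.empty_like(hf)
    hp[0] = hf[0]
    for i in range(1, len(hf)):
        hp[i] = alpha * (hp[i - 1] + hf[i] - hf[i - 1])
    return lf_resamp + hp
```

The resulting broadband trace uses the HF dt and nt, which is why the LF trace must be resampled first.&lt;br /&gt;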
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#We change the filter-and-combine algorithm.&lt;br /&gt;
#We decide to modify the post-processing and IM types we want to capture.&lt;br /&gt;
#The format of the seismogram files changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/MergeIM&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  merge_psa&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Edit the makefile to point to Getpar, then run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
Sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./merge_psa&lt;br /&gt;
lf_seis=182/Seismogram_OSI_3923_182_23_site_response.grm hf_seis=182/Seismogram_OSI_4331_182_23_hf.grm seis_out=182/Seismogram_OSI_4331_182_23_bb.grm&lt;br /&gt;
freq=1.0 comps=2 num_rup_vars=16&lt;br /&gt;
simulation_out_pointsX=2 simulation_out_pointsY=1&lt;br /&gt;
simulation_out_timesamples=12000 simulation_out_timeskip=0.025&lt;br /&gt;
surfseis_rspectra_seismogram_units=cmpersec surfseis_rspectra_output_units=cmpersec2&lt;br /&gt;
surfseis_rspectra_output_type=aa surfseis_rspectra_period=all&lt;br /&gt;
surfseis_rspectra_apply_filter_highHZ=20.0 surfseis_rspectra_apply_byteswap=no&lt;br /&gt;
out=182/PeakVals_OSI_4331_182_23_bb.bsa&lt;br /&gt;
run_rotd=1 rotd_out=182/RotD_OSI_4331_182_23_bb.rotd&lt;br /&gt;
run_duration=1 duration_out=182/Duration_OSI_4331_182_23_bb.dur&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes 5-30 seconds, depending on the number of rupture variations in the files.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; LF deterministic seismogram and HF stochastic seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; BB seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]]; also [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], and [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
== File types ==&lt;br /&gt;
&lt;br /&gt;
=== Modelbox ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Describes the simulation box at the surface.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.modelbox&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;site name&amp;gt;&lt;br /&gt;
APPROXIMATE CENTROID:&lt;br /&gt;
  clon= &amp;lt;centroid lon&amp;gt; clat =&amp;lt;centroid lat&amp;gt;&lt;br /&gt;
MODEL PARAMETERS:&lt;br /&gt;
  mlon= &amp;lt;model lon&amp;gt; mlat =&amp;lt;model lat&amp;gt; mrot=&amp;lt;model rot, default -55&amp;gt; xlen= &amp;lt;x-length in km&amp;gt; ylen= &amp;lt;y-length in km&amp;gt;&lt;br /&gt;
MODEL CORNERS:&lt;br /&gt;
  &amp;lt;lon 1&amp;gt; &amp;lt;lat 1&amp;gt; (x= 0.000 y= 0.000)&lt;br /&gt;
  &amp;lt;lon 2&amp;gt; &amp;lt;lat 2&amp;gt; (x= &amp;lt;max x&amp;gt; y= 0.000)&lt;br /&gt;
  &amp;lt;lon 3&amp;gt; &amp;lt;lat 3&amp;gt; (x= &amp;lt;max x&amp;gt; y= &amp;lt;max y&amp;gt;)&lt;br /&gt;
  &amp;lt;lon 4&amp;gt; &amp;lt;lat 4&amp;gt; (x= 0.000 y= &amp;lt;max y&amp;gt;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
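A small parser sketch for the MODEL PARAMETERS line, tolerating the mixed spacing around '=' seen in these files (parse_model_params is a hypothetical helper, not part of the CyberShake codebase):&lt;br /&gt;

```python
import re

def parse_model_params(line):
    """Parse a 'key= value' line of a modelbox file into a dict of floats.
    Handles spacing on either side of '=' (e.g. 'mlon= -118.5 mlat =34.2')."""
    pairs = re.findall(r"(\w+)\s*=\s*(-?[\d.]+)", line)
    return {key: float(val) for key, val in pairs}
```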
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== Gridfile ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the three dimensions, and gridspacing in each dimension, of the volume.&lt;br /&gt;
&lt;br /&gt;
Filename convention: gridfile_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
xlen=&amp;lt;x-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;x-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
ylen=&amp;lt;y-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;y-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
zlen=&amp;lt;z-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;z-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Gridout ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the km offsets for each grid index, in X, Y, and Z, from the upper southwest corner.&lt;br /&gt;
&lt;br /&gt;
Filename convention: gridout_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
xlen=&amp;lt;x-length in km&amp;gt;&lt;br /&gt;
nx=&amp;lt;number of gridpoints in X direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  2   &amp;lt;2*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  3   &amp;lt;3*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  nx-1 &amp;lt;(nx-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
ylen=&amp;lt;y-length in km&amp;gt;&lt;br /&gt;
ny=&amp;lt;number of gridpoints in Y direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  ny-1 &amp;lt;(ny-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
zlen=&amp;lt;z-length in km&amp;gt;&lt;br /&gt;
nz=&amp;lt;number of gridpoints in Z direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  nz-1 &amp;lt;(nz-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
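The per-dimension rows follow directly from the grid spacing; a sketch (gridout_rows is a hypothetical helper):&lt;br /&gt;

```python
def gridout_rows(n, spacing_km):
    """Rows for one gridout dimension: grid index, km offset from the
    corner, and the (uniform) grid spacing, matching the layout above."""
    return [(i, i * spacing_km, spacing_km) for i in range(n)]
```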
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: UCVM, smoothing, PreSGT, PreAWP&lt;br /&gt;
&lt;br /&gt;
=== Params ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Succinctly specify the parameters for the CyberShake volume.  Similar information to the modelbox file, but in a different format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_params_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Model origin coordinates:&lt;br /&gt;
 lon= &amp;lt;model lon&amp;gt; lat=   &amp;lt;model lat&amp;gt; rotate=  &amp;lt;model rotation, default -55&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model origin shift (cartesian vs. geographic):&lt;br /&gt;
 xshift(km)=   &amp;lt;x shift, usually half the x-length minus 1 grid spacing&amp;gt; yshift(km)=   &amp;lt;y-shift, usually half the y-length minus 1 grid spacing&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model corners:&lt;br /&gt;
 c1= &amp;lt;nw lon&amp;gt;   &amp;lt;nw lat&amp;gt;&lt;br /&gt;
 c2= &amp;lt;ne lon&amp;gt;   &amp;lt;ne lat&amp;gt;&lt;br /&gt;
 c3= &amp;lt;se lon&amp;gt;   &amp;lt;se lat&amp;gt;&lt;br /&gt;
 c4= &amp;lt;sw lon&amp;gt;   &amp;lt;sw lat&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model Dimensions:&lt;br /&gt;
 xlen=   &amp;lt;x-length&amp;gt; km&lt;br /&gt;
 ylen=   &amp;lt;y-length&amp;gt; km&lt;br /&gt;
 zlen=   &amp;lt;z-length&amp;gt; km&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: &lt;br /&gt;
&lt;br /&gt;
=== Coord ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the mapping of latitude and longitude to X and Y offsets, for each point on the surface.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_coords_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 2 0&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 1&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 1&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
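Since X varies fastest and Y slowest, the 0-based line number of gridpoint (x, y) in the coord file is y*nx + x; as a sketch:&lt;br /&gt;

```python
def coord_line_index(x, y, nx):
    """0-based line number of surface gridpoint (x, y) in a coord file,
    where X varies fastest and Y slowest."""
    return y * nx + x
```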
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: UCVM, smoothing, PreSGT&lt;br /&gt;
&lt;br /&gt;
=== Bounds ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the mapping of latitude and longitude to X and Y offsets, but only for the points along the boundary.  A subset of the coord file.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_bounds_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 2 0&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 1&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 1&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 2&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 2&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 &amp;lt;ny-1&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by:&lt;br /&gt;
&lt;br /&gt;
=== Velocity files ===&lt;br /&gt;
&lt;br /&gt;
==== RWG format ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Input velocity files for the RWG wave propagation code, emod3d.&lt;br /&gt;
&lt;br /&gt;
Filename convention: v_sgt-&amp;lt;site&amp;gt;.&amp;lt;p, s, or d&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: 3 files, one each for Vp (*.p), Vs (*.s), and rho (*.d).  Each is binary, with 4-byte floats, in fast X, Z (surface-&amp;gt;down), slow Y order.&lt;br /&gt;
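Under this layout the file can be read into an array indexed [y][z][x]; a sketch assuming native little-endian floats (read_rwg_velocity is a hypothetical helper):&lt;br /&gt;

```python
import numpy as np

def read_rwg_velocity(path, nx, ny, nz):
    """Read one RWG velocity component (*.p, *.s, or *.d) into an array
    indexed [y][z][x], matching the fast-X, Z, slow-Y file order."""
    data = np.fromfile(path, dtype=np.float32)
    return data.reshape(ny, nz, nx)
```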
&lt;br /&gt;
Generated by: UCVM&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP&lt;br /&gt;
&lt;br /&gt;
==== AWP format ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Input velocity file for the AWP-ODC wave propagation code.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp.&amp;lt;site&amp;gt;.media&lt;br /&gt;
&lt;br /&gt;
Format: Binary, with 4-byte floats, in fast Y, X, slow Z (surface down) order.&lt;br /&gt;
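Converting from the RWG layout to the AWP layout is then a pure axis reorder, assuming the axis labels refer to the same physical axes (rwg_to_awp is a hypothetical helper; only the storage order is shown, not any relabeling of components):&lt;br /&gt;

```python
import numpy as np

def rwg_to_awp(rwg, nx, ny, nz):
    """Reorder a fast-X, Z, slow-Y RWG array (shape [ny][nz][nx]) into
    AWP's fast-Y, X, slow-Z order (shape [nz][nx][ny])."""
    assert rwg.shape == (ny, nz, nx)
    return np.ascontiguousarray(rwg.transpose(1, 2, 0))
```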
&lt;br /&gt;
Generated by: UCVM&lt;br /&gt;
&lt;br /&gt;
Used by: Smoothing, PreAWP, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== Fdloc ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Coordinates of the site, in X Y grid indices, and therefore the coordinates where the SGT impulse should be placed.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.fdloc&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;X grid index of site&amp;gt; &amp;lt;Y grid index of site&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== Faultlist ===&lt;br /&gt;
&lt;br /&gt;
Purpose: List of paths to all the rupture geometry files for all ruptures which are within the cutoff for this site. Used to produce a list of points to save SGTs for.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.faultlist&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;path to rupture file&amp;gt; nheader=&amp;lt;number of header lines, usually 6&amp;gt; latfirst=&amp;lt;1, to signify that latitude comes first in the rupture files&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT&lt;br /&gt;
&lt;br /&gt;
=== Radiusfile ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Describe the adaptive mesh for which SGTs will be saved.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.radiusfile&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of gradations in X and Y&amp;gt;&lt;br /&gt;
&amp;lt;radius 1&amp;gt; &amp;lt;radius 2&amp;gt; &amp;lt;radius 3&amp;gt; &amp;lt;radius 4&amp;gt;&lt;br /&gt;
&amp;lt;decimation less than radius 1&amp;gt; &amp;lt;decimation between radius 1 and 2&amp;gt; &amp;lt;between 2 and 3&amp;gt; &amp;lt;between 3 and 4&amp;gt;&lt;br /&gt;
&amp;lt;number of gradations in Z&amp;gt;&lt;br /&gt;
&amp;lt;depth 1&amp;gt; &amp;lt;depth 2&amp;gt; &amp;lt;depth 3&amp;gt; &amp;lt;depth 4&amp;gt;&lt;br /&gt;
&amp;lt;decimation less than depth 1&amp;gt; &amp;lt;decimation between depth 1 and 2&amp;gt; &amp;lt;between 2 and 3&amp;gt; &amp;lt;between 3 and 4&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT&lt;br /&gt;
&lt;br /&gt;
=== SGT Coordinate files ===&lt;br /&gt;
&lt;br /&gt;
There are two formats for the list of points to save SGTs for, one for Rob's codes and one for AWP-ODC.  As with other coordinate transformations between the two systems, to convert X and Y offsets from RWG to AWP you have to flip the X and Y and add 1 to each, since RWG is 0-indexed and AWP is 1-indexed.&lt;br /&gt;
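As a sketch of that conversion:&lt;br /&gt;

```python
def rwg_to_awp_index(x_rwg, y_rwg):
    """Convert 0-indexed RWG (X, Y) offsets to 1-indexed AWP (X, Y):
    swap the two axes, then add 1 to each."""
    return (y_rwg + 1, x_rwg + 1)
```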
&lt;br /&gt;
==== SgtCoords ====&lt;br /&gt;
&lt;br /&gt;
Purpose: List of all the points to save SGTs for.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.cordfile&lt;br /&gt;
&lt;br /&gt;
Format: Z changes fastest, then Y, then X slowest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# geoproj= &amp;lt;projection; we usually use 1 for great circle&amp;gt;&lt;br /&gt;
# modellon= &amp;lt;model lon&amp;gt; modellat= &amp;lt;model lat&amp;gt; modelrot= &amp;lt;model rot, usually -55&amp;gt;&lt;br /&gt;
# xlen= &amp;lt;x-length&amp;gt; ylen= &amp;lt;y-length&amp;gt;&lt;br /&gt;
#&lt;br /&gt;
&amp;lt;total number of points&amp;gt;&lt;br /&gt;
&amp;lt;X index&amp;gt; &amp;lt;Y index&amp;gt; &amp;lt;Z index&amp;gt; &amp;lt;Single long to capture the index, in the form XXXXYYYYZZZZ&amp;gt; &amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;depth in km&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
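The single long index packs 6 decimal digits per component, matching the sgtindex convention described under the SGT header file (indx = x*10^12 + y*10^6 + z); a sketch:&lt;br /&gt;

```python
def pack_sgt_index(x, y, z):
    """Pack X/Y/Z grid indices into the single long used in the cordfile
    (6 decimal digits per component)."""
    return x * 10**12 + y * 10**6 + z

def unpack_sgt_index(indx):
    """Recover (x, y, z) from a packed index."""
    return indx // 10**12, (indx // 10**6) % 10**6, indx % 10**6
```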
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT, PreAWP, PostAWP&lt;br /&gt;
&lt;br /&gt;
==== AWP cordfile ====&lt;br /&gt;
&lt;br /&gt;
Purpose: List of SGT points to save in a format usable by AWP-ODC-SGT.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp.&amp;lt;site&amp;gt;.cordfile&lt;br /&gt;
&lt;br /&gt;
Format: Relative to the RWG cordfile, X and Y are swapped and each index has 1 added.  The points are sorted by Y, then X, then Z, so Y changes slowest and Z changes fastest.  This ordering is flipped from the RWG cordfile because the X and Y components are swapped.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of points&amp;gt;&lt;br /&gt;
&amp;lt;X coordinate&amp;gt; &amp;lt;Y coordinate&amp;gt; &amp;lt;Z coordinate&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU&lt;br /&gt;
&lt;br /&gt;
=== Impulse source descriptions ===&lt;br /&gt;
&lt;br /&gt;
We generate the initial source description for CyberShake, with the required dt, nt, and filtering, using gen_source, in https://github.com/SCECcode/cybershake-core/SimSgt_V3.0.3/src/ (run 'make get_source').  gen_source hard-codes its parameters, but you should only change 'nt', 'dt', and 'flo'.  We have been setting flo to twice the CyberShake maximum frequency, to reduce filtering effects at the frequencies of interest.  gen_source wraps Rob Graves's source generator, which we use for consistency.&lt;br /&gt;
&lt;br /&gt;
To generate a source for a component, run&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$&amp;gt;./gen_source xsrc=0 ysrc=0 zsrc=0 &amp;lt;fxsrc|fysrc|fzsrc&amp;gt;=1 moment=1e20&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once this RWG source is generated, we then use AWP-GPU-SGT/utils/data/format_source.py to reprocess the RWG source into an AWP-friendly format.  This involves reformatting the file and multiplying all values by 1e15 for unit conversion.  Different files must be produced for the X and Y impulse components, since the AWP format uses different columns for different components.&lt;br /&gt;
&lt;br /&gt;
Finally, AWP-GPU-SGT/utils/build_src.py takes the correct AWP-friendly source (nt and dt) for a run and adds the impulse location coordinates, producing a complete AWP format source description.&lt;br /&gt;
&lt;br /&gt;
==== RWG source ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Source description for the SGT impulse.&lt;br /&gt;
&lt;br /&gt;
Filename convention: source_cos0.10_&amp;lt;frequency&amp;gt;hz&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source cos&lt;br /&gt;
&amp;lt;nt&amp;gt; &amp;lt;dt&amp;gt; 0 0 0.0 0.0 0.0 0.0&lt;br /&gt;
&amp;lt;value at ts0&amp;gt; &amp;lt;value at ts1&amp;gt; &amp;lt;value at ts2&amp;gt; &amp;lt;value at ts3&amp;gt; &amp;lt;value at ts4&amp;gt; &amp;lt;value at ts5&amp;gt;&lt;br /&gt;
&amp;lt;value at ts6&amp;gt; &amp;lt;value at ts7&amp;gt; &amp;lt;value at ts8&amp;gt; &amp;lt;value at ts9&amp;gt; &amp;lt;value at ts10&amp;gt; &amp;lt;value at ts11&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: gen_source (see above)&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP&lt;br /&gt;
&lt;br /&gt;
==== AWP source ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Source description which can be used by AWP-ODC.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_src&lt;br /&gt;
&lt;br /&gt;
Format: Note that X and Y coordinates are swapped between RWG and AWP format, because of how the box is defined.  Additionally, RWG is 0-indexed, and AWP is 1-indexed, and the RWG values must be multiplied by 1e15 for unit conversion.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;X index of source, same as site X index&amp;gt; &amp;lt;Y index of source, same as site Y index&amp;gt;&lt;br /&gt;
&amp;lt;XX impulse at ts0&amp;gt; &amp;lt;YY at ts0&amp;gt; &amp;lt;ZZ at ts0&amp;gt; &amp;lt;XY at ts0&amp;gt; &amp;lt;XZ at ts0&amp;gt; &amp;lt;YZ at ts0&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU&lt;br /&gt;
&lt;br /&gt;
=== IN3D ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Input file for AWP-ODC.&lt;br /&gt;
&lt;br /&gt;
Filename convention: IN3D.&amp;lt;site&amp;gt;.&amp;lt;x or y&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: Specified [https://scec.usc.edu/it/AWP-ODC-SGT#IN3D here (login required)].&lt;br /&gt;
&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== AWP SGT ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT file, created by AWP-ODC-SGT.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp-strain-&amp;lt;site&amp;gt;-f&amp;lt;x or y&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: binary, 4-byte floats.  Points are in the same order as in the AWP SGT coordinate file, which is fast Z, X, Y.  For each point, the SGT components are stored in XX, YY, ZZ, XY, XZ, YZ order, with time fastest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YY component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 2nd z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (2nd x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 2nd y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, last y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
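Given this ordering, the byte offset of any component timeseries can be computed directly; a sketch (awp_sgt_offset is a hypothetical helper):&lt;br /&gt;

```python
def awp_sgt_offset(point_index, component, nt):
    """Byte offset of one component timeseries in an AWP SGT file.
    point_index is the 0-based position of the point in the AWP cordfile,
    component is 0-5 for XX, YY, ZZ, XY, XZ, YZ, and nt is the number of
    timesteps; values are 4-byte floats."""
    return (point_index * 6 + component) * nt * 4
```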
&lt;br /&gt;
Generated by: AWP-ODC-SGT CPU and GPU&lt;br /&gt;
&lt;br /&gt;
Used by: PostAWP, NanCheck &lt;br /&gt;
&lt;br /&gt;
=== RWG SGT ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT file, created by PostAWP for use in post-processing.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_&amp;lt;run id&amp;gt;.sgt&lt;br /&gt;
&lt;br /&gt;
Format: binary, 4-byte floats.  Points are in the same order as in the RWG coordinate file, which is fast Z, Y, X.  For each point, the SGT components are stored in XX, YY, ZZ, XY, XZ, YZ order, with time fastest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YY component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 2nd z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (2nd x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 2nd y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, last y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PostAWP&lt;br /&gt;
&lt;br /&gt;
Used by: CheckSgt, DirectSynth&lt;br /&gt;
&lt;br /&gt;
=== SGT header file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT header information, used to parse and interpret SGT files.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_&amp;lt;run id&amp;gt;.sgthead&lt;br /&gt;
&lt;br /&gt;
Format: binary.  It consists of three sections:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtmaster structure, described below in C.  Its information can be used to set up data structures to read the rest of the SGTs.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtmaster&lt;br /&gt;
   {&lt;br /&gt;
   int geoproj;     /* =0: RWG local flat earth; =1: RWG great circle arcs; =2: UTM */&lt;br /&gt;
   float modellon;  /* longitude of geographic origin */&lt;br /&gt;
   float modellat;  /* latitude of geographic origin */&lt;br /&gt;
   float modelrot;  /* rotation of y-axis from south (clockwise positive)   */&lt;br /&gt;
   float xshift;    /* xshift of cartesian origin from geographic origin */&lt;br /&gt;
   float yshift;    /* yshift of cartesian origin from geographic origin */&lt;br /&gt;
   int globnp;      /* total number of SGT locations (entire model) */&lt;br /&gt;
   int localnp;     /* local number of SGT locations (this file only) */&lt;br /&gt;
   int nt;          /* number of time points                                */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtindex structures, described below in C.  There is one of these for each point in the SGTs, and they're used to determine the X/Y/Z indices of all the SGT points.  Note that the current way of packing the X,Y,Z coordinates into the long allows for 6 digits (so maximum 1M grid points) for each component.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtindex   /* indices for all 'globnp' SGT locations */&lt;br /&gt;
   {&lt;br /&gt;
   long long indx; /* indx= xsgt*1000000000000 + ysgt*1000000 + zsgt */&lt;br /&gt;
   int xsgt;     /* x grid location */&lt;br /&gt;
   int ysgt;     /* y grid location */&lt;br /&gt;
   int zsgt;     /* z grid location */&lt;br /&gt;
   float h;         /* grid spacing                                         */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtheader structures, described below in C.  There is one of these for each point in the SGTs.  They're used when we perform reciprocity.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtheader&lt;br /&gt;
   {&lt;br /&gt;
   long long indx;  /* index of this SGT */&lt;br /&gt;
   int geoproj;     /* =0: RWG local flat earth; =1: RWG great circle arcs; =2: UTM */&lt;br /&gt;
   float modellon;  /* longitude of geographic origin */&lt;br /&gt;
   float modellat;  /* latitude of geographic origin */&lt;br /&gt;
   float modelrot;  /* rotation of y-axis from south (clockwise positive)   */&lt;br /&gt;
   float xshift;    /* xshift of cartesian origin from geographic origin */&lt;br /&gt;
   float yshift;    /* yshift of cartesian origin from geographic origin */&lt;br /&gt;
   int nt;          /* number of time points                                */&lt;br /&gt;
   float xazim;     /* azimuth of X-axis in FD model (clockwise from north) */&lt;br /&gt;
   float dt;        /* time sampling                                        */&lt;br /&gt;
   float tst;       /* start time of 1st point in GF                        */&lt;br /&gt;
   float h;         /* grid spacing                                         */&lt;br /&gt;
   float src_lat;   /* site latitude */&lt;br /&gt;
   float src_lon;   /* site longitude */&lt;br /&gt;
   float src_dep;   /* site depth */&lt;br /&gt;
   int xsrc;        /* x grid location for source (station in recip. exp.)  */&lt;br /&gt;
   int ysrc;        /* y grid location for source (station in recip. exp.)  */&lt;br /&gt;
   int zsrc;        /* z grid location for source (station in recip. exp.)  */&lt;br /&gt;
   float sgt_lat;   /* SGT location latitude */&lt;br /&gt;
   float sgt_lon;   /* SGT location longitude */&lt;br /&gt;
   float sgt_dep;   /* SGT location depth */&lt;br /&gt;
   int xsgt;        /* x grid location for output (source in recip. exp.)   */&lt;br /&gt;
   int ysgt;        /* y grid location for output (source in recip. exp.)   */&lt;br /&gt;
   int zsgt;        /* z grid location for output (source in recip. exp.)   */&lt;br /&gt;
   float cdist;     /* straight-line distance btw site and SGT location */&lt;br /&gt;
   float lam;       /* lambda [in dyne/(cm*cm)] at output point             */&lt;br /&gt;
   float mu;        /* rigidity [in dyne/(cm*cm)] at output point           */&lt;br /&gt;
   float rho;       /* density [in gm/(cm*cm*cm)] at output point           */&lt;br /&gt;
   float xmom;      /* moment strength of x-oriented force in this run      */&lt;br /&gt;
   float ymom;      /* moment strength of y-oriented force in this run      */&lt;br /&gt;
   float zmom;      /* moment strength of z-oriented force in this run      */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Overall, then, the format for the file is:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;sgtmaster&amp;gt;&lt;br /&gt;
&amp;lt;sgtindex for point 1&amp;gt;&lt;br /&gt;
&amp;lt;sgtindex for point 2&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;sgtindex for point globnp&amp;gt;&lt;br /&gt;
&amp;lt;sgtheader for point 1&amp;gt;&lt;br /&gt;
&amp;lt;sgtheader for point 2&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;sgtheader for point globnp&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
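&lt;br /&gt;
As an illustrative sketch (not part of the CyberShake codebase), the fixed-size sgtheader record above can be decoded in Python with the struct module.  This assumes native little-endian layout with no padding, which holds for this naturally-aligned struct on common x86-64 systems:&lt;br /&gt;

```python
import struct

# Field order mirrors the sgtheader C struct documented above:
# long long; int; 5 floats; int; 4 floats; 3 floats; 3 ints; 3 floats;
# 3 ints; 7 floats.  Little-endian, packed: 8 + 30*4 = 128 bytes.
SGTHEADER_FMT = "<q i 5f i 4f 3f 3i 3f 3i 7f"
SGTHEADER_SIZE = struct.calcsize(SGTHEADER_FMT)  # 128

FIELDS = ("indx geoproj modellon modellat modelrot xshift yshift nt "
          "xazim dt tst h src_lat src_lon src_dep xsrc ysrc zsrc "
          "sgt_lat sgt_lon sgt_dep xsgt ysgt zsgt cdist lam mu rho "
          "xmom ymom zmom").split()

def read_sgtheader(buf, offset=0):
    """Decode one 128-byte sgtheader record from a bytes buffer into a dict."""
    return dict(zip(FIELDS, struct.unpack_from(SGTHEADER_FMT, buf, offset)))
```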
&lt;br /&gt;
Generated by: PostAWP&lt;br /&gt;
&lt;br /&gt;
Used by: DirectSynth&lt;br /&gt;
&lt;br /&gt;
=== Velocity Info file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains the 3D velocity information needed for stochastic jobs&lt;br /&gt;
&lt;br /&gt;
Filename convention: velocity_info_&amp;lt;site&amp;gt;.txt&lt;br /&gt;
&lt;br /&gt;
Format: Text format, three lines:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Vs30 = &amp;lt;Vs30 value&amp;gt;&lt;br /&gt;
Vs500 = &amp;lt;Vs500 value&amp;gt;&lt;br /&gt;
VsD500 = &amp;lt;VsD500 value&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
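&lt;br /&gt;
As a minimal sketch (the function name is illustrative, not from the CyberShake codebase), this three-line file can be parsed in Python as follows:&lt;br /&gt;

```python
def read_velocity_info(lines):
    """Parse 'Key = value' lines (Vs30, Vs500, VsD500) into a dict of floats.

    `lines` may be any iterable of text lines, e.g. an open file object.
    """
    values = {}
    for line in lines:
        if "=" in line:
            key, _, val = line.partition("=")
            values[key.strip()] = float(val)
    return values
```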
&lt;br /&gt;
Generated by: Velocity Info job&lt;br /&gt;
&lt;br /&gt;
Used by: Sub Stoch DAX generator, to add these values as command-line arguments to HF Synth and LF Site Response jobs.&lt;br /&gt;
&lt;br /&gt;
=== rupture list file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a list of all the ruptures for which low-frequency seismograms should be synthesized.  This file is used to construct the tasks in DirectSynth.  The number of rows and columns and the magnitude are included because DirectSynth uses this information to estimate how much memory each task will use.  This file is constructed at abstract workflow creation time.&lt;br /&gt;
&lt;br /&gt;
Filename convention: rupture_file_list_&amp;lt;site&amp;gt;_&amp;lt;run_id&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: Text format, with the number of ruptures and then 1 line per rupture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of ruptures N&amp;gt;&lt;br /&gt;
&amp;lt;rupture geometry filename 1&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
&amp;lt;rupture geometry filename 2&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;rupture geometry filename N&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
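&lt;br /&gt;
A hedged Python sketch of a parser for this format (the names are illustrative, not from the CyberShake codebase):&lt;br /&gt;

```python
from collections import namedtuple

# One record per rupture, matching the column order documented above.
Rupture = namedtuple("Rupture",
                     "filename num_hypos num_slips rows cols magnitude")

def read_rupture_list(lines):
    """Parse a count line, then one whitespace-delimited line per rupture."""
    it = iter(lines)
    n = int(next(it))
    ruptures = []
    for _ in range(n):
        tok = next(it).split()
        ruptures.append(Rupture(tok[0], int(tok[1]), int(tok[2]),
                                int(tok[3]), int(tok[4]), float(tok[5])))
    return ruptures
```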
&lt;br /&gt;
Used by: DirectSynth&lt;br /&gt;
&lt;br /&gt;
=== rupture variation info file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  This file provides the rvfrac (rupture velocity as a fraction of the shear-wave velocity) and random seed for each rupture variation.  It is constructed at abstract workflow creation time, using values from the database.  It is highly recommended that these values be stored somewhere for reproducibility; the same values should be used for all sites so that the rupture variations are identical across sites.&lt;br /&gt;
&lt;br /&gt;
Filename convention: rvfrac_seed_values_&amp;lt;site&amp;gt;_&amp;lt;run_id&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:  Text format, with the number of rupture variations and then 1 line per rupture variation:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of rupture variations N&amp;gt;&lt;br /&gt;
&amp;lt;source ID 1&amp;gt; &amp;lt;rupture ID 1&amp;gt; &amp;lt;rupture variation ID 1&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
&amp;lt;source ID 2&amp;gt; &amp;lt;rupture ID 2&amp;gt; &amp;lt;rupture variation ID 2&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;source ID N&amp;gt; &amp;lt;rupture ID N&amp;gt; &amp;lt;rupture variation ID N&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
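&lt;br /&gt;
Illustratively, the file can be loaded into a lookup table keyed by (source ID, rupture ID, rupture variation ID); this sketch is not from the CyberShake codebase:&lt;br /&gt;

```python
def read_rvfrac_seeds(lines):
    """Index (source_id, rupture_id, rup_var_id) -> (rvfrac, seed).

    `lines` is any iterable of text lines in the format documented above.
    """
    it = iter(lines)
    n = int(next(it))
    table = {}
    for _ in range(n):
        src, rup, rv, rvfrac, seed = next(it).split()
        table[(int(src), int(rup), int(rv))] = (float(rvfrac), int(seed))
    return table
```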
&lt;br /&gt;
Used by: DirectSynth, when linked with RupGen-v5.5.2 or later&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== BBP 1D Velocity file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains 1D velocity profile information&lt;br /&gt;
&lt;br /&gt;
Filename convention: The only one currently in use in CyberShake is /home/scec-02/cybershk/runs/genslip_nr_generic1d-gp01.vmod .&lt;br /&gt;
&lt;br /&gt;
Format: Text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of thickness layers L&amp;gt;&lt;br /&gt;
&amp;lt;layer 1 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
&amp;lt;layer 2 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;layer L thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that the last layer has thickness 999.0.&lt;br /&gt;
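&lt;br /&gt;
A hedged sketch of reading this profile in Python, keeping only the four meaningful columns (the function name is illustrative):&lt;br /&gt;

```python
def read_1d_velocity_model(lines):
    """Parse layer lines into (thickness_km, vp, vs, density) tuples,
    dropping the two unused trailing columns.  By convention the final
    layer (thickness 999.0) represents the underlying half-space."""
    it = iter(lines)
    n = int(next(it))
    return [tuple(float(t) for t in next(it).split()[:4]) for _ in range(n)]
```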
&lt;br /&gt;
Generated by: Rob Graves&lt;br /&gt;
&lt;br /&gt;
Used by: Local VM job&lt;br /&gt;
&lt;br /&gt;
=== Local VM file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains 1D velocity profile information for use with stochastic codes&lt;br /&gt;
&lt;br /&gt;
Filename convention: The only one currently in use in CyberShake is LA_Basin_BBP_14.3.0.local .&lt;br /&gt;
&lt;br /&gt;
Format: Text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of thickness layers L&amp;gt;&lt;br /&gt;
&amp;lt;layer 1 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
&amp;lt;layer 2 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;layer L thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that the last layer has thickness '0.0', indicating it has no bottom.&lt;br /&gt;
&lt;br /&gt;
Generated by: Local VM Job&lt;br /&gt;
&lt;br /&gt;
Used by: &lt;br /&gt;
&lt;br /&gt;
=== Missing variations file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Lists the variations which the Check DB stage has found are missing.&lt;br /&gt;
&lt;br /&gt;
Filename convention: DB_Check_Out_&amp;lt;PSA or RotD or Duration&amp;gt;_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: For each source and rupture pair with missing variations, the following record is output in text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;source ID&amp;gt; &amp;lt;rupture ID&amp;gt; &amp;lt;number N of missing rupture variations&amp;gt;&lt;br /&gt;
&amp;lt;ID of first missing rupture variation&amp;gt;&lt;br /&gt;
&amp;lt;ID of second missing rupture variation&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;ID of Nth missing rupture variation&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
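&lt;br /&gt;
Since each record is a header line followed by a variable number of ID lines, a parser must consume the stream record by record; a hedged Python sketch (not from the CyberShake codebase):&lt;br /&gt;

```python
def read_missing_variations(lines):
    """Parse records of the form '<source> <rupture> <N>' followed by N
    lines of missing rupture variation IDs, into a dict keyed by
    (source_id, rupture_id)."""
    it = iter(lines)
    missing = {}
    for line in it:
        if not line.strip():
            continue
        src, rup, n = (int(t) for t in line.split())
        missing[(src, rup)] = [int(next(it)) for _ in range(n)]
    return missing
```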
&lt;br /&gt;
Originally, a file in this format could be fed directly back into the DAX generator, but that capability has not been used for many years and may no longer be functional.&lt;br /&gt;
&lt;br /&gt;
Generated by: Check DB Site&lt;br /&gt;
&lt;br /&gt;
Used by: none&lt;br /&gt;
&lt;br /&gt;
=== DB Report file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Provides PSA data for a run in a text format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_report_&amp;lt;date&amp;gt;.txt&lt;br /&gt;
&lt;br /&gt;
Format: It's a text file with the following header:&lt;br /&gt;
 Site_Name       ERF_ID  Source_ID       Rupture_ID      Rup_Var_ID      Rup_Var_Scenario_ID     Mag     Prob    Grid_Spacing    Num_Rows        Num_Columns     Period  Component       SA&lt;br /&gt;
The file is sorted with Rup_Var_ID varying fastest, then Rupture_ID, Source_ID, and Period, with Component varying slowest.&lt;br /&gt;
&lt;br /&gt;
Generated by: DB Report&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
=== Hazard Curve ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a hazard curve, either in text, PNG, or PDF format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_Run&amp;lt;run id&amp;gt;_&amp;lt;IM type&amp;gt;_&amp;lt;period&amp;gt;sec_&amp;lt;IM component&amp;gt;_&amp;lt;date run completed&amp;gt;.&amp;lt;pdf|txt|png&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:  The PNG and PDF formats contain an image of the curve.  The PDF format also has an extended legend.  The TXT file contains a list of (X,Y) points which describe the curve.&lt;br /&gt;
&lt;br /&gt;
Generated by: Curve Calc&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
=== Disaggregation file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains disaggregation results for a single run, in either text, PNG, or PDF format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_Run&amp;lt;run_id&amp;gt;_Disagg&amp;lt;POE|IM&amp;gt;_&amp;lt;disagg level&amp;gt;_&amp;lt;IM type&amp;gt;_&amp;lt;period&amp;gt;sec_&amp;lt;run date&amp;gt;.&amp;lt;txt|png|pdf&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: The PNG and PDF formats contain a plot of the disaggregation results, showing magnitude vs distance and color-coding based on epsilon.  The PDF and TXT formats contain additional information about individual source contributions, in the following format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;Summary data&lt;br /&gt;
Parameters used to create disaggregation&lt;br /&gt;
Disaggregation bin data:&lt;br /&gt;
Dist Mag &amp;lt;breakout by epsilon values&amp;gt;&lt;br /&gt;
&amp;lt;Breakdown of contribution by distance, magnitude, and epsilon range&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Disaggregation Source List Info:&lt;br /&gt;
Source# %Contribution TotExceedRate SourceName DistRup DistX DistSeis DistJB&lt;br /&gt;
&amp;lt;list of contributing sources, in decreasing order of % contribution&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: Disaggregation&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
== Dependencies ==&lt;br /&gt;
&lt;br /&gt;
The following are external software dependencies used by CyberShake software modules.&lt;br /&gt;
&lt;br /&gt;
=== Getpar ===&lt;br /&gt;
&lt;br /&gt;
Purpose: A C library which parses key-value command-line parameters and enforces required parameters.  Rob Graves uses it in his codes.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Rob supplied a copy; it is in the CyberShake repository at https://github.com/SCECcode/cybershake-core/tree/main/Getpar .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Run 'make' in Getpar/getpar/src; this will make the library, libget.a, and install it in the lib directory, where CyberShake codes will expect it.&lt;br /&gt;
&lt;br /&gt;
=== MySQLdb ===&lt;br /&gt;
&lt;br /&gt;
This library has been deprecated in favor of pymysql.&lt;br /&gt;
&lt;br /&gt;
=== pymysql ===&lt;br /&gt;
&lt;br /&gt;
Purpose: MySQL bindings for Python.&lt;br /&gt;
&lt;br /&gt;
How to obtain: pip3 install pymysql .  Documentation is at https://pypi.org/project/PyMySQL/ .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: None; pip3 shouldn't have any issues.&lt;br /&gt;
&lt;br /&gt;
=== UCVM ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Supplies the query tools needed to populate a mesh with velocity information.&lt;br /&gt;
&lt;br /&gt;
How to obtain:  The most recent version of UCVM can be found at [[UCVM#Current_UCVM_Software_Releases|Current UCVM Software Releases]].  As of October 2017, we have only integrated the C version of UCVM into CyberShake.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Following the standard installation instructions for a cluster should work (running ./ucvm_setup.py).  You will want to install the CVM-S4, CVM-S4.26, CVM-S4.26.M01, CVM-H, CenCal, CCA-06, and CCA 1D velocity models for CyberShake.&lt;br /&gt;
&lt;br /&gt;
=== libcfu ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Provides a hash table library for a variety of CyberShake codes.&lt;br /&gt;
&lt;br /&gt;
How to obtain: https://sourceforge.net/projects/libcfu/ .  Documentation is at http://libcfu.sourceforge.net/libcfu.html .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Follow the instructions, and install into the utils directory.&lt;br /&gt;
&lt;br /&gt;
=== FFTW ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Library which provides FFTs.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Typically installed on supercomputers already, though you may need to load a module to activate it.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Doesn't need to be installed in user space.&lt;br /&gt;
&lt;br /&gt;
=== memcached ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Server library for running a memory caching system, using key-value pairs.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Download from https://memcached.org/ .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Follow instructions and install in utils directory.  It has a dependency on libevent, which you may have to install also.&lt;br /&gt;
&lt;br /&gt;
=== libmemcached ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Client library and tools for memcached server.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Download from http://libmemcached.org/libMemcached.html .  Install memcached first.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Sometimes installing can be a challenge, because it can't find the memcached install.  You may have to pass the path to memcached as an argument to configure, or you may even need to edit the configure script and makefiles directly.  Install into the utils directory.&lt;br /&gt;
&lt;br /&gt;
=== RupGen-api ===&lt;br /&gt;
&lt;br /&gt;
Purpose: To generate rupture variations from a rupture geometry for a given hypocenter and slip using the Graves &amp;amp; Pitarka rupture generator.  The current version is 5.5.2.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Check out from https://github.com/SCECcode/cybershake-core/tree/main/RuptureCodes/RupGen-api-5.5.2 .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions:&lt;br /&gt;
&lt;br /&gt;
#This code is dependent on FFTW.  You may need to edit the makefile to point to the FFTW include files and libraries, since different clusters often use different environment variables to capture FFTW paths.&lt;br /&gt;
#If you want memcached support, edit the makefile in RuptureCodes/RupGen-api-5.5.2/src to uncomment lines 19-21 and edit line 19 to point to the libmemcached install directory.  You'll need to do the same to RuptureCodes/RupGen-api-5.5.2/src/GenRandV5.0/makefile, lines 35-37.&lt;br /&gt;
#You may need to edit CFLAGS in RuptureCodes/RupGen-api-5.5.2/src/GenRandV5.5.2/makefile (lines 21-23) to point to the FFTW path; whether or not this is needed depends on the particular system.&lt;br /&gt;
#Run 'make' in RupGen-api-5.5.2 to make the librupgen.a library.&lt;br /&gt;
&lt;br /&gt;
=== SCEC Broadband Platform ===&lt;br /&gt;
&lt;br /&gt;
Purpose: To generate high-frequency stochastic seismograms for broadband CyberShake runs.&lt;br /&gt;
&lt;br /&gt;
How to obtain:  Follow installation instructions at https://github.com/SCECcode/bbp .&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Code_Base&amp;diff=30588</id>
		<title>CyberShake Code Base</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Code_Base&amp;diff=30588"/>
		<updated>2025-11-26T19:44:04Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* File types */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page details all the pieces of code which make up the CyberShake code base, as of November 2017.  Note that this does not include the workflow middleware, or the workflow generators; that code is detailed at [[CyberShake Workflow Framework]].&lt;br /&gt;
&lt;br /&gt;
Conceptually, we can divide up the CyberShake codes into three categories:&lt;br /&gt;
&lt;br /&gt;
#Strain Green Tensor-related codes: These codes produce the input files needed to generate SGTs, actually calculate the SGTs, and do some reformatting and sanity checks on the results.&lt;br /&gt;
#Synthesis-related codes: These codes take the SGTs and perform seismogram synthesis and intensity measure calculations.&lt;br /&gt;
#Data product codes: These codes insert the results into the database, and use the database to generate a variety of output data products.&lt;br /&gt;
&lt;br /&gt;
Below is a description of each piece of software we use, organized by these categories.  For each piece of software, we include a description of where it is located, how to compile and use it, and what its inputs and outputs are.  At the end, we provide a description of input and output files and formats.&lt;br /&gt;
&lt;br /&gt;
== Code Installation ==&lt;br /&gt;
&lt;br /&gt;
Historically, we have selected a root directory for CyberShake, then created the subdirectories 'software' for all the code, 'ruptures' for the rupture files, 'logs' for log files, and 'utils' for workflow tools.  This is typically set up in unpurged storage space, so once installed purging isn't a worry.  Each code listed below, along with the configuration file, should be checked out into the 'software' subdirectory.&lt;br /&gt;
&lt;br /&gt;
In terms of compilers, you should use the GNU compilers unless specifically directed otherwise.&lt;br /&gt;
&lt;br /&gt;
Most of the codes below have a main directory containing a bin directory with binaries, a src directory with code requiring compilation, and wrapper scripts at the top level.&lt;br /&gt;
&lt;br /&gt;
If you are looking for compilation instructions, a general guide is available [[CyberShake compilation guide | here]].&lt;br /&gt;
&lt;br /&gt;
=== Configuration file ===&lt;br /&gt;
&lt;br /&gt;
Many CyberShake codes use a configuration file, which specifies the root directory for the CyberShake installation, the command used to start an MPI executable, paths to a tmp and scratch space (which can be the same), and the path to the CyberShake rupture directory.  We have done this instead of using environment variables because it's more transparent and easier for multiple users.  This file, along with the compiler file described below, should be stored in the 'software' subdirectory.&lt;br /&gt;
&lt;br /&gt;
The configuration file is available at:&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/cybershake.cfg&lt;br /&gt;
Obviously, this file must be edited to be correct for the install.&lt;br /&gt;
&lt;br /&gt;
The keys that CyberShake currently expects to find are:&lt;br /&gt;
*CS_PATH = /path/to/CyberShake/software/directory&lt;br /&gt;
*SCRATCH_PATH = /path/to/shared/scratch&lt;br /&gt;
*TMP_PATH = /path/to/tmp (can be node-local, or shared with scratch)&lt;br /&gt;
*RUPTURE_PATH = /path/to/CyberShake/rupture/directory&lt;br /&gt;
*MPI_CMD = ibrun or aprun or mpiexec&lt;br /&gt;
*LOG_PATH = /path/to/CyberShake/logs/directory&lt;br /&gt;
&lt;br /&gt;
To interact with cybershake.cfg, the CyberShake codes use a Python script to deliver cybershake.cfg entries as key-value pairs, located here:&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/config.py&lt;br /&gt;
Several CyberShake codes import config, then use it to read out the cybershake.cfg file.&lt;br /&gt;
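&lt;br /&gt;
config.py itself lives in the repository; as a hedged illustration of the pattern (not the actual implementation), delivering cybershake.cfg entries as key-value pairs might look like:&lt;br /&gt;

```python
def read_config(lines):
    """Read 'KEY = value' pairs from cybershake.cfg-style text,
    skipping blank lines and '#' comments."""
    cfg = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, val = line.partition("=")
        cfg[key.strip()] = val.strip()
    return cfg
```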
&lt;br /&gt;
=== Compiler file ===&lt;br /&gt;
&lt;br /&gt;
A long time ago, Gideon Juve created a compiler file, Compilers.mk, which contains information about which compilers should be used for which system.  This file should also be downloaded and installed in the software directory, from&lt;br /&gt;
 https://github.com/SCECcode/cybershake-core/Compilers.mk&lt;br /&gt;
&lt;br /&gt;
Some of the makefiles reference this file.  This can - and should - be updated to reflect new systems.&lt;br /&gt;
&lt;br /&gt;
== SGT-related codes ==&lt;br /&gt;
&lt;br /&gt;
[[File:SGT_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the SGT part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_SGT_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== PreCVM ===&lt;br /&gt;
&lt;br /&gt;
This code stands for &amp;quot;Pre-Community-Velocity-Model&amp;quot;.  It has to be run before the UCVM codes, since it generates input files required by UCVM.  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To determine the simulation volume for a particular CyberShake site.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreCVM queries the CyberShake database to determine all of the ruptures which fall within a given cutoff distance of a certain site.  From that information, padding is added around the edges to construct the CyberShake simulation volume for this site.  Depending on the input parameters, additional padding may also be applied so that the X and Y dimensions are multiples of 10, 20, or 40.  Using this volume, the X/Y offset of each grid point and its latitude and longitude (via a great circle projection) are determined and written to output files.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The CyberShake volume depth needs to be changed, so as to have the right number of grid points. That is set in the genGrid() function in GenGrid_py/gen_grid.py, in km.&lt;br /&gt;
#X and Y padding needs to be altered.  That is set using 'bound_pad' in Modelbox/get_modelbox.py, around line 70.&lt;br /&gt;
#The rotation of the simulation volume needs to be changed.  That is set using 'model_rot' in Modelbox/get_modelbox.py, around line 70.&lt;br /&gt;
#The database access parameters have changed.  That's in Modelbox/get_modelbox.py, around line 80.&lt;br /&gt;
#The divisibility needs for GPU simulations change (currently, we need the dimensions to be evenly divisible by the number of GPUs used in that dimension).  That is in Modelbox/get_modelbox.py, around line 250.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/PreCVM/&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#MySQLdb|MySQLdb for Python]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  pre_cvm.py&lt;br /&gt;
    Modelbox/get_modelbox.py&lt;br /&gt;
      Modelbox/bin/gcproj&lt;br /&gt;
    GenGrid_py/gen_grid.py&lt;br /&gt;
      GenGrid_py/bin/gen_model_cords&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the Modelbox/src and the GenGrid_py/src directories.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: pre_cvm.py [options]&lt;br /&gt;
  Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --erf_id=ERF_ID       ERF ID&lt;br /&gt;
  --modelbox=MODELBOX   Path to modelbox file (output)&lt;br /&gt;
  --gridfile=GRIDFILE   Path to gridfile (output)&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout (output)&lt;br /&gt;
  --coordfile=COORDSFILE&lt;br /&gt;
                        Path to coorfile (output)&lt;br /&gt;
  --paramsfile=PARAMSFILE&lt;br /&gt;
                        Path to paramsfile (output)&lt;br /&gt;
  --boundsfile=BOUNDSFILE&lt;br /&gt;
                        Path to boundsfile (output)&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency&lt;br /&gt;
  --gpu                 Use GPU box settings.&lt;br /&gt;
  --spacing=SPACING     Override default spacing with this value.&lt;br /&gt;
  --server=SERVER       Address of server to query in creating modelbox,&lt;br /&gt;
                        default is focal.usc.edu.&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; requires 6 minutes for 100m spacing, 10 billion point volume&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; None; inputs are retrieved from the database&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#Gridfile|gridfile]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Params|params]], [[CyberShake_Code_Base#Coord|coord]], [[CyberShake_Code_Base#Bounds|bounds]]&lt;br /&gt;
&lt;br /&gt;
=== UCVM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a populated velocity mesh for a CyberShake simulation volume.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; UCVM takes the volume defined by PreCVM and queries the [[UCVM]] software, using the C API, to populate the volume.  The resulting mesh is then checked for its Vp/Vs ratio, for minimum Vp, Vs, and rho values, and for Infs or NaNs.  The data is output in either Graves (RWG) format or AWP format.  This code also produces log files, which will be written to the CyberShake logs directory, under GenLog/&amp;lt;site&amp;gt;/v_mpi-&amp;lt;processor number&amp;gt;.log.  This can be useful if there's an error and you aren't sure why.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New velocity models are added.  Velocity models are specified in the DAX and passed through the wrapper scripts into the C code and then ultimately to UCVM, so an if statement must be added around line 250 (and around line 450 if applicable for the no-GTL case).&lt;br /&gt;
#The backend UCVM substantially changes.  If we move to the Python implementation, for example.&lt;br /&gt;
#If additional models are added, new libraries may need to be added to the makefile.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/UCVM&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#UCVM|UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  single_exe.py&lt;br /&gt;
    single_exe.csh&lt;br /&gt;
      bin/ucvm-single-mpi&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; The makefile needs to be edited so that &amp;quot;UCVM_HOME&amp;quot; points to the UCVM home directory.  Then run 'make' in the UCVM/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;All of site, gridout, modelcords, models, and format must be specified.&lt;br /&gt;
Usage: single_exe.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout (output)&lt;br /&gt;
  --coordfile=COORDSFILE&lt;br /&gt;
                        Path to coordfile (output)&lt;br /&gt;
  --models=MODELS       Comma-separated string on velocity models to use.&lt;br /&gt;
  --format=FORMAT       Specify awp or rwg format for output.&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency&lt;br /&gt;
  --spacing=SPACING     Override default spacing with this value (km)&lt;br /&gt;
  --min_vs=MIN_VS       Override minimum Vs value.  Minimum Vp and minimum&lt;br /&gt;
                        density will be 3.4 times this value.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on ~4000 cores; for 10 billion points and the C version of UCVM, takes about 20 minutes.  Typically only half the cores per node are used to get more memory per process.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#coords|coords]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; either [[CyberShake_Code_Base#RWG_format|RWG format]] or [[CyberShake_Code_Base#AWP_format|AWP format]], depending on the option selected.&lt;br /&gt;
&lt;br /&gt;
=== Smoothing ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To smooth a velocity file along model interfaces.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The smoothing code takes in a velocity mesh, determines the surface coordinates of the interfaces between velocity models, gets a list of all the points which need to be smoothed, and then performs the smoothing by averaging in both the X and Y direction for a user-specified number of points (default of 10km in each direction).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change our version of UCVM.  The LD_LIBRARY_PATH needs to be modified, in run_smoothing.py around line 98.&lt;br /&gt;
#The smoothing algorithm is modified.  Currently that is specified in the average_point() function in smooth_mpi.c.&lt;br /&gt;
#We start using velocity models whose boundaries aren't perpendicular to the Earth's surface.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/UCVM/smoothing&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#UCVM|UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  smoothing/run_smoothing.py&lt;br /&gt;
    bin/determine_surface_model&lt;br /&gt;
    smoothing/determine_smoothing_points.py&lt;br /&gt;
    smoothing/smooth_mpi&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the smoothing directory, and make sure that determine_surface_model has been compiled in the UCVM/src directory.  You may need to change the compiler; currently it uses 'cc'.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: run_smoothing.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --gridout=GRIDOUT     gridout file&lt;br /&gt;
  --coords=COORDS       coords file&lt;br /&gt;
  --models=MODELSTRING  comma-separated list of velocity models&lt;br /&gt;
  --smoothing-dist=SMOOTHING_DIST&lt;br /&gt;
                        Number of grid points to smooth over.  About 10km of&lt;br /&gt;
                        grid points is a good starting place.&lt;br /&gt;
  --mesh=MESH           AWP-format velocity mesh to smooth&lt;br /&gt;
  --mesh-out=MESH_OUT   Output smoothed mesh&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on ~1500 cores; for 5 billion points and the C version of UCVM, it takes about 16 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP_format|AWP format velocity file]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Coord|coord]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP_format|AWP format]] smoothed velocity file.&lt;br /&gt;
&lt;br /&gt;
=== PreSGT ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a series of input files which are used by the wave propagation codes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreSGT determines the X and Y coordinates of the site location (where the impulse will go for the wave propagation simulation) and determines which mesh point (X and Y) maps most closely to each point on a fault surface within the cutoff distance.  That information is combined with an adaptive mesh approach to create a list of all the points for which SGTs should be saved.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change our approach for saving adaptive mesh points.&lt;br /&gt;
#We change the location of the rupture geometry files, currently assumed to be &amp;lt;rupture root&amp;gt;/Ruptures_erf&amp;lt;erf ID&amp;gt;.  This is specified in presgt.py, line 167.&lt;br /&gt;
#The directory hierarchy and naming scheme for rupture geometry files, currently &amp;lt;src id&amp;gt;/&amp;lt;rup id&amp;gt;/&amp;lt;src id&amp;gt;_&amp;lt;rup_id&amp;gt;.txt, changes.  This is specified in faultlist_py/CreateFaultList.py, line 36.&lt;br /&gt;
#The number of header lines in the rupture geometry file changes.  This would require changing the nheader value, currently 6, specified in faultlist_py/CreateFaultList.py, line 36.&lt;br /&gt;
#We switch to RSQSim ruptures, or other ruptures in which the geometry isn't planar.  Modifications would be required to gen_sgtgrid.c.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/PreSgt&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, heavily modified by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]], [[CyberShake_Code_Base#libcfu|libcfu]], [[CyberShake_Code_Base#MySQLdb|MySQLdb for Python]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  presgt.py&lt;br /&gt;
    faultlist_py/CreateFaultList.py&lt;br /&gt;
    bin/gen_sgtgrid&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./presgt.py &amp;lt;site&amp;gt; &amp;lt;erf_id&amp;gt; &amp;lt;modelbox&amp;gt; &amp;lt;gridout&amp;gt; &amp;lt;model_coords&amp;gt; &amp;lt;fdloc&amp;gt; &amp;lt;faultlist&amp;gt; &amp;lt;radiusfile&amp;gt; &amp;lt;sgtcords&amp;gt; &amp;lt;spacing&amp;gt; [frequency]&lt;br /&gt;
Example: ./presgt.py USC 33 USC.modelbox gridout_USC model_coords_GC_USC USC.fdloc USC.faultlist USC.radiusfile USC.cordfile 200.0 0.1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel on 8 nodes, 32 cores (gen_sgtgrid is a parallel code); for 200m spacing UCERF2, takes about 8 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Coord|coord]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#Faultlist|faultlist]], [[CyberShake_Code_Base#Radiusfile|radiusfile]], [[CyberShake_Code_Base#SgtCoords|sgtcoords]].&lt;br /&gt;
&lt;br /&gt;
=== PreAWP ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate input files in a format that AWP-ODC expects.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PreAWP performs a number of steps:&lt;br /&gt;
#An IN3D parameter file is produced, needed for AWP-ODC.&lt;br /&gt;
#A file with the SGT coordinates to save in AWP format is produced.  Since RWG and AWP use different coordinate systems, a coordinate transformation (X-&amp;gt;Y, Y-&amp;gt;X, zero-indexing-&amp;gt;one-indexing) is performed on the SGT coordinates file.&lt;br /&gt;
#The velocity file is translated to AWP format, if it isn't in AWP format already.&lt;br /&gt;
#The correct source, based on the dt and nt, is selected.  The source must be generated manually ahead of time.  Details about source generation are given [[CyberShake Code Base#Impulse source descriptions | here]].&lt;br /&gt;
#Striping for the output file is also set up here.&lt;br /&gt;
#Files are symlinked into the directory structure that AWP expects.  Note that slightly different versions of this exist for the CPU and GPU implementations of AWP-ODC-SGT.&lt;br /&gt;
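Step 2 above (the RWG-to-AWP coordinate transformation) can be sketched as below. This is a hypothetical illustration, not the actual gen_awp_cordfile.py code; the function name is an assumption, and whether the Z index is also shifted to one-based is assumed here for symmetry.

```python
def rwg_to_awp(points):
    """Transform (x, y, z) RWG grid coordinates to AWP convention:
    swap X and Y, and shift from zero-based to one-based indexing."""
    return [(y + 1, x + 1, z + 1) for (x, y, z) in points]
```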
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The path to the Lustre striping command (lfs) changes.  This path is hard-coded in build_awp_inputs.py, line 14.  Note that this is the path to lfs on the compute node, NOT the login node.&lt;br /&gt;
#The AWP code changes its input format.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-HIP-SGT/utils/ (HIP GPU) or https://github.com/SCECcode/cybershake-core/AWP-GPU-SGT/utils/ (CUDA GPU) or https://github.com/SCECcode/cybershake-core/AWP-ODC-SGT/utils/ (CPU), AND also https://github.com/SCECcode/cybershake-core/SgtHead &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; SgtHead&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  build_awp_inputs.py&lt;br /&gt;
    build_IN3D.py&lt;br /&gt;
    build_src.py&lt;br /&gt;
    build_cordfile.py&lt;br /&gt;
      SgtHead/gen_awp_cordfile.py&lt;br /&gt;
    build_media.py&lt;br /&gt;
      SgtHead/bin/reformat_velocity&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in the SgtHead/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: build_awp_inputs.py [options]&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  --site=SITE           Site name&lt;br /&gt;
  --gridout=GRIDOUT     Path to gridout input file&lt;br /&gt;
  --fdloc=FDLOC         Path to fdloc input file&lt;br /&gt;
  --cordfile=CORDFILE   Path to cordfile input file&lt;br /&gt;
  --velocity-prefix=VEL_PREFIX&lt;br /&gt;
                        RWG velocity prefix.  If omitted, will not reformat&lt;br /&gt;
                        velocity file, just symlink.&lt;br /&gt;
  --frequency=FREQUENCY&lt;br /&gt;
                        Frequency of SGT run, 0.5 Hz by default.&lt;br /&gt;
  --px=PX               Number of processors in X-direction.&lt;br /&gt;
  --py=PY               Number of processors in Y-direction.&lt;br /&gt;
  --pz=PZ               Number of processors in Z-direction.&lt;br /&gt;
  --source-frequency=SOURCE_FREQ&lt;br /&gt;
                        Low-pass filter frequency to use on the source,&lt;br /&gt;
                        default is same frequency as the frequency of the run.&lt;br /&gt;
  --spacing=SPACING     Override default spacing, derived from frequency.&lt;br /&gt;
  --velocity-mesh=VEL_MESH&lt;br /&gt;
                        Provide path to velocity mesh.  If omitted, will&lt;br /&gt;
                        assume mesh is named awp.&amp;lt;site&amp;gt;.media.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 1 Hz run, takes about 11 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Gridout|gridout]], [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#SgtCoords|cordfile]], velocity mesh (if in [[CyberShake_Code_Base#RWG_format|RWG format]], will be converted to [[CyberShake_Code_Base#AWP_format|AWP]]), [[CyberShake_Code_Base#RWG_source|RWG source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP source|AWP source]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]].&lt;br /&gt;
&lt;br /&gt;
=== AWP-ODC-SGT, CPU version ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform SGT synthesis&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; AWP-ODC-SGT is the CPU version. It uses the IN3D file for its parameters.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New science or features are added to the AWP code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-ODC-SGT&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kim Olsen, Steve Day, Yifeng Cui, various students and post-docs, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; iobuf module&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  awp_odc_wrapper.sh&lt;br /&gt;
    bin/pmcl3d&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Using the GNU compilers, run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt; pmcl3d &amp;lt;IN3D parameter file&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel; for a 0.5 Hz run (2 billion points, 20k timesteps), takes about 45 minutes on 10,000 cores.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP source|AWP source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]].&lt;br /&gt;
&lt;br /&gt;
=== AWP-ODC-SGT, CUDA GPU version ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform SGT synthesis&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; AWP-ODC-SGT is the GPU version. It takes parameters on the command-line, so the wrapper converts the IN3D file into command-line arguments and invokes it.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#New science or features are added to the AWP code.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-GPU-SGT&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kim Olsen, Steve Day, Yifeng Cui, various students and post-docs, wrapped by Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; CUDA toolkit module&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  gpu_wrapper.py&lt;br /&gt;
    bin/pmcl3d&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; The PrgEnv-gnu and cudatoolkit modules must be loaded first.  Then, run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./pmcl3d &lt;br /&gt;
Options:&lt;br /&gt;
	[(-T | --TMAX) &amp;lt;TMAX&amp;gt;]&lt;br /&gt;
	[(-H | --DH) &amp;lt;DH&amp;gt;]&lt;br /&gt;
	[(-t | --DT) &amp;lt;DT&amp;gt;]&lt;br /&gt;
	[(-A | --ARBC) &amp;lt;ARBC&amp;gt;]&lt;br /&gt;
	[(-P | --PHT) &amp;lt;PHT&amp;gt;]&lt;br /&gt;
	[(-M | --NPC) &amp;lt;NPC&amp;gt;]&lt;br /&gt;
	[(-D | --ND) &amp;lt;ND&amp;gt;]&lt;br /&gt;
	[(-S | --NSRC) &amp;lt;NSRC&amp;gt;]&lt;br /&gt;
	[(-N | --NST) &amp;lt;NST&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-V | --NVE) &amp;lt;NVE&amp;gt;]&lt;br /&gt;
	[(-B | --MEDIASTART) &amp;lt;MEDIASTART&amp;gt;]&lt;br /&gt;
	[(-n | --NVAR) &amp;lt;NVAR&amp;gt;]&lt;br /&gt;
	[(-I | --IFAULT) &amp;lt;IFAULT&amp;gt;]&lt;br /&gt;
	[(-R | --READ_STEP) &amp;lt;x READ_STEP]&lt;br /&gt;
&lt;br /&gt;
	[(-X | --NX) &amp;lt;x length]&lt;br /&gt;
	[(-Y | --NY) &amp;lt;y length&amp;gt;]&lt;br /&gt;
	[(-Z | --NZ) &amp;lt;z length]&lt;br /&gt;
	[(-x | --NPX) &amp;lt;x processors]&lt;br /&gt;
	[(-y | --NPY) &amp;lt;y processors&amp;gt;]&lt;br /&gt;
	[(-z | --NPZ) &amp;lt;z processors&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-1 | --NBGX) &amp;lt;starting point to record in X&amp;gt;]&lt;br /&gt;
	[(-2 | --NEDX) &amp;lt;ending point to record in X&amp;gt;]&lt;br /&gt;
	[(-3 | --NSKPX) &amp;lt;skipping points to record in X&amp;gt;]&lt;br /&gt;
	[(-11 | --NBGY) &amp;lt;starting point to record in Y&amp;gt;]&lt;br /&gt;
	[(-12 | --NEDY) &amp;lt;ending point to record in Y&amp;gt;]&lt;br /&gt;
	[(-13 | --NSKPY) &amp;lt;skipping points to record in Y&amp;gt;]&lt;br /&gt;
	[(-21 | --NBGZ) &amp;lt;starting point to record in Z&amp;gt;]&lt;br /&gt;
	[(-22 | --NEDZ) &amp;lt;ending point to record in Z&amp;gt;]&lt;br /&gt;
	[(-23 | --NSKPZ) &amp;lt;skipping points to record in Z&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-i | --IDYNA) &amp;lt;i IDYNA&amp;gt;]&lt;br /&gt;
	[(-s | --SoCalQ) &amp;lt;s SoCalQ&amp;gt;]&lt;br /&gt;
	[(-l | --FL) &amp;lt;l FL&amp;gt;]&lt;br /&gt;
	[(-h | --FH) &amp;lt;i FH&amp;gt;]&lt;br /&gt;
	[(-p | --FP) &amp;lt;p FP&amp;gt;]&lt;br /&gt;
	[(-r | --NTISKP) &amp;lt;time skipping in writing&amp;gt;]&lt;br /&gt;
	[(-W | --WRITE_STEP) &amp;lt;time aggregation in writing&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
	[(-100 | --INSRC) &amp;lt;source file&amp;gt;]&lt;br /&gt;
	[(-101 | --INVEL) &amp;lt;mesh file&amp;gt;]&lt;br /&gt;
	[(-o | --OUT) &amp;lt;output file&amp;gt;]&lt;br /&gt;
	[(-c | --CHKFILE) &amp;lt;checkpoint file to write statistics&amp;gt;]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
	[(-G | --IGREEN) &amp;lt;IGREEN for SGT&amp;gt;]&lt;br /&gt;
	[(-200 | --NTISKP_SGT) &amp;lt;NTISKP for SGT&amp;gt;]&lt;br /&gt;
	[(-201 | --INSGT) &amp;lt;SGT input file&amp;gt;]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel; for a 1 Hz run (10 billion points, 40k timesteps), takes about 55 minutes on 800 GPUs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP cordfile|AWP cordfile]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]], [[CyberShake_Code_Base#AWP source|AWP source]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]].&lt;br /&gt;
&lt;br /&gt;
=== PostAWP ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To prepare the AWP results for use in post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; PostAWP prepares the outputs of AWP so that they can be used with the RWG-authored post-processing code.  Specifically, it undoes the AWP coordinate transformation and reformats the AWP output files into the SGT component order expected by RWG (XX-&amp;gt;YY, YY-&amp;gt;XX, XZ-&amp;gt;-YZ, YZ-&amp;gt;-XZ, and all SGTs are doubled if we are calculating the Z-component), creates separate SGT header files, and calculates MD5 sums on the SGT files.  Calculating the header information requires a number of input files, since lambda, mu, and the location of the impulse must all be included.  The MD5 sums can be calculated separately, using the MD5 wrapper RunMD5sum. &lt;br /&gt;
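The component reordering described above can be sketched as follows. This is a hypothetical Python illustration of the mapping (XX-&amp;gt;YY, YY-&amp;gt;XX, XZ-&amp;gt;-YZ, YZ-&amp;gt;-XZ); the real work is done in reformat_awp_mpi, and the doubling applied when calculating the Z-component is not shown. The function and dictionary keys are illustrative assumptions.

```python
def awp_to_rwg_components(sgt):
    """Map a dict of AWP SGT components to RWG component order,
    flipping signs on the XZ/YZ pair (illustrative sketch)."""
    return {
        "xx": sgt["yy"],   # AWP YY becomes RWG XX
        "yy": sgt["xx"],   # AWP XX becomes RWG YY
        "zz": sgt["zz"],
        "xy": sgt["xy"],
        "xz": -sgt["yz"],  # AWP YZ becomes RWG -XZ
        "yz": -sgt["xz"],  # AWP XZ becomes RWG -YZ
    }
```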
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#The AWP code is modified to produce outputs in exactly RWG order&lt;br /&gt;
#The header format for the post-processing code changes&lt;br /&gt;
#We decide not to calculate MD5 sums&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/AWP-HIP-SGT/utils/prepare_for_pp.py (this will work for the CPU version of AWP also, despite the path); https://github.com/SCECcode/cybershake-core/SgtHead&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  AWP-GPU-SGT/utils/prepare_for_pp.py&lt;br /&gt;
    SgtHead/bin/reformat_awp_mpi&lt;br /&gt;
    SgtHead/bin/write_head&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make write_head' and 'make reformat_awp_mpi' in the SgtHead/src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./prepare_for_pp.py &amp;lt;site&amp;gt; &amp;lt;AWP SGT&amp;gt; &amp;lt;reformatted SGT filename&amp;gt; &amp;lt;modelbox file&amp;gt; &amp;lt;rwg cordfile&amp;gt; &amp;lt;fdloc file&amp;gt; &amp;lt;gridout file&amp;gt; &amp;lt;IN3D file&amp;gt; &amp;lt;AWP media file&amp;gt; &amp;lt;component&amp;gt; &amp;lt;run_id&amp;gt; &amp;lt;header&amp;gt; [frequency]&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel, 4 processors on 2 nodes; for a 750 GB SGT, takes about 100 minutes &amp;lt;b&amp;gt;without&amp;lt;/b&amp;gt; the MD5 sums.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]], [[CyberShake_Code_Base#Modelbox|modelbox]], [[CyberShake_Code_Base#SgtCoords|RWG cordfile]], [[CyberShake_Code_Base#Fdloc|fdloc]], [[CyberShake_Code_Base#IN3D|IN3D]], [[CyberShake_Code_Base#AWP_format|AWP velocity mesh]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT file]], [[CyberShake_Code_Base#SGT header file|SGT header file]]&lt;br /&gt;
&lt;br /&gt;
=== RunMD5sum ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Wrapper for performing MD5sums.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; On Titan, we ran into wallclock issues when bundling the MD5sums along with PostAWP.  This wrapper supports performing the MD5 sums separately.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change hash algorithms&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/SgtHead/run_md5sum.sh&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  run_md5sum.sh&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./run_md5sum.sh &amp;lt;file&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 70 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; MD5sum, with filename &amp;lt;RWG SGT filename&amp;gt;.md5&lt;br /&gt;
&lt;br /&gt;
=== NanCheck ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Check the SGTs for anomalies before the post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code checks to be sure the SGTs are the expected size, then checks for NaNs or too many consecutive zeros in the SGT files.&lt;br /&gt;
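The NaN and zero-run scan can be sketched as below. This is a hypothetical illustration, not the actual check_for_nans code; the function name, return format, and default run-length threshold are assumptions.

```python
import math

def check_values(values, max_zero_run=10):
    """Scan a sequence of SGT values; return a list of (problem, index)
    tuples for NaNs and for runs of consecutive zeros longer than
    max_zero_run (illustrative sketch)."""
    problems = []
    zero_run = 0
    for i, v in enumerate(values):
        if math.isnan(v):
            problems.append(("nan", i))
        if v == 0.0:
            zero_run += 1
            if zero_run > max_zero_run:
                problems.append(("zero_run", i))
        else:
            zero_run = 0
    return problems
```

In practice the file-size check described above would run first, so a truncated SGT file is caught before scanning its contents.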
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the number of timesteps in the SGT file.  Currently this is hardcoded, but it should be a command-line parameter.&lt;br /&gt;
#We want to add additional checks.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/SgtTest/&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Rob Graves, Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar|Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  perform_checks.py&lt;br /&gt;
    bin/check_for_nans&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in SgtTest/src .&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./perform_checks.py &amp;lt;SGT file&amp;gt; &amp;lt;SGT header file&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 45 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#AWP SGT|AWP SGT file]], [[CyberShake_Code_Base#Sgt Coords|RWG coordinate file]], [[CyberShake_Code_Base#IN3D | IN3D file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
== PP-related codes ==&lt;br /&gt;
&lt;br /&gt;
The following codes are related to the post-processing part of the workflow.&lt;br /&gt;
&lt;br /&gt;
[[File:PP_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the PP part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_PP_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== CheckSgt ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To check the MD5 sums of the SGT files to be sure they match.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; CheckSgt takes the SGT files and their corresponding MD5 sums and checks for agreement.&lt;br /&gt;
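The verification can be sketched as below. This is a hypothetical illustration of what CheckSgt.py does, not its actual code: recompute the MD5 of the SGT file in chunks (the files run to hundreds of GB, so they cannot be read into memory whole) and compare against the stored digest.

```python
import hashlib

def md5_matches(data_path, expected_hex, chunk_size=1 << 20):
    """Recompute the MD5 of data_path incrementally and compare it to the
    expected hex digest (illustrative sketch)."""
    h = hashlib.md5()
    with open(data_path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex
```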
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change hashing algorithms.&lt;br /&gt;
#We decide to add additional sanity checks to the beginning of the post-processing.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/CheckSgt&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  CheckSgt.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./CheckSgt.py &amp;lt;sgt file&amp;gt; &amp;lt;md5 file&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for a 750 GB SGT, takes about 90 minutes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT]], SGT MD5 sums&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
=== DirectSynth ===&lt;br /&gt;
&lt;br /&gt;
DirectSynth is the code we currently use to perform the post-processing.  For historical reasons, all of the codes used for CyberShake post-processing are documented here: [https://scec.usc.edu/it/Post-processing_options CyberShake post-processing options] (login required).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To perform reciprocity calculations and produce seismograms, intensity measures, and duration measures.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; DirectSynth reads in the SGTs across a group of processes, and hands out tasks (synthesis jobs) to worker processes.  These worker processes read in rupture geometry information from disk and call the RupGen-api to generate full slip histories in memory.  The workers request SGTs from the reader processes over MPI. X and Y component PSA calculations are performed from the resultant seismograms, and RotD and duration calculations are also performed, if requested.  More details about the approach used are available at [[DirectSynth]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We have new intensity measures or other calculations per seismogram to perform.&lt;br /&gt;
#We decide to change the post-processing algorithm.&lt;br /&gt;
#We want to set different custom environment variables; this requires modifying the wrapper.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/DirectSynth&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, original seismogram synthesis code by Rob Graves, X and Y component PSA code by David Okaya, RotD code by Christine Goulet&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake Code Base#Getpar | Getpar]], [[CyberShake Code Base#libcfu | libcfu]], [[CyberShake Code Base#RupGen-api-v3.3.1 | RupGen-api-v3.3.1]], [[CyberShake Code Base#FFTW | FFTW]], [[CyberShake Code Base#libmemcached | libmemcached]] (optional) and [[CyberShake Code Base#memcached | memcached]] (optional)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  direct_synth_v3.3.1.py (current version, uses the Graves &amp;amp; Pitarka (2014) rupture generator)&lt;br /&gt;
    utils/pegasus_wrappers/invoke_memcached.sh&lt;br /&gt;
      memcached&lt;br /&gt;
    bin/direct_synth  &lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; &lt;br /&gt;
#Compile RupGen-api first.&lt;br /&gt;
#Edit the makefile in DirectSynth/src .  Check the following variables:&lt;br /&gt;
##BASE_DIR should point to the top-level CyberShake install directory&lt;br /&gt;
##LIBCFU should point to the libcfu install directory&lt;br /&gt;
##V3_3_1_RG_LIB should point to the RupGen-api-3.3.1/lib directory&lt;br /&gt;
##LDLIBS should have the correct paths to the libcfu and libmemcached lib directories&lt;br /&gt;
##V3_3_1_RG_INC should point to the RupGen-api-3.3.1/include directory&lt;br /&gt;
##IFLAGS should have the correct paths to the libcfu and libmemcached include directories&lt;br /&gt;
#Run 'make direct_synth_v3.3.1' in DirectSynth/src.&lt;br /&gt;
&lt;br /&gt;
You will also need to edit the hard-coded paths to memcached in direct_synth_v3.3.1.py, in lines 15 and 24.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
direct_synth_v3.3.1.py&lt;br /&gt;
 stat=&amp;lt;site short name&amp;gt;&lt;br /&gt;
 slat=&amp;lt;site lat&amp;gt;&lt;br /&gt;
 slon=&amp;lt;site lon&amp;gt;&lt;br /&gt;
 run_id=&amp;lt;run id&amp;gt;&lt;br /&gt;
 sgt_handlers=&amp;lt;number of SGT handler processes; must be enough for the SGTs to be read into memory&amp;gt;&lt;br /&gt;
 debug=&amp;lt;print logs for each process; 1 is yes, 0 no&amp;gt;&lt;br /&gt;
 max_buf_mb=&amp;lt;buffer size in MB for each worker to use for storing SGT information&amp;gt;&lt;br /&gt;
 rupture_spacing=&amp;lt;'uniform' or 'random' hypocenter spacing&amp;gt;&lt;br /&gt;
 ntout=&amp;lt;nt for seismograms&amp;gt;&lt;br /&gt;
 dtout=&amp;lt;dt for seismograms&amp;gt;&lt;br /&gt;
 rup_list_file=&amp;lt;input file containing ruptures to process&amp;gt;&lt;br /&gt;
 sgt_xfile=&amp;lt;input SGT X file&amp;gt;&lt;br /&gt;
 sgt_yfile=&amp;lt;input SGT Y file&amp;gt;&lt;br /&gt;
 x_header=&amp;lt;input SGT X header&amp;gt;&lt;br /&gt;
 y_header=&amp;lt;input SGT Y header&amp;gt;&lt;br /&gt;
 det_max_freq=&amp;lt;maximum frequency of deterministic part&amp;gt;&lt;br /&gt;
 stoch_max_freq=&amp;lt;maximum frequency of stochastic part&amp;gt;&lt;br /&gt;
 run_psa=&amp;lt;'1' to run X and Y component PSA, '0' to not&amp;gt;&lt;br /&gt;
 run_rotd=&amp;lt;'1' to run RotD calculations, '0' to not&amp;gt;&lt;br /&gt;
 run_durations=&amp;lt;'1' to run duration calculation, '0' to not&amp;gt;&lt;br /&gt;
 simulation_out_pointsX=&amp;lt;'2', the number of components&amp;gt;&lt;br /&gt;
 simulation_out_pointsY=1&lt;br /&gt;
 simulation_out_timesamples=&amp;lt;same as ntout&amp;gt;&lt;br /&gt;
 simulation_out_timeskip=&amp;lt;same as dtout&amp;gt;&lt;br /&gt;
 surfseis_rspectra_seismogram_units=cmpersec&lt;br /&gt;
 surfseis_rspectra_output_units=cmpersec2&lt;br /&gt;
 surfseis_rspectra_output_type=aa&lt;br /&gt;
 surfseis_rspectra_period=all&lt;br /&gt;
 surfseis_rspectra_apply_filter_highHZ=&amp;lt;high filter, 5.0 for 1 Hz runs, 20.0 or higher for 10 Hz runs&amp;gt;&lt;br /&gt;
 surfseis_rspectra_apply_byteswap=no&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Parallel, typically on 3840 processors; for 750 GB SGTs with ~7000 ruptures, takes about 12 hours.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#RWG SGT|RWG SGT]], [[CyberShake_Code_Base#SGT header file|SGT headers]], [[CyberShake_Code_Base#rupture list file|rupture list file]], rupture geometry files&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | Seismograms]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
== Data Product Codes ==&lt;br /&gt;
&lt;br /&gt;
The software in this section takes the data products produced by the SGT and post-processing stages, loads some of them into the database, and creates final data products.  Note that all these codes should be installed on a server close to the database, to reduce insertion and query time.  Currently these are all installed on SCEC disks and accessed from shock.usc.edu.&lt;br /&gt;
&lt;br /&gt;
[[File:Data_workflow_stages.png|thumb|right|300px|Overview of the codes involved in the data product of CyberShake, [http://hypocenter.usc.edu/research/cybershake/full_data_workflow.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== Load Amps ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Load data from output files into the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code loads PSA, RotD, or Duration data into the database, depending on command-line options.  It also performs sanity checks on the PSA data being inserted: values must be between 0.008 and 8400 cm/s2, although values below 0.008 may still be passed through for small-magnitude events at large distances.  If this constraint is violated, the code aborts.  Note that if LoadAmps needs to be rerun, sometimes the database must be cleaned out first: data from the previous attempt may have been inserted successfully, and will cause duplicate key errors if you try to insert the same data again.&lt;br /&gt;
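The range check described above can be sketched as follows.  The 0.008 and 8400 cm/s2 bounds come from the description, but the function name and the specific magnitude/distance thresholds for the small-event exception are illustrative assumptions, not the actual CyberCommands logic.

```python
# Hypothetical sketch of the LoadAmps PSA sanity check.  The bounds are
# from the documentation; the exception thresholds are assumptions.
PSA_MIN = 0.008   # cm/s^2
PSA_MAX = 8400.0  # cm/s^2

def check_psa_value(value, magnitude=None, distance_km=None):
    """Return True if a PSA value passes the insertion sanity check."""
    if value > PSA_MAX:
        return False
    if value < PSA_MIN:
        # Small-magnitude events at large distances may legitimately
        # produce very small values (thresholds here are illustrative).
        return (magnitude is not None and distance_km is not None
                and magnitude < 6.0 and distance_km > 100.0)
    return True
```

A value failing this check would cause the load to abort unless the `-f` flag (skip value checks) is supplied.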
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the sanity checks on data inserts.&lt;br /&gt;
#We modify the format of the PSA, RotD, or Duration files.&lt;br /&gt;
#We add new types of data to insert.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#We add a new server.  To add a new server, in addition to providing a command-line option for it, you will need to create a Hibernate config file.  You can start with moment.cfg.xml or focal.cfg.xml and edit lines 7-16 appropriately.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/CyberCommands&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Joshua Garcia, Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; LoadAmps calls CyberCommands, a Java code with a long list of dependencies (all of these are checked into the Java project):&lt;br /&gt;
&lt;br /&gt;
*Ant&lt;br /&gt;
*Apache Commons&lt;br /&gt;
*Hibernate&lt;br /&gt;
*MySQL bindings&lt;br /&gt;
*Xerces&lt;br /&gt;
*DOM4J&lt;br /&gt;
*Log4J&lt;br /&gt;
*Java 1.6+&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  insert_dir.sh&lt;br /&gt;
    CyberLoadAmps_SC&lt;br /&gt;
      cybercommands_SC.jar&lt;br /&gt;
        CyberLoadamps.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Check out CyberCommands into Eclipse.  Create the cybercommands_SC.jar file using Eclipse's JAR build framework and the cybercommands_SC.jardesc description file.  Install cybercommands_SC.jar and the required JAR files on the server.  Point insert_dir.sh to CyberLoadAmps_SC, and point CyberLoadAmps_SC to cybercommands_SC.jar.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: CyberLoadAmps [-r | -d | -z | -u][-c] [-d] [-periods periods] [-run RunID] [-p directory] [-server name] [-z] [-help] [-i insertion_values]&lt;br /&gt;
       [-u] [-f]&lt;br /&gt;
 -i &amp;lt;insertion_values&amp;gt;   Which values to insert -&lt;br /&gt;
                         gm:	geometric mean PSA data (default)&lt;br /&gt;
                         xy:	X and Y component PSA data&lt;br /&gt;
                         gmxy:  Geometric mean and X and Y components&lt;br /&gt;
 -run &amp;lt;RunID&amp;gt;            Run ID - this option is required&lt;br /&gt;
 -p &amp;lt;directory&amp;gt;          file path with spectral acceleration files,&lt;br /&gt;
                         either top-level directory or zip file - this option is required&lt;br /&gt;
 -server &amp;lt;name&amp;gt;          server name (focal, surface, intensity, moment,&lt;br /&gt;
                         or csep-x) - this option is required&lt;br /&gt;
 -periods &amp;lt;periods&amp;gt;      Comma-delimited periods to insert&lt;br /&gt;
 -c                      Convert values from g to cm/sec^2&lt;br /&gt;
 -d                      Assume one BSA file per rupture, with embedded&lt;br /&gt;
                         header information.&lt;br /&gt;
 -f                      Don't apply value checks to insertion values; use&lt;br /&gt;
                         with care!.&lt;br /&gt;
 -help                   print this message&lt;br /&gt;
 -r                      Read rotd files (instead of bsa.)&lt;br /&gt;
 -u                      Read duration files (instead of bsa.)&lt;br /&gt;
 -z                      Read zip files instead of bsa.&amp;lt;/pre&amp;gt;&lt;br /&gt;
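As a concrete illustration of the options above, a hypothetical invocation to insert geometric-mean PSA data, converting from g to cm/s^2, might look like the following (the run ID, directory path, and server name are placeholders):

```shell
# Hypothetical example -- run ID, path, and server name are placeholders.
CyberLoadAmps -run 3870 -p /path/to/PSA_files -server moment \
    -c -i gm -periods 2.0,3.0,5.0,10.0
```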
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; for 5 periods, takes about 10 minutes.  It's wildly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
=== Check DB Site ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Verify that data was correctly loaded into the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a list of components (or type IDs) to check for a run ID, and verifies that there is one entry for every rupture variation.  If some rupture variations are missing, a file is produced which lists the missing source, rupture, rupture variation tuples.&lt;br /&gt;
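The core of the check can be sketched as a set difference over (source, rupture, rupture variation) tuples.  This is an illustrative simplification; the actual Java code builds the two sets by querying MySQL for the run ID.

```python
# Illustrative sketch of the completeness check performed by
# CheckDBDataForSite (the real code queries the database).
def find_missing(expected, present):
    """Return the sorted (source, rupture, rup_var) tuples in `expected`
    that have no corresponding entry in `present`."""
    return sorted(set(expected) - set(present))
```

If the returned list is non-empty, those tuples would be written to the missing variations file.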
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/db/CheckDBDataForSite.java and DBConnect.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan (CheckDBDataForSite.java), Nitin Gupta, Vipin Gupta, Phil Maechling (DBConnect.java)&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Both are checked into the CyberShake project:&lt;br /&gt;
&lt;br /&gt;
*Apache Commons&lt;br /&gt;
*MySQL bindings&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  check_db.sh&lt;br /&gt;
    CheckDBDataForSite.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Check out CheckDBDataForSite.java and DBConnect.java.  Compile them by running 'javac -classpath mysql-connector-java-5.0.5-bin.jar:commons-cli-1.0.jar DBConnect.java CheckDBDataForSite.java'.  The paths to the MySQL bindings jar and the Apache Commons jar may be different depending on your installation.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;usage: CheckDBDataForSite&lt;br /&gt;
 -p &amp;lt;periods&amp;gt;     Comma-separated list of periods to check, for geometric&lt;br /&gt;
                  and rotd.&lt;br /&gt;
 -t &amp;lt;type_ids&amp;gt;    Comma-separated list of type IDs to check, for duration.&lt;br /&gt;
 -c &amp;lt;component&amp;gt;   Component type (geometric, rotd, duration) to check.&lt;br /&gt;
 -h,--help        Print help for CheckDBDataForSite&lt;br /&gt;
 -o &amp;lt;output&amp;gt;      Path to output file, if something is missing (required).&lt;br /&gt;
 -r &amp;lt;run_id&amp;gt;      Run ID to check (required).&lt;br /&gt;
 -s &amp;lt;server&amp;gt;      DB server to query against.&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; typically takes just a few seconds.  It's wildly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Missing variations file | Missing variations file]]&lt;br /&gt;
&lt;br /&gt;
=== DB Report ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Produce a database report, a data product which Rob Graves used for a time.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, queries the database for PSA values for all components, and writes the output to a text file.  The list of periods and the DB config parameters are specified in an XML config file.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed: the DB connection parameters in default.xml would need to be edited.&lt;br /&gt;
#We want results for different periods: edit default.xml.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/reports/db_report_gen.py . default.xml in the same directory is also needed, and can be generated by editing and running conf_gen.py, also in the same directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; MySQLdb&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  db_report_gen.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None, all code is Python.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: db_report_gen.py [options] SITE_SHORT_NAME&lt;br /&gt;
&lt;br /&gt;
NOTE: defaults are loaded from defaults.xml and can be edited manually&lt;br /&gt;
	  or overridden with conf_gen.py&lt;br /&gt;
&lt;br /&gt;
Options:&lt;br /&gt;
  -h, --help            show this help message and exit&lt;br /&gt;
  -e ERF_ID, --erfID=ERF_ID&lt;br /&gt;
                        ERF ID for Report (default = none)&lt;br /&gt;
  -f FILENAME, --file=FILENAME&lt;br /&gt;
                        Store Results to a file instead of STDOUT. If a&lt;br /&gt;
                        directory is given, a name will be auto generated.&lt;br /&gt;
  -i, --id              Flag for specifying site ID instead of Short Name&lt;br /&gt;
                        (default uses Short Name)&lt;br /&gt;
  --hypo, --hpyocenter  Flag for appending hypocenter locations to result&lt;br /&gt;
  -l LIMIT, --limit=LIMIT&lt;br /&gt;
                        Limit the total number of rusults, or 0 for no limit&lt;br /&gt;
                        (default = 0)&lt;br /&gt;
  -o, --sort            SLOW: Force SQL Order By statement for sorting. It&lt;br /&gt;
                        will probably come out sorted, but if it doesn't, you&lt;br /&gt;
                        can use this. (default will not sort)&lt;br /&gt;
  -p PERIODS, --periods=PERIODS&lt;br /&gt;
                        Comma separated period values (default = 3.0,5.0,10.0)&lt;br /&gt;
  --pr, --print-runs    Print run IDs for site and optionally ERF/Rup Var&lt;br /&gt;
                        Scen/SGT Var IDs&lt;br /&gt;
  -r RUP_VAR_SCENARIO_ID, --rupVarID=RUP_VAR_SCENARIO_ID&lt;br /&gt;
                        Rupture Variation Scenario ID for Report (default =&lt;br /&gt;
                        none)&lt;br /&gt;
  --ri=RUN_ID, --runID=RUN_ID&lt;br /&gt;
                        Allows you to specify a run ID to use (default uses&lt;br /&gt;
                        latest compatible run ID)&lt;br /&gt;
  -R RUPTURE, --rupture=RUPTURE&lt;br /&gt;
                        Only give information on specified rupture. Must be&lt;br /&gt;
                        acompanied by -S/--source flag (default shows all&lt;br /&gt;
                        ruptures)&lt;br /&gt;
  -s SGT_VAR_ID, --sgtVarID=SGT_VAR_ID&lt;br /&gt;
                        SGT Variation ID for Report (default = none)&lt;br /&gt;
  -S SOURCE, --source=SOURCE&lt;br /&gt;
                        Only give information on specified source. To specify&lt;br /&gt;
                        rupture, see -R option (default shows all sources)&lt;br /&gt;
  --s_im, --sort-ims    Sort output by IM value (increasing)...may be slow!&lt;br /&gt;
  -v, --verbose         Verbosity Flag (default = False)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; about 1 minute.  It's wildly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; default.xml&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#DB Report file | DB Report file]]&lt;br /&gt;
&lt;br /&gt;
=== Curve Calc ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Calculate CyberShake hazard curves alongside comparison GMPEs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, component, and period, queries the database for the appropriate IM values, and calculates a hazard curve in the desired format. Comparison GMPE curves can also be plotted.&lt;br /&gt;
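The underlying calculation can be sketched as follows: each rupture's annual rate is multiplied by the fraction of its rupture variations whose IM value exceeds a given level, and the summed exceedance rate is converted to an annual probability under a Poisson assumption.  This is a simplified illustration of the CyberShake-style hazard computation, not the OpenSHA implementation.

```python
# Simplified sketch of assembling a hazard curve from IM values
# (illustrative only; not the OpenSHA HazardCurvePlotter code).
import math

def hazard_curve(ruptures, im_levels):
    """ruptures: list of (annual_rate, [IM value per rupture variation]).
    Returns the annual exceedance probability at each IM level, treating
    rupture variations as equally likely and occurrences as Poissonian."""
    probs = []
    for level in im_levels:
        total_rate = 0.0
        for rate, ims in ruptures:
            frac = sum(1 for im in ims if im > level) / len(ims)
            total_rate += rate * frac
        probs.append(1.0 - math.exp(-total_rate))
    return probs
```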
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#New IM types need to be supported.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; The CyberShake curve calculator is part of the OpenSHA codebase.  The specific Java class is org.opensha.sha.cybershake.plot.HazardCurvePlotter (available via https://github.com/opensha/opensha-cybershake/tree/master/src/main/java/org/opensha/sha/cybershake/plot), but it has a complex set of Java dependencies.  To compile and run, you should follow the instructions on http://www.opensha.org/trac/wiki/SettingUpEclipse to access the source.  The curve calculator is also wrapped by curve_plot_wrapper.sh, in https://github.com/SCECcode/cybershake-tools/blob/master/HazardCurveGeneration/curve_plot_wrapper.sh .&lt;br /&gt;
&lt;br /&gt;
The OpenSHA project also has configuration files for various GMPEs, config files for UCERF2, and configuration files for output formats preferred by Tom and Rob, in src/org/opensha/sha/cybershake/conf.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Standard OpenSHA dependencies&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  curve_plot_wrapper.sh&lt;br /&gt;
    HazardCurvePlotter.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Use the OpenSHA build process if building from source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
usage: HazardCurvePlotter [-?] [-af &amp;lt;arg&amp;gt;] [-benchmark] [-c] [-cmp &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-comp &amp;lt;arg&amp;gt;] [-cvmvs] [-e &amp;lt;arg&amp;gt;] [-ef &amp;lt;arg&amp;gt;] [-f] [-fvs &amp;lt;arg&amp;gt;] [-h&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-imid &amp;lt;arg&amp;gt;] [-imt &amp;lt;arg&amp;gt;] [-n] [-novm] [-o &amp;lt;arg&amp;gt;] [-p&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-pf &amp;lt;arg&amp;gt;] [-pl &amp;lt;arg&amp;gt;] [-R &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;] [-s &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-sgt &amp;lt;arg&amp;gt;] [-sgtsym] [-t &amp;lt;arg&amp;gt;] [-v &amp;lt;arg&amp;gt;] [-vel &amp;lt;arg&amp;gt;] [-w&lt;br /&gt;
       &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                            Display this message&lt;br /&gt;
 -af,--atten-rel-file &amp;lt;arg&amp;gt;           XML Attenuation Relationship&lt;br /&gt;
                                      description file(s) for comparison.&lt;br /&gt;
                                      Multiple files should be comma&lt;br /&gt;
                                      separated&lt;br /&gt;
 -benchmark,--benchmark-test-recalc   Forces recalculation of hazard&lt;br /&gt;
                                      curves to test calculation speed.&lt;br /&gt;
                                      Newly recalculated curves are not&lt;br /&gt;
                                      kept and the original curves are&lt;br /&gt;
                                      plotted.&lt;br /&gt;
 -c,--calc-only                       Only calculate and insert the&lt;br /&gt;
                                      CyberShake curves, don't make plots.&lt;br /&gt;
                                      If a curve already exists, it will&lt;br /&gt;
                                      be skipped.&lt;br /&gt;
 -cmp,--component &amp;lt;arg&amp;gt;               Intensity measure component.&lt;br /&gt;
                                      Options: GEOM,X,Y,RotD100,RotD50,&lt;br /&gt;
                                      Default: GEOM&lt;br /&gt;
 -comp,--compare-to &amp;lt;arg&amp;gt;             Compare to  aspecific Run ID (or&lt;br /&gt;
                                      multiple IDs, comma separated)&lt;br /&gt;
 -cvmvs,--cvm-vs30                    Option to use Vs30 value from the&lt;br /&gt;
                                      velocity model itself in GMPE&lt;br /&gt;
                                      calculations rather than, for&lt;br /&gt;
                                      example, the Wills 2006 value.&lt;br /&gt;
 -e,--erf-id &amp;lt;arg&amp;gt;                    ERF ID&lt;br /&gt;
 -ef,--erf-file &amp;lt;arg&amp;gt;                 XML ERF description file for&lt;br /&gt;
                                      comparison&lt;br /&gt;
 -f,--force-add                       Flag to add curves to db without&lt;br /&gt;
                                      prompt&lt;br /&gt;
 -fvs,--force-vs30 &amp;lt;arg&amp;gt;              Option to force the given Vs30 value&lt;br /&gt;
                                      to be used in GMPE calculations.&lt;br /&gt;
 -h,--height &amp;lt;arg&amp;gt;                    Plot height (default = 500)&lt;br /&gt;
 -imid,--im-type-id &amp;lt;arg&amp;gt;             Intensity measure type ID. If not&lt;br /&gt;
                                      supplied, will be detected from im&lt;br /&gt;
                                      type/component/period parameters&lt;br /&gt;
 -imt,--im-type &amp;lt;arg&amp;gt;                 Intensity measure type. Options: SA,&lt;br /&gt;
                                      Default: SA&lt;br /&gt;
 -n,--no-add                          Flag to not automatically calculate&lt;br /&gt;
                                      curves not in the database&lt;br /&gt;
 -novm,--no-vm-colors                 Disables Velocity Model coloring&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;                Output directory&lt;br /&gt;
 -p,--period &amp;lt;arg&amp;gt;                    Period(s) to calculate. Multiple&lt;br /&gt;
                                      periods should be comma separated&lt;br /&gt;
                                      (default: 3)&lt;br /&gt;
 -pf,--password-file &amp;lt;arg&amp;gt;            Path to a file that contains the&lt;br /&gt;
                                      username and password for inserting&lt;br /&gt;
                                      curves into the database. Format&lt;br /&gt;
                                      should be &amp;quot;user:pass&amp;quot;&lt;br /&gt;
 -pl,--plot-chars-file &amp;lt;arg&amp;gt;          Specify the path to a plot&lt;br /&gt;
                                      characteristics XML file&lt;br /&gt;
 -R,--run-id &amp;lt;arg&amp;gt;                    Run ID&lt;br /&gt;
 -r,--rv-id &amp;lt;arg&amp;gt;                     Rupture Variation ID&lt;br /&gt;
 -s,--site &amp;lt;arg&amp;gt;                      Site short name&lt;br /&gt;
 -sgt,--sgt-var-id &amp;lt;arg&amp;gt;              STG Variation ID&lt;br /&gt;
 -sgtsym,--sgt-colors                 Enables SGT specific symbols&lt;br /&gt;
 -t,--type &amp;lt;arg&amp;gt;                      Plot save type. Options are png,&lt;br /&gt;
                                      pdf, jpg, and txt. Multiple types&lt;br /&gt;
                                      can be comma separated (default is&lt;br /&gt;
                                      pdf)&lt;br /&gt;
 -v,--vs30 &amp;lt;arg&amp;gt;                      Specify default Vs30 for sites with&lt;br /&gt;
                                      no Vs30 data, or leave blank for&lt;br /&gt;
                                      default value. Otherwise, you will&lt;br /&gt;
                                      be prompted to enter vs30&lt;br /&gt;
                                      interactively if needed.&lt;br /&gt;
 -vel,--vel-model-id &amp;lt;arg&amp;gt;            Velocity Model ID&lt;br /&gt;
 -w,--width &amp;lt;arg&amp;gt;                     Plot width (default = 600)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; about 30 seconds per curve.  It's wildly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; ERF config file, GMPE config files&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Hazard Curve | Hazard Curve]]&lt;br /&gt;
&lt;br /&gt;
=== Disaggregate ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; Disaggregate the curve results to determine the largest contributing sources.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code takes a run ID, a probability or IM level, and a period to disaggregate at.  It produces distance-magnitude disaggregation plots and a list of the percentage contribution of each source.&lt;br /&gt;
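In simplified form, each source's contribution is its share of the total exceedance rate at the chosen IM level.  The sketch below is illustrative, not the OpenSHA DisaggregationPlotter code.

```python
# Illustrative sketch of per-source disaggregation (not the OpenSHA code).
def disaggregate(sources, level):
    """sources: dict of name -> (annual_rate, [IM value per rupture
    variation]).  Returns each source's percent contribution to the
    total rate of exceeding `level`."""
    rates = {}
    for name, (rate, ims) in sources.items():
        rates[name] = rate * sum(1 for im in ims if im > level) / len(ims)
    total = sum(rates.values())
    return {name: 100.0 * r / total for name, r in rates.items()}
```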
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#Username, password, or database host are changed.&lt;br /&gt;
#We change the database schema.&lt;br /&gt;
#We want to support different kinds of disaggregation, or for a different kind of ERF.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; The Disaggregator is part of the OpenSHA codebase.  The specific Java class is org.opensha.sha.cybershake.plot.DisaggregationPlotter (available via https://github.com/opensha/opensha-cybershake/tree/master/src/main/java/org/opensha/sha/cybershake/plot), but it has a complex set of Java dependencies.  To compile and run, you should follow the instructions on http://www.opensha.org/trac/wiki/SettingUpEclipse to access the source.  The disaggregation plotter is also wrapped by disagg_plot_wrapper.sh, in https://github.com/SCECcode/cybershake-tools/blob/master/HazardCurveGeneration/disagg_plot_wrapper.sh .&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Kevin Milner, Nitin Gupta, Vipin Gupta&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; Standard OpenSHA dependencies&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  disagg_plot_wrapper.sh&lt;br /&gt;
    DisaggregationPlotter.java&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Use the standard OpenSHA building process if building from source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;usage: DisaggregationPlotter [-?] [-af &amp;lt;arg&amp;gt;] [-cmp &amp;lt;arg&amp;gt;] [-e &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-fvs &amp;lt;arg&amp;gt;] [-i &amp;lt;arg&amp;gt;] [-imid &amp;lt;arg&amp;gt;] [-imt &amp;lt;arg&amp;gt;] [-o &amp;lt;arg&amp;gt;] [-p&lt;br /&gt;
       &amp;lt;arg&amp;gt;] [-pr &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;] [-R &amp;lt;arg&amp;gt;] [-s &amp;lt;arg&amp;gt;] [-sgt &amp;lt;arg&amp;gt;]&lt;br /&gt;
       [-t &amp;lt;arg&amp;gt;] [-vel &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                    Display this message&lt;br /&gt;
 -af,--atten-rel-file &amp;lt;arg&amp;gt;   XML Attenuation Relationship description&lt;br /&gt;
                              file(s) for comparison. Multiple files&lt;br /&gt;
                              should be comma separated&lt;br /&gt;
 -cmp,--component &amp;lt;arg&amp;gt;       Intensity measure component. Options:&lt;br /&gt;
                              GEOM,X,Y,RotD100,RotD50, Default: GEOM&lt;br /&gt;
 -e,--erf-id &amp;lt;arg&amp;gt;            ERF ID&lt;br /&gt;
 -fvs,--force-vs30 &amp;lt;arg&amp;gt;      Option to force the given Vs30 value to be&lt;br /&gt;
                              used in GMPE calculations.&lt;br /&gt;
 -i,--imls &amp;lt;arg&amp;gt;              Intensity Measure Levels (IMLs) to&lt;br /&gt;
                              disaggregate at. Multiple IMLs should be&lt;br /&gt;
                              comma separated.&lt;br /&gt;
 -imid,--im-type-id &amp;lt;arg&amp;gt;     Intensity measure type ID. If not supplied,&lt;br /&gt;
                              will be detected from im&lt;br /&gt;
                              type/component/period parameters&lt;br /&gt;
 -imt,--im-type &amp;lt;arg&amp;gt;         Intensity measure type. Options: SA,&lt;br /&gt;
                              Default: SA&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;        Output directory&lt;br /&gt;
 -p,--period &amp;lt;arg&amp;gt;            Period(s) to calculate. Multiple periods&lt;br /&gt;
                              should be comma separated (default: 3)&lt;br /&gt;
 -pr,--probs &amp;lt;arg&amp;gt;            Probabilities (1 year) to disaggregate at.&lt;br /&gt;
                              Multiple probabilities should be comma&lt;br /&gt;
                              separated.&lt;br /&gt;
 -r,--rv-id &amp;lt;arg&amp;gt;             Rupture Variation ID&lt;br /&gt;
 -R,--run-id &amp;lt;arg&amp;gt;            Run ID&lt;br /&gt;
 -s,--site &amp;lt;arg&amp;gt;              Site short name&lt;br /&gt;
 -sgt,--sgt-var-id &amp;lt;arg&amp;gt;      STG Variation ID&lt;br /&gt;
 -t,--type &amp;lt;arg&amp;gt;              Plot save type. Options are png, pdf, and&lt;br /&gt;
                              txt. Multiple types can be comma separated&lt;br /&gt;
                              (default is pdf)&lt;br /&gt;
 -vel,--vel-model-id &amp;lt;arg&amp;gt;    Velocity Model ID&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; typically takes about 30 seconds.  It's wildly dependent on the database and contention from other database processes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; none&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Disaggregation file | Disaggregation file]]&lt;br /&gt;
&lt;br /&gt;
== Stochastic codes ==&lt;br /&gt;
&lt;br /&gt;
With CyberShake, we also have the option to augment a completed run with stochastic seismograms.  The following codes are used to add stochastic high-frequency content to an already-completed low-frequency deterministic run.&lt;br /&gt;
&lt;br /&gt;
[[File:stochastic workflow overview.png|thumb|right|300px|Overview of the codes involved in the stochastic part of CyberShake, [http://hypocenter.usc.edu/research/cybershake/stochastic_workflow_overview.odg source file (ODG)]]]&lt;br /&gt;
&lt;br /&gt;
=== Velocity Info ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To determine slowness-averaged VsX values for a CyberShake site from UCVM.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Velocity Info takes a location, a velocity model, and grid spacing information and queries UCVM to generate three VsX values needed by the site response:&lt;br /&gt;
#Vs30, calculated as: 30 / sum( 1 / (Vs sampled from [0.5, 29.5] at 1 meter increments, for 30 values) )&lt;br /&gt;
#Vs5H, like Vs30 but calculated over the shallowest 5*gridspacing meters.  So if gridspacing=100m, Vs5H = 500 / sum( 1 / (Vs sampled from [0.5, 499.5] at 1 meter increments, for 500 values) )&lt;br /&gt;
#VsD5H, like Vs5H, but calculated from samples at gridspacing increments instead of 1 meter, with the first and last samples weighted half as much.  So if gridspacing=100m, VsD5H = 5 / sum( w / (Vs sampled from [0, 500] at 100 meter increments, for 6 values, with w=0.5 for the two endpoint samples and w=1 otherwise) )&lt;br /&gt;
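All three of the values above are harmonic (slowness) averages.  The sketch below uses synthetic Vs samples for illustration; the real code (retrieve_vs) queries UCVM for the Vs profile at depth.

```python
# Sketch of the slowness-averaged VsX calculations described above
# (samples are synthetic; retrieve_vs pulls them from UCVM).
def vs_slowness_avg(vs_samples):
    """Vs30/Vs5H-style average of 1-meter Vs samples: depth / total slowness."""
    return len(vs_samples) / sum(1.0 / vs for vs in vs_samples)

def vs_d5h(vs_samples):
    """VsD5H: samples at gridspacing increments, endpoints half-weighted."""
    n = len(vs_samples)
    weights = [0.5 if i in (0, n - 1) else 1.0 for i in range(n)]
    return sum(weights) / sum(w / vs for w, vs in zip(weights, vs_samples))
```

For a uniform profile both averages reduce to the constant value, e.g. `vs_slowness_avg([500.0] * 30)` gives 500.0; mixed profiles are pulled toward the slower layers, as a harmonic mean should be.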
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We need to support more than one model - for instance, if the CyberShake site box (not simulation volume) spans multiple models.  The code to parse the model string and load models in initialize_ucvm() would need to be changed. &lt;br /&gt;
#We want to support new kinds of velocity values.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/src/retrieve_vs.c&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#UCVM | UCVM]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  retrieve_vs&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make retrieve_vs'&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./retrieve_vs &amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;model&amp;gt; &amp;lt;gridspacing&amp;gt; &amp;lt;out filename&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes about 15 seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Velocity_Info file|Velocity Info file]]&lt;br /&gt;
&lt;br /&gt;
=== Local VM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To generate a &amp;quot;local&amp;quot; 1D velocity file, required for the high-frequency codes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Local VM takes in an input file containing a 1D velocity model.  It then calculates Qs from these values and writes all the velocity data to a new file.  For all Study 15.12 runs, we used the LA Basin 1D model from the BBP, v14.3.0.  It's registered in the RLS, and is located at /home/scec-02/cybershk/runs/genslip_nr_generic1d-gp01.vmod.&lt;br /&gt;
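The exact Qs relation lives in gen_local_vel.py.  As a placeholder, the sketch below uses a heuristic commonly seen in Graves-Pitarka-style codes; treat both constants as assumptions, not the actual formula used here.

```python
# ASSUMPTION: a commonly used heuristic, not necessarily the relation in
# gen_local_vel.py -- check the script itself for the actual formula.
def qs_from_vs(vs_km_per_s):
    """Heuristic quality factor: Qs = 50 * Vs, with Vs in km/s."""
    return 50.0 * vs_km_per_s

def qp_from_qs(qs):
    """Qp is often taken as twice Qs."""
    return 2.0 * qs
```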
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt;&lt;br /&gt;
#We change the algorithm for calculating Qs.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/gen_local_vel.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, modified from Rob Graves' code&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  gen_local_vel.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;Usage: ./gen_local_vel.py &amp;lt;1D velocity model&amp;gt; &amp;lt;output&amp;gt;&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes less than a second.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#BBP velocity file|BBP 1D velocity file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Local VM file|Local VM file]]&lt;br /&gt;
&lt;br /&gt;
=== Create Dirs ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; To create a directory for each source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The high-frequency codes produce many intermediate files.  To avoid overloading the filesystem, Create Dirs creates a separate directory for every source.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; This code is basically just a wrapper around mkdir, and is unlikely to need changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/create_dirs.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  create_dirs.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt; &lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Usage: ./create_dirs.py &amp;lt;file with list of dirs&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial, takes just a few seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; File with a directory to create (a source ID) on each line.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; None.&lt;br /&gt;
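Since this is essentially a wrapper around mkdir, the whole script fits in a few lines. A minimal sketch (the function name and exact behavior are assumptions; see the repository for the real script):

```python
# Sketch of create_dirs.py: read a file listing one directory
# (a source ID) per line, and create each directory.
import os
import sys

def create_dirs(list_file):
    with open(list_file) as fp:
        for line in fp:
            name = line.strip()
            if name:
                os.makedirs(name, exist_ok=True)  # behaves like 'mkdir -p'

if __name__ == "__main__" and len(sys.argv) > 1:
    create_dirs(sys.argv[1])
```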
&lt;br /&gt;
=== HF Synth ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; HF Synth generates a high-frequency stochastic seismogram for one or more rupture variations.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; This code wraps multiple broadband platform codes to reduce the number of invocations required.  Specifically, it calls:&lt;br /&gt;
#srf2stoch_lite(), a reduced-memory version of srf2stoch.  We have modified it to call rupgen_genslip() to generate the SRF, rather than reading it in from disk.&lt;br /&gt;
#hfsim(), a wrapper for:&lt;br /&gt;
##hb_high(), Rob Graves's original BBP code to produce the seismograms&lt;br /&gt;
##wcc_getpeak(), which calculates PGA for the seismogram&lt;br /&gt;
##wcc_siteamp14(), which performs site amplification.&lt;br /&gt;
&lt;br /&gt;
Vs30 is required, so if it is not passed as a command-line argument, UCVM is called to determine it.&lt;br /&gt;
&lt;br /&gt;
Additionally, hf_synth_lite is able to handle processing on multiple rupture variations, to further reduce the number of invocations.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#A new version of one of Rob's codes - the high-frequency generator or the site amplification - is needed.  We have tried to use whatever the most recent version is on the BBP, for consistency.&lt;br /&gt;
#New velocity parameters are needed for the site amplification.&lt;br /&gt;
#The format of the rupture geometry files changes.&lt;br /&gt;
&lt;br /&gt;
The makefile needs to be changed if the path to libmemcached, UCVM, Getpar, or the rupture generator changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; wrapper by Scott Callaghan, hb_high(), wcc_getpeak(), and wcc_siteamp14() by Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]], [[CyberShake_Code_Base#UCVM | UCVM]], [[CyberShake_Code_Base#RupGen-api-v3.3.1 | rupture generator]], [[CyberShake_Code_Base#libmemcached | libmemcached]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  hf_synth_lite&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Run 'make' in src.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt; There is no 'help' usage string, but here's a sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
/projects/sciteam/jmz/CyberShake/software/HFSim_mem/bin/hf_synth_lite&lt;br /&gt;
   stat=OSI slat=34.6145 slon=-118.7235&lt;br /&gt;
   rup_geom_file=e36_rv6_121_0.txt source_id=121 rupture_id=0&lt;br /&gt;
   num_rup_vars=5 rup_vars=(0,0,0);(1,1,0);(2,2,0);(3,3,0);(4,4,0)&lt;br /&gt;
   outfile=121/Seismogram_OSI_4331_121_0_hf_t0.grm&lt;br /&gt;
   dx=2.0 dy=2.0 tlen=300.0 dt=0.025&lt;br /&gt;
   do_site_response=1 vs30=359.1 debug=0 vmod=LA_Basin_BBP_14.3.0.local&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
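The rup_vars argument above is a semicolon-separated list of parenthesized integer tuples. A hedged sketch of how such an argument could be parsed (parse_rup_vars is a hypothetical helper, and the meaning of the tuple fields is an assumption):

```python
def parse_rup_vars(arg):
    # Parse a rup_vars string like "(0,0,0);(1,1,0)" into tuples of ints.
    # ASSUMPTION: each tuple is (rup_var_id, slip_id, hypo_id); the real
    # field meanings are defined by hf_synth_lite, not this sketch.
    rup_vars = []
    for piece in arg.split(";"):
        piece = piece.strip().strip("()")
        if piece:
            rup_vars.append(tuple(int(f) for f in piece.split(",")))
    return rup_vars
```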
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes from a few seconds up to a minute per rupture variation, depending on the size of the fault surface.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Local_VM_file | Local velocity file]], [[CyberShake_Rupture_Files#UCERF2_Rupture_Geometry_Files | rupture geometry file]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; High-frequency seismograms, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== Combine HF Synth ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code combines the seismograms produced by HF Synth so that there is just 1 seismogram per source/rupture combo.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; Since we split up work so that each HF Synth job takes a chunk of rupture variations, we may end up with multiple seismogram files per rupture, each containing some of the rupture variations.  This script concatenates the files, using cat, into a single file, ready to be worked on later in the workflow.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; I can't think of a circumstance where we would need to change this.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/HFSim_mem/combine_seis.py&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  combine_seis.py&lt;br /&gt;
    cat&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; None&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Usage: ./combine_seis.py &amp;lt;seis 0&amp;gt; &amp;lt;seis 1&amp;gt; ... &amp;lt;seis N&amp;gt; &amp;lt;output seis name&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes a few seconds.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; High-frequency seismograms, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; A single high-frequency seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
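Since the script is equivalent to cat, the concatenation step can be sketched in a few lines (combine_seis here is a hypothetical stand-in for the real script, which shells out to cat; argument order follows the usage string above, inputs first and output last):

```python
# Sketch of combine_seis.py: concatenate N seismogram files into one,
# equivalent to 'cat seis0 seis1 ... > output'.
import shutil
import sys

def combine_seis(inputs, output):
    with open(output, "wb") as out:
        for path in inputs:
            with open(path, "rb") as fp:
                shutil.copyfileobj(fp, out)

if __name__ == "__main__" and len(sys.argv) > 2:
    combine_seis(sys.argv[1:-1], sys.argv[-1])
```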
&lt;br /&gt;
=== LF Site Response ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code performs site response modifications to the CyberShake low-frequency seismograms.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The LF Site Response code takes a low-frequency seismogram and some velocity parameters, and outputs a seismogram with site response applied.  In Study 15.12, this was a necessary step before combining the low and high frequency seismograms together.  Since Vs30 is required, if it's not passed as a command-line argument, then UCVM is called to determine it.&lt;br /&gt;
&lt;br /&gt;
The reason we calculate site response for the low-frequency deterministic seismograms is that we want both the low- and high-frequency results to be for the same site-response condition.  For the HF, we used Vs30 directly for the site-response adjustment, but for the LF we had to use an adjusted VsX value, since the grid spacing was 100 m (so Vs30 doesn't make sense).&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#We change the site response algorithm.&lt;br /&gt;
#We decide to use different velocity parameters for setting site response.&lt;br /&gt;
#The format of the seismogram files changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/LF_Site_Response&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; wrapper by Scott Callaghan, site response by Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]], [[CyberShake_Code_Base#UCVM | UCVM]], [[CyberShake_Code_Base#RupGen-api-v3.3.1 | rupture generator]], [[CyberShake_Code_Base#libmemcached | libmemcached]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  lf_site_response&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Edit the makefile to point to RupGen, libmemcached, and Getpar, then run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
Sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./lf_site_response&lt;br /&gt;
seis_in=Seismogram_OSI_3923_263_3.grm seis_out=263/Seismogram_OSI_3923_263_3_site_response.grm&lt;br /&gt;
slat=34.6145 slon=-118.7235&lt;br /&gt;
module=cb2014&lt;br /&gt;
vs30=359.1 vref=344.7&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes less than a second.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; A low-frequency deterministic seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; A low-frequency deterministic seismogram with site response, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
=== Merge IM ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Purpose:&amp;lt;/b&amp;gt; This code combines low-frequency deterministic and high-frequency stochastic seismograms, then processes them to obtain intensity measures.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Detailed description:&amp;lt;/b&amp;gt; The Merge IM code takes an LF and HF seismogram and performs the following processing:&lt;br /&gt;
#A high-pass filter is applied to the HF seismogram.&lt;br /&gt;
#The LF seismogram is resampled to the same dt as the HF seismogram.&lt;br /&gt;
#The two seismograms are combined into a single broadband (BB) seismogram.&lt;br /&gt;
#The PSA code is run on the resulting seismogram.&lt;br /&gt;
#If desired, the RotD and duration codes are also run on the seismogram.&lt;br /&gt;
&lt;br /&gt;
Merge IM works on a seismogram file at the rupture level, so it assumes that the input files contain multiple rupture variations.&lt;br /&gt;
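The filter/resample/combine steps can be illustrated with a simplified sketch. This is NOT the filter pair the real code uses: the one-pole high-pass and linear-interpolation resampling below are stand-ins chosen for brevity, and all function names are hypothetical.

```python
import math

def highpass(x, dt, fc):
    # One-pole RC high-pass with corner frequency fc (Hz); an
    # illustrative stand-in, not the filter merge_psa actually applies.
    rc = 1.0 / (2.0 * math.pi * fc)
    alpha = rc / (rc + dt)
    y = [x[0]]
    for i in range(1, len(x)):
        y.append(alpha * (y[-1] + x[i] - x[i - 1]))
    return y

def resample_linear(x, dt_in, dt_out, n_out):
    # Resample x (sampled at dt_in) onto n_out samples at dt_out,
    # by linear interpolation.
    out = []
    for i in range(n_out):
        t = i * dt_out / dt_in
        j = min(int(t), len(x) - 2)
        frac = t - j
        out.append(x[j] * (1.0 - frac) + x[j + 1] * frac)
    return out

def merge_bb(lf, dt_lf, hf, dt_hf, fc=1.0):
    # High-pass the HF trace, resample the LF trace to the HF dt,
    # and sum them into a broadband trace.
    lf_rs = resample_linear(lf, dt_lf, dt_hf, len(hf))
    hf_hp = highpass(hf, dt_hf, fc)
    return [a + b for a, b in zip(lf_rs, hf_hp)]
```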
&lt;br /&gt;
&amp;lt;b&amp;gt;Needs to be changed if:&amp;lt;/b&amp;gt; &lt;br /&gt;
#We change the filter-and-combine algorithm.&lt;br /&gt;
#We decide to modify the post-processing and IM types we want to capture.&lt;br /&gt;
#The format of the seismogram files changes.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Source code location:&amp;lt;/b&amp;gt; https://github.com/SCECcode/cybershake-core/MergeIM&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Author:&amp;lt;/b&amp;gt; Scott Callaghan, Rob Graves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Dependencies:&amp;lt;/b&amp;gt; [[CyberShake_Code_Base#Getpar | Getpar]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Executable chain:&amp;lt;/b&amp;gt;&lt;br /&gt;
  merge_psa&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Compile instructions:&amp;lt;/b&amp;gt; Edit the makefile to point to Getpar, then run 'make' in the src directory.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Usage:&amp;lt;/b&amp;gt;&lt;br /&gt;
Sample invocation:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
./merge_psa&lt;br /&gt;
lf_seis=182/Seismogram_OSI_3923_182_23_site_response.grm hf_seis=182/Seismogram_OSI_4331_182_23_hf.grm seis_out=182/Seismogram_OSI_4331_182_23_bb.grm&lt;br /&gt;
freq=1.0 comps=2 num_rup_vars=16&lt;br /&gt;
simulation_out_pointsX=2 simulation_out_pointsY=1&lt;br /&gt;
simulation_out_timesamples=12000 simulation_out_timeskip=0.025&lt;br /&gt;
surfseis_rspectra_seismogram_units=cmpersec surfseis_rspectra_output_units=cmpersec2&lt;br /&gt;
surfseis_rspectra_output_type=aa surfseis_rspectra_period=all&lt;br /&gt;
surfseis_rspectra_apply_filter_highHZ=20.0 surfseis_rspectra_apply_byteswap=no&lt;br /&gt;
out=182/PeakVals_OSI_4331_182_23_bb.bsa&lt;br /&gt;
run_rotd=1 rotd_out=182/RotD_OSI_4331_182_23_bb.rotd&lt;br /&gt;
run_duration=1 duration_out=182/Duration_OSI_4331_182_23_bb.dur&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Typical run configuration:&amp;lt;/b&amp;gt; Serial; takes 5-30 seconds, depending on the number of rupture variations in the files.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Input files:&amp;lt;/b&amp;gt; LF deterministic seismogram and HF stochastic seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]].&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Output files:&amp;lt;/b&amp;gt; BB seismogram, in the general [[Accessing_CyberShake_Seismograms#Reading_Seismogram_Files | seismogram format]]; also [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | PSA files]], [[Accessing_CyberShake_Peak_Acceleration_Data#Reading_Peak_Acceleration_Files | RotD files]], and [[Accessing_CyberShake_Duration_Data | Duration files]]&lt;br /&gt;
&lt;br /&gt;
== File types ==&lt;br /&gt;
&lt;br /&gt;
=== Modelbox ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a description of the simulation box, at the surface.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.modelbox&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;site name&amp;gt;&lt;br /&gt;
APPROXIMATE CENTROID:&lt;br /&gt;
  clon= &amp;lt;centroid lon&amp;gt; clat =&amp;lt;centroid lat&amp;gt;&lt;br /&gt;
MODEL PARAMETERS:&lt;br /&gt;
  mlon= &amp;lt;model lon&amp;gt; mlat =&amp;lt;model lat&amp;gt; mrot=&amp;lt;model rot, default -55&amp;gt; xlen= &amp;lt;x-length in km&amp;gt; ylen= &amp;lt;y-length in km&amp;gt;&lt;br /&gt;
MODEL CORNERS:&lt;br /&gt;
  &amp;lt;lon 1&amp;gt; &amp;lt;lat 1&amp;gt; (x= 0.000 y= 0.000)&lt;br /&gt;
  &amp;lt;lon 2&amp;gt; &amp;lt;lat 2&amp;gt; (x= &amp;lt;max x&amp;gt; y= 0.000)&lt;br /&gt;
  &amp;lt;lon 3&amp;gt; &amp;lt;lat 3&amp;gt; (x= &amp;lt;max x&amp;gt; y= &amp;lt;max y&amp;gt;)&lt;br /&gt;
  &amp;lt;lon 4&amp;gt; &amp;lt;lat 4&amp;gt; (x= 0.000 y= &amp;lt;max y&amp;gt;)&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== Gridfile ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the three dimensions, and the grid spacing in each dimension, of the volume.&lt;br /&gt;
&lt;br /&gt;
Filename convention: gridfile_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
xlen=&amp;lt;x-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;x-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
ylen=&amp;lt;y-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;y-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
zlen=&amp;lt;z-length in km&amp;gt;&lt;br /&gt;
   0.0  &amp;lt;z-length&amp;gt;  &amp;lt;grid spacing in km&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
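A sketch of writing this format (write_gridfile is a hypothetical helper, assuming a uniform grid spacing in all three dimensions):

```python
def write_gridfile(path, xlen, ylen, zlen, spacing):
    # Write a gridfile in the format above; lengths and spacing in km.
    with open(path, "w") as fp:
        for name, length in (("xlen", xlen), ("ylen", ylen), ("zlen", zlen)):
            fp.write("%s=%f\n" % (name, length))
            fp.write("   0.0  %f  %f\n" % (length, spacing))
```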
&lt;br /&gt;
=== Gridout ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the km offsets for each grid index, in X, Y, and Z, from the upper southwest corner.&lt;br /&gt;
&lt;br /&gt;
Filename convention: gridout_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
xlen=&amp;lt;x-length in km&amp;gt;&lt;br /&gt;
nx=&amp;lt;number of gridpoints in X direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  2   &amp;lt;2*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  3   &amp;lt;3*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  nx-1 &amp;lt;(nx-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
ylen=&amp;lt;y-length in km&amp;gt;&lt;br /&gt;
ny=&amp;lt;number of gridpoints in Y direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  ny-1 &amp;lt;(ny-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
zlen=&amp;lt;z-length in km&amp;gt;&lt;br /&gt;
nz=&amp;lt;number of gridpoints in Z direction&amp;gt;&lt;br /&gt;
  0   0   &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
  1   &amp;lt;grid spacing&amp;gt;  &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
  nz-1 &amp;lt;(nz-1)*grid spacing&amp;gt; &amp;lt;grid spacing&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: UCVM, smoothing, PreSGT, PreAWP&lt;br /&gt;
&lt;br /&gt;
=== Params ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Succinctly specify the parameters for the CyberShake volume.  Similar information to the modelbox file, but in a different format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_params_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Model origin coordinates:&lt;br /&gt;
 lon= &amp;lt;model lon&amp;gt; lat=   &amp;lt;model lat&amp;gt; rotate=  &amp;lt;model rotation, default -55&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model origin shift (cartesian vs. geographic):&lt;br /&gt;
 xshift(km)=   &amp;lt;x shift, usually half the x-length minus 1 grid spacing&amp;gt; yshift(km)=   &amp;lt;y-shift, usually half the y-length minus 1 grid spacing&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model corners:&lt;br /&gt;
 c1= &amp;lt;nw lon&amp;gt;   &amp;lt;nw lat&amp;gt;&lt;br /&gt;
 c2= &amp;lt;ne lon&amp;gt;   &amp;lt;ne lat&amp;gt;&lt;br /&gt;
 c3= &amp;lt;se lon&amp;gt;   &amp;lt;se lat&amp;gt;&lt;br /&gt;
 c4= &amp;lt;sw lon&amp;gt;   &amp;lt;sw lat&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Model Dimensions:&lt;br /&gt;
 xlen=   &amp;lt;x-length&amp;gt; km&lt;br /&gt;
 ylen=   &amp;lt;y-length&amp;gt; km&lt;br /&gt;
 zlen=   &amp;lt;z-length&amp;gt; km&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: &lt;br /&gt;
&lt;br /&gt;
=== Coord ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the mapping of latitude and longitude to X and Y offsets, for each point on the surface.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_coords_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 2 0&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 1&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 1&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by: UCVM, smoothing, PreSGT&lt;br /&gt;
&lt;br /&gt;
=== Bounds ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Specify the mapping of latitude and longitude to X and Y offsets, but only for the points along the boundary.  A subset of the coord file.&lt;br /&gt;
&lt;br /&gt;
Filename convention: model_bounds_GC_&amp;lt;site&amp;gt; (GC stands for 'great circle', the projection we use).&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 2 0&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 0&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 1&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 1&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 2&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; 2&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 0 &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; 1 &amp;lt;ny-1&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;nx-1&amp;gt; &amp;lt;ny-1&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
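The boundary ordering shown above (the full first row, then the two edge columns row by row, then the full last row) can be sketched as follows (boundary_indices is a hypothetical helper, not part of the code base):

```python
def boundary_indices(nx, ny):
    # (x, y) grid indices along the surface boundary, in the order the
    # bounds file lists them: full row y=0, then x=0 and x=nx-1 for each
    # interior y, then full row y=ny-1.
    pts = [(x, 0) for x in range(nx)]
    for y in range(1, ny - 1):
        pts.append((0, y))
        pts.append((nx - 1, y))
    pts.extend((x, ny - 1) for x in range(nx))
    return pts
```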
&lt;br /&gt;
Generated by: PreCVM&lt;br /&gt;
&lt;br /&gt;
Used by:&lt;br /&gt;
&lt;br /&gt;
=== Velocity files ===&lt;br /&gt;
&lt;br /&gt;
==== RWG format ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Input velocity files for the RWG wave propagation code, emod3d.&lt;br /&gt;
&lt;br /&gt;
Filename convention: v_sgt-&amp;lt;site&amp;gt;.&amp;lt;p, s, or d&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: 3 files, one each for Vp (*.p), Vs (*.s), and rho (*.d).  Each is binary, with 4-byte floats, in fast X, Z (surface-&amp;gt;down), slow Y order.&lt;br /&gt;
&lt;br /&gt;
Generated by: UCVM&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP&lt;br /&gt;
&lt;br /&gt;
==== AWP format ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Input velocity file for the AWP-ODC wave propagation code.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp.&amp;lt;site&amp;gt;.media&lt;br /&gt;
&lt;br /&gt;
Format: Binary, with 4-byte floats, in fast Y, X, slow Z (surface down) order.&lt;br /&gt;
&lt;br /&gt;
Generated by: UCVM&lt;br /&gt;
&lt;br /&gt;
Used by: Smoothing, PreAWP, PostAWP&lt;br /&gt;
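The two orderings imply different linear offsets for a grid point (x, y, z). A sketch of the index arithmetic (function names are hypothetical; multiply by 4 for the byte offset, since each value is a 4-byte float):

```python
def rwg_index(x, y, z, nx, ny, nz):
    # Float offset of point (x, y, z) in an RWG velocity file:
    # X fastest, then Z (surface->down), Y slowest.
    return (y * nz + z) * nx + x

def awp_index(x, y, z, nx, ny, nz):
    # Float offset of point (x, y, z) in an AWP media file:
    # Y fastest, then X, Z (surface->down) slowest.
    return (z * nx + x) * ny + y
```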
&lt;br /&gt;
=== Fdloc ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Coordinates of the site, in X Y grid indices, and therefore the coordinates where the SGT impulse should be placed.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.fdloc&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;X grid index of site&amp;gt; &amp;lt;Y grid index of site&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== Faultlist ===&lt;br /&gt;
&lt;br /&gt;
Purpose: List of paths to all the rupture geometry files for all ruptures which are within the cutoff for this site. Used to produce a list of points to save SGTs for.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.faultlist&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;path to rupture file&amp;gt; nheader=&amp;lt;number of header lines, usually 6&amp;gt; latfirst=&amp;lt;1, to signify that latitude comes first in the rupture files&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT&lt;br /&gt;
&lt;br /&gt;
=== Radiusfile ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Describe the adaptive mesh for which SGTs will be saved.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.radiusfile&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of gradations in X and Y&amp;gt;&lt;br /&gt;
&amp;lt;radius 1&amp;gt; &amp;lt;radius 2&amp;gt; &amp;lt;radius 3&amp;gt; &amp;lt;radius 4&amp;gt;&lt;br /&gt;
&amp;lt;decimation less than radius 1&amp;gt; &amp;lt;decimation between radius 1 and 2&amp;gt; &amp;lt;between 2 and 3&amp;gt; &amp;lt;between 3 and 4&amp;gt;&lt;br /&gt;
&amp;lt;number of gradations in Z&amp;gt;&lt;br /&gt;
&amp;lt;depth 1&amp;gt; &amp;lt;depth 2&amp;gt; &amp;lt;depth 3&amp;gt; &amp;lt;depth 4&amp;gt;&lt;br /&gt;
&amp;lt;decimation less than depth 1&amp;gt; &amp;lt;decimation between depth 1 and 2&amp;gt; &amp;lt;between 2 and 3&amp;gt; &amp;lt;between 3 and 4&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT&lt;br /&gt;
&lt;br /&gt;
=== SGT Coordinate files ===&lt;br /&gt;
&lt;br /&gt;
There are two formats for the list of points to save SGTs for, one for Rob's codes and one for AWP-ODC.  As with other coordinate transformations between the two systems, to convert X and Y offsets from RWG to AWP you swap the X and Y and add 1 to each, since RWG is 0-indexed and AWP is 1-indexed.&lt;br /&gt;
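The swap-and-shift rule can be written down directly (helper names are hypothetical):

```python
def rwg_to_awp(x_rwg, y_rwg):
    # RWG (0-indexed) -> AWP (1-indexed): swap X and Y, add 1 to each.
    return (y_rwg + 1, x_rwg + 1)

def awp_to_rwg(x_awp, y_awp):
    # Inverse transformation: swap back and subtract 1 from each.
    return (y_awp - 1, x_awp - 1)
```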
&lt;br /&gt;
==== SgtCoords ====&lt;br /&gt;
&lt;br /&gt;
Purpose: List of all the points to save SGTs for.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;.cordfile&lt;br /&gt;
&lt;br /&gt;
Format: Z changes fastest, then Y, then X slowest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# geoproj= &amp;lt;projection; we usually use 1 for great circle&amp;gt;&lt;br /&gt;
# modellon= &amp;lt;model lon&amp;gt; modellat= &amp;lt;model lat&amp;gt; modelrot= &amp;lt;model rot, usually -55&amp;gt;&lt;br /&gt;
# xlen= &amp;lt;x-length&amp;gt; ylen= &amp;lt;y-length&amp;gt;&lt;br /&gt;
#&lt;br /&gt;
&amp;lt;total number of points&amp;gt;&lt;br /&gt;
&amp;lt;X index&amp;gt; &amp;lt;Y index&amp;gt; &amp;lt;Z index&amp;gt; &amp;lt;Single long to capture the index, in the form XXXXYYYYZZZZ&amp;gt; &amp;lt;lon&amp;gt; &amp;lt;lat&amp;gt; &amp;lt;depth in km&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
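The "single long" field packs the three indices into one integer. Assuming four decimal digits per component, as the XXXXYYYYZZZZ form suggests (this decimal packing is an assumption, and the helper names are hypothetical):

```python
def pack_index(x, y, z):
    # Pack (x, y, z) into one long of the form XXXXYYYYZZZZ.
    # ASSUMPTION: four decimal digits per component.
    return x * 10**8 + y * 10**4 + z

def unpack_index(packed):
    # Recover (x, y, z) from the packed long.
    return (packed // 10**8, (packed // 10**4) % 10**4, packed % 10**4)
```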
&lt;br /&gt;
Generated by: PreSGT&lt;br /&gt;
&lt;br /&gt;
Used by: PreSGT, PreAWP, PostAWP&lt;br /&gt;
&lt;br /&gt;
==== AWP cordfile ====&lt;br /&gt;
&lt;br /&gt;
Purpose: List of SGT points to save in a format usable by AWP-ODC-SGT.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp.&amp;lt;site&amp;gt;.cordfile&lt;br /&gt;
&lt;br /&gt;
Format: Remember that, relative to RWG, the X and Y coordinates are swapped and each has 1 added.  The points are sorted by Y, then X, then Z, so Y changes slowest and Z changes fastest.  This ordering differs from the RWG cordfile because the X and Y components are swapped.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of points&amp;gt;&lt;br /&gt;
&amp;lt;X coordinate&amp;gt; &amp;lt;Y coordinate&amp;gt; &amp;lt;Z coordinate&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU&lt;br /&gt;
&lt;br /&gt;
=== Impulse source descriptions ===&lt;br /&gt;
&lt;br /&gt;
We generate the initial source description for CyberShake, with the required dt, nt, and filtering, using gen_source, in https://github.com/SCECcode/cybershake-core/SimSgt_V3.0.3/src/ (run 'make get_source').  gen_source hard-codes its parameters, but you should only change 'nt', 'dt', and 'flo'.  We have been setting flo to twice the CyberShake maximum frequency, to reduce filtering effects at the frequency of interest.  gen_source wraps Rob Graves's source generator, which we use for consistency.&lt;br /&gt;
&lt;br /&gt;
To generate a source for a component, run&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
$&amp;gt;./gen_source xsrc=0 ysrc=0 zsrc=0 &amp;lt;fxsrc|fysrc|fzsrc&amp;gt;=1 moment=1e20&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Once this RWG source is generated, we then use AWP-GPU-SGT/utils/data/format_source.py to reprocess the RWG source into an AWP-friendly format.  This involves reformatting the file and multiplying all values by 1e15 for unit conversion.  Separate files must be produced for the X and Y components, since the AWP format uses different columns for different components.&lt;br /&gt;
&lt;br /&gt;
Finally, AWP-GPU-SGT/utils/build_src.py takes the correct AWP-friendly source (nt and dt) for a run and adds the impulse location coordinates, producing a complete AWP format source description.&lt;br /&gt;
&lt;br /&gt;
==== RWG source ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Source description for the SGT impulse.&lt;br /&gt;
&lt;br /&gt;
Filename convention: source_cos0.10_&amp;lt;frequency&amp;gt;hz&lt;br /&gt;
&lt;br /&gt;
Format:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
source cos&lt;br /&gt;
&amp;lt;nt&amp;gt; &amp;lt;dt&amp;gt; 0 0 0.0 0.0 0.0 0.0&lt;br /&gt;
&amp;lt;value at ts0&amp;gt; &amp;lt;value at ts1&amp;gt; &amp;lt;value at ts2&amp;gt; &amp;lt;value at ts3&amp;gt; &amp;lt;value at ts4&amp;gt; &amp;lt;value at ts5&amp;gt;&lt;br /&gt;
&amp;lt;value at ts6&amp;gt; &amp;lt;value at ts7&amp;gt; &amp;lt;value at ts8&amp;gt; &amp;lt;value at ts9&amp;gt; &amp;lt;value at ts10&amp;gt; &amp;lt;value at ts11&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: gen_source (see above)&lt;br /&gt;
&lt;br /&gt;
Used by: PreAWP&lt;br /&gt;
&lt;br /&gt;
==== AWP source ====&lt;br /&gt;
&lt;br /&gt;
Purpose: Source description which can be used by AWP-ODC.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_src&lt;br /&gt;
&lt;br /&gt;
Format: Note that X and Y coordinates are swapped between RWG and AWP format, because of how the box is defined.  Additionally, RWG is 0-indexed, and AWP is 1-indexed, and the RWG values must be multiplied by 1e15 for unit conversion.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;X index of source, same as site X index&amp;gt; &amp;lt;Y index of source, same as site Y index&amp;gt;&lt;br /&gt;
&amp;lt;XX impulse at ts0&amp;gt; &amp;lt;YY at ts0&amp;gt; &amp;lt;ZZ at ts0&amp;gt; &amp;lt;XY at ts0&amp;gt; &amp;lt;XZ at ts0&amp;gt; &amp;lt;YZ at ts0&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU&lt;br /&gt;
&lt;br /&gt;
=== IN3D ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Input file for AWP-ODC.&lt;br /&gt;
&lt;br /&gt;
Filename convention: IN3D.&amp;lt;site&amp;gt;.&amp;lt;x or y&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: Specified [https://scec.usc.edu/it/AWP-ODC-SGT#IN3D here (login required)].&lt;br /&gt;
&lt;br /&gt;
Generated by: PreAWP&lt;br /&gt;
&lt;br /&gt;
Used by: AWP-ODC-SGT CPU, AWP-ODC-SGT GPU, PostAWP&lt;br /&gt;
&lt;br /&gt;
=== AWP SGT ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT file, created by AWP-ODC-SGT.&lt;br /&gt;
&lt;br /&gt;
Filename convention: awp-strain-&amp;lt;site&amp;gt;-f&amp;lt;x or y&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: binary, 4-byte floats.  Points are in the same order as in the AWP SGT coordinate file, with Z changing fastest, then X, then Y slowest.  For each point, the SGT components are stored in XX, YY, ZZ, XY, XZ, YZ order, with time fastest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YY component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 2nd z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (2nd x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 2nd y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, last y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: AWP-ODC-SGT CPU and GPU&lt;br /&gt;
&lt;br /&gt;
Used by: PostAWP, NanCheck &lt;br /&gt;
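As an illustrative sketch of how this layout maps to byte offsets, assuming the 4-byte floats and fast-Z, X, Y point ordering described above (the function and variable names below are hypothetical, not from the CyberShake codebase):&lt;br /&gt;

```python
def awp_sgt_offset(x, y, z, comp, nx, nz, nt):
    """Byte offset of the start of one component's timeseries in an AWP SGT file.

    Points are ordered fast Z, then X, then Y; each point stores 6
    components (comp 0-5, in XX, YY, ZZ, XY, XZ, YZ order) of nt
    4-byte floats each.
    """
    point_index = (y * nx + x) * nz + z
    return (point_index * 6 + comp) * nt * 4
```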
&lt;br /&gt;
=== RWG SGT ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT file, created by PostAWP for use in post-processing.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_&amp;lt;run id&amp;gt;.sgt&lt;br /&gt;
&lt;br /&gt;
Format: binary, 4-byte floats.  Points are in the same order as in the RWG coordinate file, which is fast Z, Y, X.  For each point, the SGT components are stored in XX, YY, ZZ, XY, XZ, YZ order, with time fastest.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YY component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 1st z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, 2nd z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (2nd x-coordinate, 1st y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, 1st y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (1st x-coordinate, 2nd y-coordinate, 1st z-coordinate), XX component&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;timeseries for nt steps, for (last x-coordinate, last y-coordinate, last z-coordinate), YZ component&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PostAWP&lt;br /&gt;
&lt;br /&gt;
Used by: CheckSgt, DirectSynth&lt;br /&gt;
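The only difference from the AWP SGT layout is the point ordering (fast Z, Y, X here instead of fast Z, X, Y). A minimal sketch of the index remapping, with hypothetical names (this is not the actual PostAWP code):&lt;br /&gt;

```python
def awp_to_rwg_point_index(i_awp, nx, ny, nz):
    """Map a linear point index in AWP order (fast Z, then X, then Y)
    to the corresponding linear index in RWG order (fast Z, then Y, then X)."""
    y, rem = divmod(i_awp, nx * nz)   # AWP: i = (y*nx + x)*nz + z
    x, z = divmod(rem, nz)
    return (x * ny + y) * nz + z      # RWG: i = (x*ny + y)*nz + z
```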
&lt;br /&gt;
=== SGT header file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: SGT header information, used to parse and understand SGT files&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_f&amp;lt;x or y&amp;gt;_&amp;lt;run id&amp;gt;.sgthead&lt;br /&gt;
&lt;br /&gt;
Format: binary.  It consists of three sections:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtmaster structure, described below in C.  Its information can be used to set up data structures to read the rest of the SGTs.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtmaster&lt;br /&gt;
   {&lt;br /&gt;
   int geoproj;     /* =0: RWG local flat earth; =1: RWG great circle arcs; =2: UTM */&lt;br /&gt;
   float modellon;  /* longitude of geographic origin */&lt;br /&gt;
   float modellat;  /* latitude of geographic origin */&lt;br /&gt;
   float modelrot;  /* rotation of y-axis from south (clockwise positive)   */&lt;br /&gt;
   float xshift;    /* xshift of cartesian origin from geographic origin */&lt;br /&gt;
   float yshift;    /* yshift of cartesian origin from geographic origin */&lt;br /&gt;
   int globnp;      /* total number of SGT locations (entire model) */&lt;br /&gt;
   int localnp;     /* local number of SGT locations (this file only) */&lt;br /&gt;
   int nt;          /* number of time points                                */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtindex structures, described below in C.  There is one of these for each point in the SGTs, and they're used to determine the X/Y/Z indices of all the SGT points.  Note that the current way of packing the X, Y, Z coordinates into the long long allows for 6 decimal digits (so a maximum of 1M grid points) per dimension.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtindex   /* indices for all 'globnp' SGT locations */&lt;br /&gt;
   {&lt;br /&gt;
   long long indx; /* indx= xsgt*1000000000000 + ysgt*1000000 + zsgt */&lt;br /&gt;
   int xsgt;     /* x grid location */&lt;br /&gt;
   int ysgt;     /* y grid location */&lt;br /&gt;
   int zsgt;     /* z grid location */&lt;br /&gt;
   float h;         /* grid spacing                                         */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;The sgtheader structures, described below in C.  There is one of these for each point in the SGTs.  They're used when we perform reciprocity.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
struct sgtheader&lt;br /&gt;
   {&lt;br /&gt;
   long long indx;  /* index of this SGT */&lt;br /&gt;
   int geoproj;     /* =0: RWG local flat earth; =1: RWG great circle arcs; =2: UTM */&lt;br /&gt;
   float modellon;  /* longitude of geographic origin */&lt;br /&gt;
   float modellat;  /* latitude of geographic origin */&lt;br /&gt;
   float modelrot;  /* rotation of y-axis from south (clockwise positive)   */&lt;br /&gt;
   float xshift;    /* xshift of cartesian origin from geographic origin */&lt;br /&gt;
   float yshift;    /* yshift of cartesian origin from geographic origin */&lt;br /&gt;
   int nt;          /* number of time points                                */&lt;br /&gt;
   float xazim;     /* azimuth of X-axis in FD model (clockwise from north) */&lt;br /&gt;
   float dt;        /* time sampling                                        */&lt;br /&gt;
   float tst;       /* start time of 1st point in GF                        */&lt;br /&gt;
   float h;         /* grid spacing                                         */&lt;br /&gt;
   float src_lat;   /* site latitude */&lt;br /&gt;
   float src_lon;   /* site longitude */&lt;br /&gt;
   float src_dep;   /* site depth */&lt;br /&gt;
   int xsrc;        /* x grid location for source (station in recip. exp.)  */&lt;br /&gt;
   int ysrc;        /* y grid location for source (station in recip. exp.)  */&lt;br /&gt;
   int zsrc;        /* z grid location for source (station in recip. exp.)  */&lt;br /&gt;
   float sgt_lat;   /* SGT location latitude */&lt;br /&gt;
   float sgt_lon;   /* SGT location longitude */&lt;br /&gt;
   float sgt_dep;   /* SGT location depth */&lt;br /&gt;
   int xsgt;        /* x grid location for output (source in recip. exp.)   */&lt;br /&gt;
   int ysgt;        /* y grid location for output (source in recip. exp.)   */&lt;br /&gt;
   int zsgt;        /* z grid location for output (source in recip. exp.)   */&lt;br /&gt;
   float cdist;     /* straight-line distance btw site and SGT location */&lt;br /&gt;
   float lam;       /* lambda [in dyne/(cm*cm)] at output point             */&lt;br /&gt;
   float mu;        /* rigidity [in dyne/(cm*cm)] at output point           */&lt;br /&gt;
   float rho;       /* density [in gm/(cm*cm*cm)] at output point           */&lt;br /&gt;
   float xmom;      /* moment strength of x-oriented force in this run      */&lt;br /&gt;
   float ymom;      /* moment strength of y-oriented force in this run      */&lt;br /&gt;
   float zmom;      /* moment strength of z-oriented force in this run      */&lt;br /&gt;
   };&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Overall, then, the format for the file is:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;sgtmaster&amp;gt;&lt;br /&gt;
&amp;lt;sgtindex for point 1&amp;gt;&lt;br /&gt;
&amp;lt;sgtindex for point 2&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;sgtindex for point globnp&amp;gt;&lt;br /&gt;
&amp;lt;sgtheader for point 1&amp;gt;&lt;br /&gt;
&amp;lt;sgtheader for point 2&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;sgtheader for point globnp&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: PostAWP&lt;br /&gt;
&lt;br /&gt;
Used by: DirectSynth&lt;br /&gt;
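As a hedged illustration, the leading sgtmaster structure and the packed indx values can be read with Python's struct module, assuming little-endian byte order and no padding (which matches the layouts above on common platforms; all names here are illustrative):&lt;br /&gt;

```python
import struct

# geoproj, modellon..yshift (5 floats), globnp, localnp, nt
SGTMASTER = struct.Struct("<i5f3i")

def read_sgtmaster(data):
    """Unpack the leading sgtmaster structure from raw header bytes."""
    geoproj, lon, lat, rot, xshift, yshift, globnp, localnp, nt = \
        SGTMASTER.unpack_from(data, 0)
    return {"geoproj": geoproj, "globnp": globnp, "localnp": localnp, "nt": nt}

def decode_indx(indx):
    """Recover (x, y, z) grid indices from a packed sgtindex.indx value:
    indx = x*10**12 + y*10**6 + z, i.e. 6 decimal digits per dimension."""
    x, rem = divmod(indx, 10**12)
    y, z = divmod(rem, 10**6)
    return x, y, z
```

The sgtindex and sgtheader records that follow can be unpacked the same way, using format strings built from the C declarations above.&lt;br /&gt;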
&lt;br /&gt;
=== Velocity Info file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains the 3D velocity information needed for stochastic jobs&lt;br /&gt;
&lt;br /&gt;
Filename convention: velocity_info_&amp;lt;site&amp;gt;.txt&lt;br /&gt;
&lt;br /&gt;
Format: Text format, three lines:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
Vs30 = &amp;lt;Vs30 value&amp;gt;&lt;br /&gt;
Vs500 = &amp;lt;Vs500 value&amp;gt;&lt;br /&gt;
VsD500 = &amp;lt;VsD500 value&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: Velocity Info job&lt;br /&gt;
&lt;br /&gt;
Used by: Sub Stoch DAX generator, to add these values as command-line arguments to HF Synth and LF Site Response jobs.&lt;br /&gt;
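A minimal parser sketch for this three-line format (the helper name is hypothetical, not part of the CyberShake tools):&lt;br /&gt;

```python
def parse_velocity_info(text):
    """Parse 'Key = value' lines into a dict of floats,
    e.g. {'Vs30': ..., 'Vs500': ..., 'VsD500': ...}."""
    values = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, val = line.partition("=")
            values[key.strip()] = float(val)
    return values
```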
&lt;br /&gt;
=== rupture list file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a list of all the ruptures for which low-frequency seismograms should be synthesized.  This file is used to construct the tasks in DirectSynth.  The number of rows, columns, and magnitude are included because DirectSynth uses this information to determine how much memory the tasks will use.  This file is constructed at abstract workflow creation time.&lt;br /&gt;
&lt;br /&gt;
Filename convention: rupture_file_list_&amp;lt;site&amp;gt;_&amp;lt;run_id&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: Text format, with the number of ruptures and then 1 line per rupture:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of ruptures N&amp;gt;&lt;br /&gt;
&amp;lt;rupture geometry filename 1&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
&amp;lt;rupture geometry filename 2&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;rupture geometry filename N&amp;gt; &amp;lt;number of hypocenters&amp;gt; &amp;lt;number of slips per hypocenter&amp;gt; &amp;lt;number of rows in the rupture geometry&amp;gt; &amp;lt;number of columns in the rupture geometry&amp;gt; &amp;lt;magnitude&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Used by: DirectSynth&lt;br /&gt;
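A sketch of parsing this format (hypothetical helper; the rupture filename in the test is a made-up fixture):&lt;br /&gt;

```python
def parse_rupture_list(text):
    """Parse the rupture list: the first line is the rupture count N, then
    N lines of 'filename num_hypos num_slips num_rows num_cols magnitude'."""
    lines = text.splitlines()
    n = int(lines[0])
    ruptures = []
    for line in lines[1:1 + n]:
        fname, hypos, slips, rows, cols, mag = line.split()
        ruptures.append({
            "file": fname,
            "hypocenters": int(hypos),
            "slips": int(slips),
            "rows": int(rows),
            "cols": int(cols),
            "magnitude": float(mag),
        })
    return ruptures
```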
&lt;br /&gt;
=== rupture variation info file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  This file provides the rvfrac (rupture velocity as a fraction of the shear wave velocity) and random seed for each rupture variation.  It is constructed at abstract workflow creation time, using values from the database.  It is highly recommended that these values be stored somewhere for reproducibility, and that the same values be used for all sites so that the rupture variations are identical across sites.&lt;br /&gt;
&lt;br /&gt;
Filename convention: rvfrac_seed_values_&amp;lt;site&amp;gt;_&amp;lt;run_id&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:  Text format, with the number of rupture variations and then 1 line per rupture variation:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of rupture variations N&amp;gt;&lt;br /&gt;
&amp;lt;source ID 1&amp;gt; &amp;lt;rupture ID 1&amp;gt; &amp;lt;rupture variation ID 1&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
&amp;lt;source ID 2&amp;gt; &amp;lt;rupture ID 2&amp;gt; &amp;lt;rupture variation ID 2&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;source ID N&amp;gt; &amp;lt;rupture ID N&amp;gt; &amp;lt;rupture variation ID N&amp;gt; &amp;lt;rvfrac&amp;gt; &amp;lt;random seed&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Used by: DirectSynth, when linked with RupGen-v5.5.2 or later&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== BBP 1D Velocity file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains 1D velocity profile information&lt;br /&gt;
&lt;br /&gt;
Filename convention: The only one currently in use in CyberShake is /home/scec-02/cybershk/runs/genslip_nr_generic1d-gp01.vmod .&lt;br /&gt;
&lt;br /&gt;
Format: Text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of thickness layers L&amp;gt;&lt;br /&gt;
&amp;lt;layer 1 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
&amp;lt;layer 2 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;layer L thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;not used&amp;gt; &amp;lt;not used&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that the last layer has thickness 999.0.&lt;br /&gt;
&lt;br /&gt;
Generated by: Rob Graves&lt;br /&gt;
&lt;br /&gt;
Used by: Local VM job&lt;br /&gt;
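A short sketch of reading this profile and converting layer thicknesses into top/bottom depths (illustrative names; the final 999.0 km layer effectively acts as a half-space):&lt;br /&gt;

```python
def read_1d_profile(text):
    """Parse the 1D profile: the first line is the layer count L, then L
    lines of 'thickness_km vp vs rho <unused> <unused>'. Returns a list of
    (top_depth_km, bottom_depth_km, vp, vs, rho) tuples."""
    lines = text.splitlines()
    nlayers = int(lines[0])
    layers, top = [], 0.0
    for line in lines[1:1 + nlayers]:
        thick, vp, vs, rho = [float(t) for t in line.split()[:4]]
        layers.append((top, top + thick, vp, vs, rho))
        top += thick
    return layers
```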
&lt;br /&gt;
=== Local VM file ===&lt;br /&gt;
&lt;br /&gt;
Purpose:  Contains 1D velocity profile information for use with stochastic codes&lt;br /&gt;
&lt;br /&gt;
Filename convention: The only one currently in use in CyberShake is LA_Basin_BBP_14.3.0.local .&lt;br /&gt;
&lt;br /&gt;
Format: Text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;number of thickness layers L&amp;gt;&lt;br /&gt;
&amp;lt;layer 1 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
&amp;lt;layer 2 thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;layer L thickness in km&amp;gt; &amp;lt;Vp&amp;gt; &amp;lt;Vs&amp;gt; &amp;lt;density&amp;gt; &amp;lt;Qp&amp;gt; &amp;lt;Qs&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that the last layer has thickness '0.0', indicating it has no bottom.&lt;br /&gt;
&lt;br /&gt;
Generated by: Local VM Job&lt;br /&gt;
&lt;br /&gt;
Used by: &lt;br /&gt;
&lt;br /&gt;
=== Missing variations file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Lists the variations which the Check DB stage has found are missing.&lt;br /&gt;
&lt;br /&gt;
Filename convention: DB_Check_Out_&amp;lt;PSA or RotD or Duration&amp;gt;_&amp;lt;site&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: For each source and rupture pair with missing variations, the following record is output in text format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
&amp;lt;source ID&amp;gt; &amp;lt;rupture ID&amp;gt; &amp;lt;number N of missing rupture variations&amp;gt;&lt;br /&gt;
&amp;lt;ID of first missing rupture variation&amp;gt;&lt;br /&gt;
&amp;lt;ID of second missing rupture variation&amp;gt;&lt;br /&gt;
...&lt;br /&gt;
&amp;lt;ID of Nth missing rupture variation&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Originally, a file in this format could be directly fed back into the DAX generator, but that capability has not been used for many years and may no longer be functional.&lt;br /&gt;
&lt;br /&gt;
Generated by: Check DB Site&lt;br /&gt;
&lt;br /&gt;
Used by: none&lt;br /&gt;
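A sketch of parsing these variable-length records (hypothetical helper; since each variation ID sits on its own line, splitting on whitespace is sufficient):&lt;br /&gt;

```python
def parse_missing_variations(text):
    """Parse Check DB output: records of '<source> <rupture> <N>'
    followed by N lines, one missing rupture variation ID each.
    Returns {(source_id, rupture_id): [variation_ids]}."""
    tokens = iter(text.split())
    missing = {}
    for src in tokens:
        rup, n = int(next(tokens)), int(next(tokens))
        missing[(int(src), rup)] = [int(next(tokens)) for _ in range(n)]
    return missing
```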
&lt;br /&gt;
=== DB Report file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Provides PSA data for a run in a text format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_report_&amp;lt;date&amp;gt;.txt&lt;br /&gt;
&lt;br /&gt;
Format: It's a text file with the following header:&lt;br /&gt;
 Site_Name       ERF_ID  Source_ID       Rupture_ID      Rup_Var_ID      Rup_Var_Scenario_ID     Mag     Prob    Grid_Spacing    Num_Rows        Num_Columns     Period  Component       SA&lt;br /&gt;
The file is sorted with Rup_Var_ID varying fastest, then Rupture_ID, Source_ID, and Period, with Component varying slowest.&lt;br /&gt;
&lt;br /&gt;
Generated by: DB Report&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
=== Hazard Curve ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains a hazard curve, either in text, PNG, or PDF format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_Run&amp;lt;run id&amp;gt;_&amp;lt;IM type&amp;gt;_&amp;lt;period&amp;gt;sec_&amp;lt;IM component&amp;gt;_&amp;lt;date run completed&amp;gt;.&amp;lt;pdf|txt|png&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format:  The PNG and PDF formats contain an image of the curve.  The PDF format also has an extended legend.  The TXT file contains a list of (X,Y) points which describe the curve.&lt;br /&gt;
&lt;br /&gt;
Generated by: Curve Calc&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
=== Disaggregation file ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Contains disaggregation results for a single run, in either text, PNG, or PDF format.&lt;br /&gt;
&lt;br /&gt;
Filename convention: &amp;lt;site&amp;gt;_ERF&amp;lt;erf id&amp;gt;_Run&amp;lt;run_id&amp;gt;_Disagg&amp;lt;POE|IM&amp;gt;_&amp;lt;disagg level&amp;gt;_&amp;lt;IM type&amp;gt;_&amp;lt;period&amp;gt;sec_&amp;lt;run date&amp;gt;.&amp;lt;txt|png|pdf&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Format: The PNG and PDF formats contain a plot of the disaggregation results, showing magnitude vs distance and color-coding based on epsilon.  The PDF and TXT formats contain additional information about individual source contributions, in the following format:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;Summary data&lt;br /&gt;
Parameters used to create disaggregation&lt;br /&gt;
Disaggregation bin data:&lt;br /&gt;
Dist Mag &amp;lt;breakout by epsilon values&amp;gt;&lt;br /&gt;
&amp;lt;Breakdown of contribution by distance, magnitude, and epsilon range&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Disaggregation Source List Info:&lt;br /&gt;
Source# %Contribution TotExceedRate SourceName DistRup DistX DistSeis DistJB&lt;br /&gt;
&amp;lt;list of contributing sources, in decreasing order of % contribution&amp;gt;&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Generated by: Disaggregation&lt;br /&gt;
&lt;br /&gt;
Used by: none, output data product&lt;br /&gt;
&lt;br /&gt;
== Dependencies ==&lt;br /&gt;
&lt;br /&gt;
The following are external software dependencies used by CyberShake software modules.&lt;br /&gt;
&lt;br /&gt;
=== Getpar ===&lt;br /&gt;
&lt;br /&gt;
Purpose: A library written in C which enables parsing of key-value command-line parameters, and enforcement of required parameters.  Rob Graves uses it in his codes.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Rob supplied a copy; it is in the CyberShake repository at https://github.com/SCECcode/cybershake-core/tree/main/Getpar .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Run 'make' in Getpar/getpar/src; this will make the library, libget.a, and install it in the lib directory, where CyberShake codes will expect it.&lt;br /&gt;
&lt;br /&gt;
=== MySQLdb ===&lt;br /&gt;
&lt;br /&gt;
This library has been deprecated in favor of pymysql.&lt;br /&gt;
&lt;br /&gt;
=== pymysql ===&lt;br /&gt;
&lt;br /&gt;
Purpose: MySQL bindings for Python.&lt;br /&gt;
&lt;br /&gt;
How to obtain: pip3 install pymysql .  Documentation is at https://pypi.org/project/PyMySQL/ .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: None; pip3 shouldn't have any issues.&lt;br /&gt;
&lt;br /&gt;
=== UCVM ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Supplies the query tools needed to populate a mesh with velocity information.&lt;br /&gt;
&lt;br /&gt;
How to obtain:  The most recent version of UCVM can be found at [[UCVM#Current_UCVM_Software_Releases|Current UCVM Software Releases]].  As of October 2017, we have only integrated the C version of UCVM into CyberShake.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Following the standard installation instructions for a cluster should work (running ./ucvm_setup.py).  You will want to install the CVM-S4, CVM-S4.26, CVM-S4.26.M01, CVM-H, CenCal, CCA-06, and CCA 1D velocity models for CyberShake.&lt;br /&gt;
&lt;br /&gt;
=== libcfu ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Provides a hash table library for a variety of CyberShake codes.&lt;br /&gt;
&lt;br /&gt;
How to obtain: https://sourceforge.net/projects/libcfu/ .  Documentation is at http://libcfu.sourceforge.net/libcfu.html .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Follow the instructions, and install into the utils directory.&lt;br /&gt;
&lt;br /&gt;
=== FFTW ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Library which provides FFTs.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Typically installed on supercomputers already, though you may need to load a module to activate it.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Doesn't need to be installed in user space.&lt;br /&gt;
&lt;br /&gt;
=== memcached ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Server library for running a memory caching system, using key-value pairs.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Download from https://memcached.org/ .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Follow instructions and install in utils directory.  It has a dependency on libevent, which you may have to install also.&lt;br /&gt;
&lt;br /&gt;
=== libmemcached ===&lt;br /&gt;
&lt;br /&gt;
Purpose: Client library and tools for memcached server.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Download from http://libmemcached.org/libMemcached.html .  Install memcached first.&lt;br /&gt;
&lt;br /&gt;
Special installation instructions: Sometimes installing can be a challenge, because it can't find the memcached install.  You may have to set the path to memcached as an argument to configure, or you may even need to edit the configure script and makefiles directly.  Install into the utils directory.&lt;br /&gt;
&lt;br /&gt;
=== RupGen-api ===&lt;br /&gt;
&lt;br /&gt;
Purpose: To generate rupture variations from a rupture geometry for a given hypocenter and slip using the Graves &amp;amp; Pitarka rupture generator.  The current version is 5.5.2.&lt;br /&gt;
&lt;br /&gt;
How to obtain: Check out from https://github.com/SCECcode/cybershake-core/tree/main/RuptureCodes/RupGen-api-5.5.2 .&lt;br /&gt;
&lt;br /&gt;
Special installation instructions:&lt;br /&gt;
&lt;br /&gt;
#This code is dependent on FFTW.  You may need to edit the makefile to point to the FFTW include files and libraries, since different clusters often use different environment variables to capture FFTW paths.&lt;br /&gt;
#If you want memcached support, edit the makefile in RuptureCodes/RupGen-api-5.5.2/src to uncomment lines 19-21 and edit line 19 to point to the libmemcached install directory.  You'll need to do the same to RuptureCodes/RupGen-api-5.5.2/src/GenRandV5.0/makefile, lines 35-37.&lt;br /&gt;
#You may need to edit CFLAGS in RuptureCodes/RupGen-api-5.5.2/src/GenRandV5.5.2/makefile (lines 21-23) to point to the FFTW path; whether or not this is needed depends on the particular system.&lt;br /&gt;
#Run 'make' in RupGen-api-5.5.2 to make the librupgen.a library.&lt;br /&gt;
&lt;br /&gt;
=== SCEC Broadband Platform ===&lt;br /&gt;
&lt;br /&gt;
Purpose: To generate high-frequency stochastic seismograms for broadband CyberShake runs.&lt;br /&gt;
&lt;br /&gt;
How to obtain:  Follow installation instructions at https://github.com/SCECcode/bbp .&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_ucvm_with_sw4_using_cvmsi&amp;diff=30574</id>
		<title>UCVM ucvm with sw4 using cvmsi</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_ucvm_with_sw4_using_cvmsi&amp;diff=30574"/>
		<updated>2025-11-06T22:39:41Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Difference in Vp and Vs GTL depth traced to CVM-S4 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Questions 11/2025 ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
LBNL TARGET: &lt;br /&gt;
Performing 5 Hz simulations based on CVM-S4.26.M01, applying an additional GTL with &lt;br /&gt;
the Ely tapering method. To meet the high-frequency simulation requirements, the velocity model &lt;br /&gt;
needs some additional modifications. &lt;br /&gt;
&lt;br /&gt;
=== What causes the Ripple-like pattern (bullseyes) in the cvmsi? ===&lt;br /&gt;
&lt;br /&gt;
  Region of interest: -118.4232,33.58, -117.3459,34.3519&lt;br /&gt;
&lt;br /&gt;
The bullseyes are due to local borehole information that is included in CVM-S (rule-based) and its GTL processing.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:bullseyes_vs_cvms.png|thumb|300px|cvms bullseyes vs]] &lt;br /&gt;
| [[FILE:bullseyes_vs_cvms5.png|thumb|300px|cvms5 bullseyes vs]]&lt;br /&gt;
| [[FILE:bullseyes_vs_cvmsi.png|thumb|300px|cvmsi bullseyes vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
LBNL: Using the taper with -z 0,700 -L 200,700,1500 to mitigate the feature: &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vs_taper_ripple_z700-200-500-700.png|thumb|300px|cvmsi taper vs]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Transition depth to high velocity differs between Vp (~400 m) and Vs (~250 m) ===&lt;br /&gt;
&lt;br /&gt;
LBNL has a variable grid size/spacing based on depth: 50m grid for 0-650m depth, &lt;br /&gt;
then 100m grid for 650-2500m, 200m grid for 2500-6000m, 400m grid for 6000m and below. &lt;br /&gt;
The horizontal and vertical spacing is the same within the same depth range.&lt;br /&gt;
&lt;br /&gt;
 Region of interest:   33.599659 -117.505759, 34.311847 -117.501624&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvms_vp.png|thumb|300px|cvms transition vp]] &lt;br /&gt;
| [[FILE:transition_cvms_vs.png|thumb|300px|cvms transition vs]]&lt;br /&gt;
| [[FILE:transition_cvms_density.png|thumb|300px|cvms transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvms5_vp.png|thumb|300px|cvms5 transition vp]] &lt;br /&gt;
| [[FILE:transition_cvms5_vs.png|thumb|300px|cvms5 transition vs]]&lt;br /&gt;
| [[FILE:transition_cvms5_density.png|thumb|300px|cvms5 transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvmsi_vp_2.png|thumb|300px|cvmsi transition vp]] &lt;br /&gt;
| [[FILE:transition_cvmsi_vs.png|thumb|300px|cvmsi transition vs]]&lt;br /&gt;
| [[FILE:transition_cvmsi_density.png|thumb|300px|cvmsi transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The difference in GTL depth bounds for CVM-S4.26.M01 matches CVM-S4.&lt;br /&gt;
&lt;br /&gt;
about density: https://strike.scec.org/scecpedia/CVM-S4.26#Density_Based_on_Vs&lt;br /&gt;
&lt;br /&gt;
=== Adding elygtl:taper introduces discontinuities ===&lt;br /&gt;
&lt;br /&gt;
cvmsi only, with no taper:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_only.png|thumb|300px|cvmsi vp]] &lt;br /&gt;
| [[FILE:cvmsi_vs_only.png|thumb|300px|cvmsi vs]]&lt;br /&gt;
| [[FILE:cvmsi_density_only.png|thumb|300px|cvmsi density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
cvmsi with -z0,700 and -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper vp 700z]] &lt;br /&gt;
| [[FILE:cvmsi_vs_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper vs 700z]]&lt;br /&gt;
| [[FILE:cvmsi_density_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper density 700z]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
extending z range with -z0,1000 and -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper vp 1000z]] &lt;br /&gt;
| [[FILE:cvmsi_vs_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper vs 1000z]]&lt;br /&gt;
| [[FILE:cvmsi_density_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper density 1000z]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Difference in Vp and Vs GTL depth traced to CVM-S4 ===&lt;br /&gt;
&lt;br /&gt;
We selected (33.95575, -117.50369), in the middle of the cross-sections above, and ran this point at a variety of depths through the CVM-S3 and CVM-S4 implementations from https://scedc.caltech.edu/data/3d-velocity.html . &lt;br /&gt;
&lt;br /&gt;
In CVM-S version 3, we can see that there is a transition between 720 and 760m:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
  Lat      Lon         Depth(m)   Vp       Vs      Rho&lt;br /&gt;
33.95575 -117.50369    120.00   1790.2    730.8   1795.2&lt;br /&gt;
33.95575 -117.50369    160.00   1999.7    816.4   1888.2&lt;br /&gt;
33.95575 -117.50369    200.00   2322.3    948.1   2031.5&lt;br /&gt;
33.95575 -117.50369    240.00   2418.5   1000.0   2074.2&lt;br /&gt;
33.95575 -117.50369    280.00   2505.4   1000.0   2129.9&lt;br /&gt;
33.95575 -117.50369    320.00   2585.1   1169.1   2143.7&lt;br /&gt;
33.95575 -117.50369    360.00   2659.2   1218.6   2156.6&lt;br /&gt;
33.95575 -117.50369    400.00   2728.5   1265.4   2168.6&lt;br /&gt;
33.95575 -117.50369    440.00   2794.0   1310.0   2179.9&lt;br /&gt;
33.95575 -117.50369    480.00   2856.0   1352.5   2190.7&lt;br /&gt;
33.95575 -117.50369    520.00   2915.1   1393.3   2200.9&lt;br /&gt;
33.95575 -117.50369    560.00   2971.6   1432.5   2210.7&lt;br /&gt;
33.95575 -117.50369    600.00   3025.8   1470.4   2220.1&lt;br /&gt;
33.95575 -117.50369    640.00   3077.9   1507.0   2229.1&lt;br /&gt;
33.95575 -117.50369    680.00   3128.2   1542.5   2237.9&lt;br /&gt;
33.95575 -117.50369    720.00   3176.7   1577.0   2246.3&lt;br /&gt;
33.95575 -117.50369    760.00   5903.3   3424.9   2718.9&lt;br /&gt;
33.95575 -117.50369    800.00   5903.3   3424.9   2718.9&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In CVM-S version 4, we see that the transition is now between 160 and 200m for Vp and rho, but between 280 and 320m for Vs:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
  Lat      Lon         Depth(m)   Vp       Vs      Rho&lt;br /&gt;
33.95575 -117.50369    120.00   2770.8    980.0   2302.5&lt;br /&gt;
33.95575 -117.50369    160.00   4188.8   1000.0   2526.4&lt;br /&gt;
33.95575 -117.50369    200.00   5903.2   1000.0   2797.1&lt;br /&gt;
33.95575 -117.50369    240.00   5903.3   1000.0   2797.1&lt;br /&gt;
33.95575 -117.50369    280.00   5903.3   1000.0   2797.1&lt;br /&gt;
33.95575 -117.50369    320.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    360.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    400.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    440.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    480.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    520.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    560.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    600.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    640.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    680.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    720.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    760.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    800.00   5903.3   3424.9   2797.1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The readme for CVM-S version 4 states that changes were made to the San Bernardino Valley, so perhaps that is what led to this change.&lt;br /&gt;
&lt;br /&gt;
== Question 9/2025 ==&lt;br /&gt;
&lt;br /&gt;
Question from Rie to Scott:&lt;br /&gt;
&lt;br /&gt;
We are conducting SW4 simulations within the LA Basin for both small and large magnitude earthquake events &lt;br /&gt;
(please see the attached slides for the regions we are interested in). While validating the ground &lt;br /&gt;
motions, we have come across several features in the velocity model (built from the SCEC model) that we &lt;br /&gt;
were hoping to better understand with your insight.&lt;br /&gt;
&lt;br /&gt;
Blocky structure north of LA (highlighted in red in the slides):&lt;br /&gt;
  We observed a blocky structure characterized by elevated Vs compared to the surrounding region. This structure appears to disappear at 50 m depth and then re-emerges below. Is there a specific reason why this structure is not present at the 50 m depth level?&lt;br /&gt;
&lt;br /&gt;
Vp–Vs relationship of the blocky structure (green color):&lt;br /&gt;
  The structure shows higher Vs values, with Vp appearing higher at the surface (Z = 0 m) but becoming lower at 250 m depth. We also noted that the density is relatively low at the surface. &lt;br /&gt;
&lt;br /&gt;
Vp–density relationship west of the blocky structure:&lt;br /&gt;
  In this nearby region, the density is higher than that of the surrounding material, but Vp appears to be lower. (I expect Vp correlates with density)&lt;br /&gt;
&lt;br /&gt;
Would you know of any literature or documentation that describe these characteristics?&lt;br /&gt;
&lt;br /&gt;
=== Background of the posted question ===&lt;br /&gt;
&lt;br /&gt;
Accessing the UCVM velocity model via SW4's UCVM reader branch and then running the SW4 simulation.&lt;br /&gt;
&lt;br /&gt;
  For CVM-S4.26.M01 (cvmsi), the velocity values are extracted using the UCVM &amp;quot;withSCPBR&amp;quot; branch.&lt;br /&gt;
&lt;br /&gt;
internal query parameters were:&lt;br /&gt;
&lt;br /&gt;
  ucvm_query -f conf/ucvm.conf -m cvmsi,elygtl:taper -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
Is this version of the velocity model recommended for ground motion simulation? (another question)&lt;br /&gt;
&lt;br /&gt;
=== Response from Scott ===&lt;br /&gt;
&lt;br /&gt;
It looks like the blocky structure appears in the original CVM-S4 (https://strike.scec.org/scecpedia/UCVM_ucvm_with_sw4_using_cvmsi#Base_cvm, top row), and therefore also appears in CVM-S4.26 and CVM-S4.26.M01, both of which were derived from it.  I believe this is the northern edge of the LA basin as defined in the original CVM-S4.  When using this model for CyberShake Study 22.12 with the Ely taper, we didn't see this sharp east-west boundary.  You can see our cross sections at https://strike.scec.org/scecpedia/CyberShake_Study_22.12#Cross-sections, in the 3rd column.  However, we ran with a minimum Vs of 500 m/s and 100m grid spacing.&lt;br /&gt;
&lt;br /&gt;
For question 1), I think this has to do with the way the merged taper is applied.  The taper is constrained by the Thompson et al. (2022) Vs30 values, but every mesh point is evaluated when determining whether or not to apply the taper.  In other words, at (34.0, -118.0, 0m), the Vs values are calculated with and without the taper, and the method with the lowest Vs is selected.  But then this process is repeated at 50m, 100m, etc., so at some depths the taper might be selected and at others not.  I pulled Vs profiles of the top 500m for (-118, 34.1), which is in the dark blue region; (-118, 34.2), in the teal region; and (-118, 34.3), in the lime green region (attached as taper 34_1.png, taper 34_2.png, taper 34_3.png).  From the smoothness of the profiles, it looks like the taper is always used for 34.1 and 34.3, but not always for 34.2.  As a result, sometimes the points in the teal region are closer to the lime green region, meaning the blocky structure disappears, and sometimes they are closer to the dark blue region, meaning the blocky structure reappears.&lt;br /&gt;
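The per-depth selection logic described above can be sketched as follows. This is a hypothetical illustration, not the UCVM implementation: at each depth the Vs with and without the taper are computed independently and the smaller one wins, which is why the taper can be active at some depths of a profile and inactive at others:

```python
# Sketch of the merged-taper selection described above.
# vs_model and vs_taper are hypothetical stand-ins for the model's
# native Vs and the Ely-taper Vs at one (lat, lon, depth) point.

def select_vs(vs_model, vs_taper):
    # The method yielding the lower Vs is chosen at each point.
    return min(vs_model, vs_taper)

def vs_profile(depths, vs_model_fn, vs_taper_fn):
    # The selection repeats independently at every depth, so the
    # winner can alternate down a profile, which is what makes the
    # blocky structure disappear and re-emerge between depth slices.
    return [select_vs(vs_model_fn(z), vs_taper_fn(z)) for z in depths]
```

For example, with a constant model Vs and a taper Vs that increases with depth, the taper is selected near the surface and the native model takes over below.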
&lt;br /&gt;
For 2), it looks to me like at the surface Vp~1100 m/s and Vs~400 m/s, and at 250m depth Vp~4500 m/s and Vs~1800 m/s, since the color bars change dramatically from z=0 to z=250m.&lt;br /&gt;
&lt;br /&gt;
For 3), I extracted velocity profiles with and without the merged taper.  In the region you highlighted, even without the taper the Vp value hits the floor at 700 m/s, but density is about 2000 kg/m3 (see attached &amp;quot;all props no taper.png&amp;quot;).  This is also true in the original CVM-S4 model.  So this isn't a result of the taper, but rather a native feature.  Since I wasn't involved in the original model construction, I'm not sure what the logic is for these values, but they do seem to be what was intended in CVM-S4 and its derivatives.&lt;br /&gt;
&lt;br /&gt;
Please let me know if you have follow-up questions.  We can also discuss modifications to the model if you'd like to avoid some of these features; for example, I know Hu et al. (2022) [https://doi.org/10.1093/gji/ggac175] used an alternative method for deciding where to apply the taper.&lt;br /&gt;
&lt;br /&gt;
=== Plots from explorer with the posted parameters ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_map.png|thumb|300px|cvmsi taper200 map]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_0m.png|thumb|300px|cvmsi taper200 0m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_50m.png|thumb|300px|cvmsi taper200 50m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_100m.png|thumb|300px|cvmsi taper200 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_250m.png|thumb|300px|cvmsi taper200 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line1_map.png|thumb|300px|cvmsi taper200 line1]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line1.png|thumb|300px|cvmsi taper200 line1]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line2_map.png|thumb|300px|cvmsi taper200 line2]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line2.png|thumb|300px|cvmsi taper200 line2]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line3_map.png|thumb|300px|cvmsi taper200 line3]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line3.png|thumb|300px|cvmsi taper200 line3]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt_map.png|thumb|300px|cvmsi taper200 pt]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt1.png|thumb|300px|cvmsi taper200 pt1]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt2.png|thumb|300px|cvmsi taper200 pt2]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt3.png|thumb|300px|cvmsi taper200 pt3]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt4.png|thumb|300px|cvmsi taper200 pt4]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Base cvm ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvms_0m.png|thumb|300px|cvms 0m]] &lt;br /&gt;
| [[FILE:sw4_cvms5_0m.png|thumb|300px|cvms5 0m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_0m.png|thumb|300px|cvmsi 0m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_50m.png|thumb|300px|cvmsi 50m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_100m.png|thumb|300px|cvmsi 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_250m.png|thumb|300px|cvmsi 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_0m.png|thumb|300px|cvmsi all 0m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_all_50m.png|thumb|300px|cvmsi all 50m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_100m.png|thumb|300px|cvmsi all 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_250m.png|thumb|300px|cvmsi all 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Looking at the basin:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_basin_a.png|thumb|300px|cvmsi basin a]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_basin_b.png|thumb|300px|cvmsi basin b]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== using elygtl:ely ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_ely0_0m.png|thumb|300px|cvmsi ely0 0m]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_ucvm_with_sw4_using_cvmsi&amp;diff=30573</id>
		<title>UCVM ucvm with sw4 using cvmsi</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_ucvm_with_sw4_using_cvmsi&amp;diff=30573"/>
		<updated>2025-11-06T22:37:48Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Difference in Vp and Vs GTL depth traced to CVM-S4 */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Questions 11/2025 ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
LBNL TARGET: &lt;br /&gt;
Performing 5 Hz simulations based on CVM-S4.26.M01 and applying an additional GTL with &lt;br /&gt;
the Ely tapering method. For high-frequency simulation requirements, the velocity model &lt;br /&gt;
needs some additional modifications. &lt;br /&gt;
&lt;br /&gt;
=== What causes the Ripple-like pattern (bullseyes) in the cvmsi? ===&lt;br /&gt;
&lt;br /&gt;
  Region of interest: -118.4232,33.58, -117.3459,34.3519&lt;br /&gt;
&lt;br /&gt;
Due to local borehole information that is included in the rule-based CVM-S via its GTL processing.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:bullseyes_vs_cvms.png|thumb|300px|cvms bullseyes vs]] &lt;br /&gt;
| [[FILE:bullseyes_vs_cvms5.png|thumb|300px|cvms5 bullseyes vs]]&lt;br /&gt;
| [[FILE:bullseyes_vs_cvmsi.png|thumb|300px|cvmsi bullseyes vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
LBNL: Using the taper with -z 0,700 -L 200,700,1500 to mitigate the feature. &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vs_taper_ripple_z700-200-500-700.png|thumb|300px|cvmsi taper vs]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Transition depth to high velocity differs between Vp (~400 m) and Vs (~250 m) ===&lt;br /&gt;
&lt;br /&gt;
LBNL has a variable grid size/spacing based on depth: 50m grid for 0-650m depth, &lt;br /&gt;
then 100m grid for 650-2500m, 200m grid for 2500-6000m, 400m grid for 6000m and below. &lt;br /&gt;
The horizontal and vertical spacing is the same within the same depth range.&lt;br /&gt;
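The depth-dependent grid described above can be expressed as a simple lookup. A minimal sketch, assuming the boundary depths at exactly 650/2500/6000 m belong to the coarser grid (the text does not say which side the boundaries fall on):

```python
def grid_spacing_m(depth_m):
    # LBNL variable grid: 50 m for 0-650 m depth, 100 m for
    # 650-2500 m, 200 m for 2500-6000 m, 400 m for 6000 m and below.
    # Horizontal and vertical spacing are equal within each range.
    if depth_m >= 6000:
        return 400
    if depth_m >= 2500:
        return 200
    if depth_m >= 650:
        return 100
    return 50
```

This makes the stated ranges explicit; for example, a query at 700 m depth falls in the 100 m block.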
&lt;br /&gt;
 Region of interest:   33.599659 -117.505759, 34.311847 -117.501624&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvms_vp.png|thumb|300px|cvms transition vp]] &lt;br /&gt;
| [[FILE:transition_cvms_vs.png|thumb|300px|cvms transition vs]]&lt;br /&gt;
| [[FILE:transition_cvms_density.png|thumb|300px|cvms transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvms5_vp.png|thumb|300px|cvms5 transition vp]] &lt;br /&gt;
| [[FILE:transition_cvms5_vs.png|thumb|300px|cvms5 transition vs]]&lt;br /&gt;
| [[FILE:transition_cvms5_density.png|thumb|300px|cvms5 transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvmsi_vp_2.png|thumb|300px|cvmsi transition vp]] &lt;br /&gt;
| [[FILE:transition_cvmsi_vs.png|thumb|300px|cvmsi transition vs]]&lt;br /&gt;
| [[FILE:transition_cvmsi_density.png|thumb|300px|cvmsi transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The difference in GTL bounds for CVM-S4.26.M01 matches CVM-S4.&lt;br /&gt;
&lt;br /&gt;
about density: https://strike.scec.org/scecpedia/CVM-S4.26#Density_Based_on_Vs&lt;br /&gt;
&lt;br /&gt;
=== Adding elygtl:taper introduces discontinuities ===&lt;br /&gt;
&lt;br /&gt;
just cvmsi&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_only.png|thumb|300px|cvmsi vp]] &lt;br /&gt;
| [[FILE:cvmsi_vs_only.png|thumb|300px|cvmsi vs]]&lt;br /&gt;
| [[FILE:cvmsi_density_only.png|thumb|300px|cvmsi density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
cvmsi with -z 0,700 and -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper vp 700z]] &lt;br /&gt;
| [[FILE:cvmsi_vs_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper vs 700z]]&lt;br /&gt;
| [[FILE:cvmsi_density_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper density 700z]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
extending the z range with -z 0,1000 and -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper vp 1000z]] &lt;br /&gt;
| [[FILE:cvmsi_vs_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper vs 1000z]]&lt;br /&gt;
| [[FILE:cvmsi_density_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper density 1000z]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Difference in Vp and Vs GTL depth traced to CVM-S4 ===&lt;br /&gt;
&lt;br /&gt;
We selected (33.95575, -117.50369), in the middle of the cross-sections above, and ran this point at a variety of depths through the CVM-S3 and CVM-S4 implementations from https://scedc.caltech.edu/data/3d-velocity.html . &lt;br /&gt;
&lt;br /&gt;
CVM-S version 3:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
  Lat      Lon         Depth(m)   Vp       Vs      Rho&lt;br /&gt;
33.95575 -117.50369    120.00   2770.8    980.0   2302.5&lt;br /&gt;
33.95575 -117.50369    160.00   4188.8   1000.0   2526.4&lt;br /&gt;
33.95575 -117.50369    200.00   5903.2   1000.0   2797.1&lt;br /&gt;
33.95575 -117.50369    240.00   5903.3   1000.0   2797.1&lt;br /&gt;
33.95575 -117.50369    280.00   5903.3   1000.0   2797.1&lt;br /&gt;
33.95575 -117.50369    320.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    360.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    400.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    440.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    480.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    520.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    560.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    600.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    640.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    680.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    720.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    760.00   5903.3   3424.9   2797.1&lt;br /&gt;
33.95575 -117.50369    800.00   5903.3   3424.9   2797.1&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Question 9/2025 ==&lt;br /&gt;
&lt;br /&gt;
Question from Rie to Scott,&lt;br /&gt;
&lt;br /&gt;
...conducting SW4 simulations within the LA Basin for both small- and large-magnitude earthquake events &lt;br /&gt;
(please see the attached slides for the regions we are interested in). While validating the ground &lt;br /&gt;
motions, we have come across several features in the velocity model—built from the SCEC model—that we &lt;br /&gt;
were hoping to better understand with your insight.&lt;br /&gt;
&lt;br /&gt;
Blocky structure north of LA (highlighted in red in the slides):&lt;br /&gt;
  We observed a blocky structure characterized by elevated Vs compared to the surrounding region. This structure appears to disappear at 50 m depth and then re-emerges below. Is there a specific reason why this structure is not present at the 50 m depth level?&lt;br /&gt;
&lt;br /&gt;
Vp–Vs relationship of the blocky structure (green color):&lt;br /&gt;
  The structure shows higher Vs values, with Vp appearing higher at the surface (Z = 0 m) but becoming lower at 250 m depth. We also noted that the density is relatively low at the surface. &lt;br /&gt;
&lt;br /&gt;
Vp–density relationship west of the blocky structure:&lt;br /&gt;
  In this nearby region, the density is higher than that of the surrounding material, but Vp appears to be lower. (I expect Vp correlates with density)&lt;br /&gt;
&lt;br /&gt;
Would you know of any literature or documentation that describe these characteristics?&lt;br /&gt;
&lt;br /&gt;
=== Background of the posted question ===&lt;br /&gt;
&lt;br /&gt;
Accessing the UCVM velocity model via SW4's UCVM reader branch and then running the SW4 simulation.&lt;br /&gt;
&lt;br /&gt;
  For CVM-S4.26.M01 (cvmsi), the velocity values are extracted using the UCVM &amp;quot;withSCPBR&amp;quot; branch.&lt;br /&gt;
&lt;br /&gt;
internal query parameters were:&lt;br /&gt;
&lt;br /&gt;
  ucvm_query -f conf/ucvm.conf -m cvmsi,elygtl:taper -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
Is this version of the velocity model recommended for ground motion simulation? (another question)&lt;br /&gt;
&lt;br /&gt;
=== Response from Scott ===&lt;br /&gt;
&lt;br /&gt;
It looks like the blocky structure appears in the original CVM-S4 (https://strike.scec.org/scecpedia/UCVM_ucvm_with_sw4_using_cvmsi#Base_cvm, top row), and therefore also appears in CVM-S4.26 and CVM-S4.26.M01, both of which were derived from it.  I believe this is the northern edge of the LA basin as defined in the original CVM-S4.  When using this model for CyberShake Study 22.12 with the Ely taper, we didn't see this sharp east-west boundary.  You can see our cross sections at https://strike.scec.org/scecpedia/CyberShake_Study_22.12#Cross-sections, in the 3rd column.  However, we ran with a minimum Vs of 500 m/s and 100m grid spacing.&lt;br /&gt;
&lt;br /&gt;
For question 1), I think this has to do with the way the merged taper is applied.  The taper is constrained by the Thompson et al. (2022) Vs30 values, but every mesh point is evaluated when determining whether or not to apply the taper.  In other words, at (34.0, -118.0, 0m), the Vs values are calculated with and without the taper, and the method with the lowest Vs is selected.  But then this process is repeated at 50m, 100m, etc., so at some depths the taper might be selected and at others not.  I pulled Vs profiles of the top 500m for (-118, 34.1), which is in the dark blue region; (-118, 34.2), in the teal region; and (-118, 34.3), in the lime green region (attached as taper 34_1.png, taper 34_2.png, taper 34_3.png).  From the smoothness of the profiles, it looks like the taper is always used for 34.1 and 34.3, but not always for 34.2.  As a result, sometimes the points in the teal region are closer to the lime green region, meaning the blocky structure disappears, and sometimes they are closer to the dark blue region, meaning the blocky structure reappears.&lt;br /&gt;
&lt;br /&gt;
For 2), it looks to me like at the surface Vp~1100 m/s and Vs~400 m/s, and at 250m depth Vp~4500 m/s and Vs~1800 m/s, since the color bars change dramatically from z=0 to z=250m.&lt;br /&gt;
&lt;br /&gt;
For 3), I extracted velocity profiles with and without the merged taper.  In the region you highlighted, even without the taper the Vp value hits the floor at 700 m/s, but density is about 2000 kg/m3 (see attached &amp;quot;all props no taper.png&amp;quot;).  This is also true in the original CVM-S4 model.  So this isn't a result of the taper, but rather a native feature.  Since I wasn't involved in the original model construction, I'm not sure what the logic is for these values, but they do seem to be what was intended in CVM-S4 and its derivatives.&lt;br /&gt;
&lt;br /&gt;
Please let me know if you have follow-up questions.  We can also discuss modifications to the model if you'd like to avoid some of these features; for example, I know Hu et al. (2022) [https://doi.org/10.1093/gji/ggac175] used an alternative method for deciding where to apply the taper.&lt;br /&gt;
&lt;br /&gt;
=== Plots from explorer with the posted parameters ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_map.png|thumb|300px|cvmsi taper200 map]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_0m.png|thumb|300px|cvmsi taper200 0m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_50m.png|thumb|300px|cvmsi taper200 50m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_100m.png|thumb|300px|cvmsi taper200 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_250m.png|thumb|300px|cvmsi taper200 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line1_map.png|thumb|300px|cvmsi taper200 line1]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line1.png|thumb|300px|cvmsi taper200 line1]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line2_map.png|thumb|300px|cvmsi taper200 line2]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line2.png|thumb|300px|cvmsi taper200 line2]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line3_map.png|thumb|300px|cvmsi taper200 line3]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line3.png|thumb|300px|cvmsi taper200 line3]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt_map.png|thumb|300px|cvmsi taper200 pt]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt1.png|thumb|300px|cvmsi taper200 pt1]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt2.png|thumb|300px|cvmsi taper200 pt2]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt3.png|thumb|300px|cvmsi taper200 pt3]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt4.png|thumb|300px|cvmsi taper200 pt4]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Base cvm ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvms_0m.png|thumb|300px|cvms 0m]] &lt;br /&gt;
| [[FILE:sw4_cvms5_0m.png|thumb|300px|cvms5 0m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_0m.png|thumb|300px|cvmsi 0m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_50m.png|thumb|300px|cvmsi 50m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_100m.png|thumb|300px|cvmsi 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_250m.png|thumb|300px|cvmsi 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_0m.png|thumb|300px|cvmsi all 0m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_all_50m.png|thumb|300px|cvmsi all 50m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_100m.png|thumb|300px|cvmsi all 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_250m.png|thumb|300px|cvmsi all 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Looking at the basin:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_basin_a.png|thumb|300px|cvmsi basin a]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_basin_b.png|thumb|300px|cvmsi basin b]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== using elygtl:ely ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_ely0_0m.png|thumb|300px|cvmsi ely0 0m]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCVM_ucvm_with_sw4_using_cvmsi&amp;diff=30572</id>
		<title>UCVM ucvm with sw4 using cvmsi</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCVM_ucvm_with_sw4_using_cvmsi&amp;diff=30572"/>
		<updated>2025-11-06T22:36:14Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== Questions 11/2025 ==&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
LBNL TARGET: &lt;br /&gt;
Performing 5 Hz simulations based on CVM-S4.26.M01 and applying an additional GTL with &lt;br /&gt;
the Ely tapering method. For high-frequency simulation requirements, the velocity model &lt;br /&gt;
needs some additional modifications. &lt;br /&gt;
&lt;br /&gt;
=== What causes the Ripple-like pattern (bullseyes) in the cvmsi? ===&lt;br /&gt;
&lt;br /&gt;
  Region of interest: -118.4232,33.58, -117.3459,34.3519&lt;br /&gt;
&lt;br /&gt;
Due to local borehole information that is included in the rule-based CVM-S via its GTL processing.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:bullseyes_vs_cvms.png|thumb|300px|cvms bullseyes vs]] &lt;br /&gt;
| [[FILE:bullseyes_vs_cvms5.png|thumb|300px|cvms5 bullseyes vs]]&lt;br /&gt;
| [[FILE:bullseyes_vs_cvmsi.png|thumb|300px|cvmsi bullseyes vs]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
LBNL: Using the taper with -z 0,700 -L 200,700,1500 to mitigate the feature. &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vs_taper_ripple_z700-200-500-700.png|thumb|300px|cvmsi taper vs]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Transition depth to high velocity differs between Vp (~400 m) and Vs (~250 m) ===&lt;br /&gt;
&lt;br /&gt;
LBNL has a variable grid size/spacing based on depth: 50m grid for 0-650m depth, &lt;br /&gt;
then 100m grid for 650-2500m, 200m grid for 2500-6000m, 400m grid for 6000m and below. &lt;br /&gt;
The horizontal and vertical spacing is the same within the same depth range.&lt;br /&gt;
&lt;br /&gt;
 Region of interest:   33.599659 -117.505759, 34.311847 -117.501624&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvms_vp.png|thumb|300px|cvms transition vp]] &lt;br /&gt;
| [[FILE:transition_cvms_vs.png|thumb|300px|cvms transition vs]]&lt;br /&gt;
| [[FILE:transition_cvms_density.png|thumb|300px|cvms transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvms5_vp.png|thumb|300px|cvms5 transition vp]] &lt;br /&gt;
| [[FILE:transition_cvms5_vs.png|thumb|300px|cvms5 transition vs]]&lt;br /&gt;
| [[FILE:transition_cvms5_density.png|thumb|300px|cvms5 transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:transition_cvmsi_vp_2.png|thumb|300px|cvmsi transition vp]] &lt;br /&gt;
| [[FILE:transition_cvmsi_vs.png|thumb|300px|cvmsi transition vs]]&lt;br /&gt;
| [[FILE:transition_cvmsi_density.png|thumb|300px|cvmsi transition density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The difference in GTL bounds for CVM-S4.26.M01 matches CVM-S4.&lt;br /&gt;
&lt;br /&gt;
about density: https://strike.scec.org/scecpedia/CVM-S4.26#Density_Based_on_Vs&lt;br /&gt;
&lt;br /&gt;
=== Adding elygtl:taper introduces discontinuities ===&lt;br /&gt;
&lt;br /&gt;
just cvmsi&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_only.png|thumb|300px|cvmsi vp]] &lt;br /&gt;
| [[FILE:cvmsi_vs_only.png|thumb|300px|cvmsi vs]]&lt;br /&gt;
| [[FILE:cvmsi_density_only.png|thumb|300px|cvmsi density]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
cvmsi with -z 0,700 and -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper vp 700z]] &lt;br /&gt;
| [[FILE:cvmsi_vs_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper vs 700z]]&lt;br /&gt;
| [[FILE:cvmsi_density_taper_z700-200-700-1500.png|thumb|300px|cvmsi taper density 700z]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
extending the z range with -z 0,1000 and -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:cvmsi_vp_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper vp 1000z]] &lt;br /&gt;
| [[FILE:cvmsi_vs_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper vs 1000z]]&lt;br /&gt;
| [[FILE:cvmsi_density_taper_z1000-200-700-1500.png|thumb|300px|cvmsi taper density 1000z]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Difference in Vp and Vs GTL depth traced to CVM-S4 ===&lt;br /&gt;
&lt;br /&gt;
Looking at (33.95575, -117.50369), in the middle of the cross-sections above. &lt;br /&gt;
&lt;br /&gt;
== Question 9/2025 ==&lt;br /&gt;
&lt;br /&gt;
Question from Rie to Scott,&lt;br /&gt;
&lt;br /&gt;
...conducting SW4 simulations within the LA Basin for both small- and large-magnitude earthquake events &lt;br /&gt;
(please see the attached slides for the regions we are interested in). While validating the ground &lt;br /&gt;
motions, we have come across several features in the velocity model—built from the SCEC model—that we &lt;br /&gt;
were hoping to better understand with your insight.&lt;br /&gt;
&lt;br /&gt;
Blocky structure north of LA (highlighted in red in the slides):&lt;br /&gt;
  We observed a blocky structure characterized by elevated Vs compared to the surrounding region. This structure appears to disappear at 50 m depth and then re-emerges below. Is there a specific reason why this structure is not present at the 50 m depth level?&lt;br /&gt;
&lt;br /&gt;
Vp–Vs relationship of the blocky structure (green color):&lt;br /&gt;
  The structure shows higher Vs values, with Vp appearing higher at the surface (Z = 0 m) but becoming lower at 250 m depth. We also noted that the density is relatively low at the surface. &lt;br /&gt;
&lt;br /&gt;
Vp–density relationship west of the blocky structure:&lt;br /&gt;
  In this nearby region, the density is higher than that of the surrounding material, but Vp appears to be lower. (I expect Vp correlates with density)&lt;br /&gt;
&lt;br /&gt;
Would you know of any literature or documentation that describe these characteristics?&lt;br /&gt;
&lt;br /&gt;
=== Background of the posted question ===&lt;br /&gt;
&lt;br /&gt;
Accessing the UCVM velocity model via SW4's UCVM reader branch and then running the SW4 simulation.&lt;br /&gt;
&lt;br /&gt;
  For CVM-S4.26.M01 (cvmsi), the velocity values are extracted using the UCVM &amp;quot;withSCPBR&amp;quot; branch.&lt;br /&gt;
&lt;br /&gt;
internal query parameters were:&lt;br /&gt;
&lt;br /&gt;
  ucvm_query -f conf/ucvm.conf -m cvmsi,elygtl:taper -L 200,700,1500&lt;br /&gt;
&lt;br /&gt;
Is this version of the velocity model recommended for ground motion simulation? (another question)&lt;br /&gt;
&lt;br /&gt;
=== Response from Scott ===&lt;br /&gt;
&lt;br /&gt;
It looks like the blocky structure appears in the original CVM-S4 (https://strike.scec.org/scecpedia/UCVM_ucvm_with_sw4_using_cvmsi#Base_cvm, top row), and therefore also appears in CVM-S4.26 and CVM-S4.26.M01, both of which were derived from it.  I believe this is the northern edge of the LA basin as defined in the original CVM-S4.  When using this model for CyberShake Study 22.12 with the Ely taper, we didn't see this sharp east-west boundary.  You can see our cross sections at https://strike.scec.org/scecpedia/CyberShake_Study_22.12#Cross-sections, in the 3rd column.  However, we ran with a minimum Vs of 500 m/s and 100m grid spacing.&lt;br /&gt;
&lt;br /&gt;
For question 1), I think this has to do with the way the merged taper is applied.  The taper is constrained by the Thompson et al. (2022) Vs30 values, but every mesh point is evaluated when determining whether or not to apply the taper.  In other words, at (34.0, -118.0, 0m), the Vs values are calculated with and without the taper, and the method with the lowest Vs is selected.  But then this process is repeated at 50m, 100m, etc., so at some depths the taper might be selected and at others not.  I pulled Vs profiles of the top 500m for (-118, 34.1), which is in the dark blue region; (-118, 34.2), in the teal region; and (-118, 34.3), in the lime green region (attached as taper 34_1.png, taper 34_2.png, taper 34_3.png).  From the smoothness of the profiles, it looks like the taper is always used for 34.1 and 34.3, but not always for 34.2.  As a result, sometimes the points in the teal region are closer to the lime green region, meaning the blocky structure disappears, and sometimes they are closer to the dark blue region, meaning the blocky structure reappears.&lt;br /&gt;
&lt;br /&gt;
For 2), it looks to me like at the surface Vp~1100 m/s and Vs~400 m/s, and at 250m depth Vp~4500 m/s and Vs~1800 m/s, since the color bars change dramatically from z=0 to z=250m.&lt;br /&gt;
&lt;br /&gt;
For 3), I extracted velocity profiles with and without the merged taper.  In the region you highlighted, even without the taper the Vp value hits the floor at 700 m/s, but density is about 2000 kg/m3 (see attached &amp;quot;all props no taper.png&amp;quot;).  This is also true in the original CVM-S4 model.  So this isn't a result of the taper, but rather a native feature.  Since I wasn't involved in the original model construction, I'm not sure what the logic is for these values, but they do seem to be what was intended in CVM-S4 and its derivatives.&lt;br /&gt;
&lt;br /&gt;
Please let me know if you have follow-up questions.  We can also discuss modifications to the model if you'd like to avoid some of these features; for example, I know Hu et al. (2022) [https://doi.org/10.1093/gji/ggac175] used an alternative method for deciding where to apply the taper.&lt;br /&gt;
&lt;br /&gt;
=== Plots from explorer with the posted parameters ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_map.png|thumb|300px|cvmsi taper200 map]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_0m.png|thumb|300px|cvmsi taper200 0m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_50m.png|thumb|300px|cvmsi taper200 50m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_100m.png|thumb|300px|cvmsi taper200 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_250m.png|thumb|300px|cvmsi taper200 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line1_map.png|thumb|300px|cvmsi taper200 line1]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line1.png|thumb|300px|cvmsi taper200 line1]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line2_map.png|thumb|300px|cvmsi taper200 line2]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line2.png|thumb|300px|cvmsi taper200 line2]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line3_map.png|thumb|300px|cvmsi taper200 line3]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_line3.png|thumb|300px|cvmsi taper200 line3]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt_map.png|thumb|300px|cvmsi taper200 pt]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt1.png|thumb|300px|cvmsi taper200 pt1]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt2.png|thumb|300px|cvmsi taper200 pt2]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt3.png|thumb|300px|cvmsi taper200 pt3]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_taper_200_pt4.png|thumb|300px|cvmsi taper200 pt4]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Base CVM ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvms_0m.png|thumb|300px|cvms 0m]] &lt;br /&gt;
| [[FILE:sw4_cvms5_0m.png|thumb|300px|cvms5 0m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_0m.png|thumb|300px|cvmsi 0m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_50m.png|thumb|300px|cvmsi 50m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_100m.png|thumb|300px|cvmsi 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_250m.png|thumb|300px|cvmsi 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_0m.png|thumb|300px|cvmsi all 0m]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_all_50m.png|thumb|300px|cvmsi all 50m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_100m.png|thumb|300px|cvmsi all 100m]]&lt;br /&gt;
| [[FILE:sw4_cvmsi_all_250m.png|thumb|300px|cvmsi all 250m]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Looking at the basin:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_basin_a.png|thumb|300px|cvmsi basin a]] &lt;br /&gt;
| [[FILE:sw4_cvmsi_basin_b.png|thumb|300px|cvmsi basin b]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Using elygtl:ely ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[FILE:sw4_cvmsi_ely0_0m.png|thumb|300px|cvmsi ely0 0m]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_24.8&amp;diff=30458</id>
		<title>CyberShake Study 24.8</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_24.8&amp;diff=30458"/>
		<updated>2025-10-01T16:47:03Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Data Products */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;CyberShake Study 24.8 is a Northern California study that includes deterministic low-frequency (0-1 Hz) and stochastic high-frequency (1-50 Hz) simulations.  It uses the [[Rupture_Variation_Generator_v5.5.2|Graves &amp;amp; Pitarka (2022)]] rupture generator and the high-frequency modules from the SCEC Broadband Platform v22.4.  The study adds vertical-component seismograms, period-dependent durations, and Fourier spectra IMs, and reduces the minimum Vs to 400 m/s.&lt;br /&gt;
&lt;br /&gt;
== Status ==&lt;br /&gt;
&lt;br /&gt;
This study was completed on 11/8/24 at 6:15:15 PST.  &lt;br /&gt;
&lt;br /&gt;
The low-frequency calculations finished on 10/30/24 at 04:52:52 PDT.&lt;br /&gt;
&lt;br /&gt;
Study results are posted below.&lt;br /&gt;
&lt;br /&gt;
== Data Products ==&lt;br /&gt;
&lt;br /&gt;
Low-frequency hazard maps are available here: [https://opensha.usc.edu/ftp/kmilner/markdown/cybershake-analysis/study_24_8_lf/hazard_maps/ Low-frequency Hazard Maps].&lt;br /&gt;
&lt;br /&gt;
Low-frequency GMM comparisons are available here, using [https://opensha.usc.edu/ftp/kmilner/markdown/cybershake-analysis/study_24_8_lf/gmpe_comparisons_NGAWest_2014_NoIdr_Vs30Simulation/ top velocity mesh value as Vs30] and [https://opensha.usc.edu/ftp/kmilner/markdown/cybershake-analysis/study_24_8_lf/gmpe_comparisons_NGAWest_2014_NoIdr_Vs30Thompson2020/ Thompson Vs30].&lt;br /&gt;
&lt;br /&gt;
Broadband hazard maps are available here: [https://opensha.usc.edu/ftp/kmilner/markdown/cybershake-analysis/study_24_8_bb/hazard_maps/ Broadband Hazard Maps].&lt;br /&gt;
&lt;br /&gt;
Broadband GMM comparisons are available here: [https://opensha.usc.edu/ftp/kmilner/markdown/cybershake-analysis/study_24_8_bb/gmpe_comparisons_NGAWest_2014_NoIdr_Vs30Thompson2020/ Broadband GMM Comparisons].&lt;br /&gt;
&lt;br /&gt;
Updated statewide maps using Study 24.8 and Study 22.12 results are available here:  [https://opensha.usc.edu/ftp/kmilner/markdown/cybershake-analysis/study_24_8_lf/multi_study_hazard_maps/resources/ 2025 Statewide Maps].&lt;br /&gt;
&lt;br /&gt;
== Science Goals ==&lt;br /&gt;
&lt;br /&gt;
The science goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*To perform an updated broadband study for the greater Bay Area.&lt;br /&gt;
*To use an updated rupture generator and improved velocity model from Study 18.8.&lt;br /&gt;
*To use the same parameters as in Study 22.12 when possible to make comparisons between the studies simple.&lt;br /&gt;
&lt;br /&gt;
== Technical Goals ==&lt;br /&gt;
&lt;br /&gt;
The technical goals for this study are:&lt;br /&gt;
&lt;br /&gt;
*Use Frontier for the SGTs and Frontera for the post-processing and high-frequency calculations.&lt;br /&gt;
*Use a modified approach for the production database, to improve performance.&lt;br /&gt;
&lt;br /&gt;
== Sites ==&lt;br /&gt;
&lt;br /&gt;
For this study, we focused on a smaller region than in [[CyberShake_Study_18.8#Sites | Study 18.8]].  Starting from the Study 18.8 region, we selected a 180 km x 100 km box extending roughly from San Jose to Santa Rosa, containing 315 sites.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_24_1_site_map.png|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
*[[Media:Study_24.1_sites.csv|CSV site list]]&lt;br /&gt;
*[[Media:Study_24.1_sites_names.kml|KML site list with names]]&lt;br /&gt;
*[[Media:Study_24.1_sites_no_names.kml|KML site list without names]]&lt;br /&gt;
&lt;br /&gt;
== Ruptures to Include ==&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Summary: we decided to exclude the southern San Andreas events from Study 24.8.  This was implemented by creating a new ERF with ID 64, which includes all the ERF 36 ruptures except for the southern San Andreas events.&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Historically, we have determined which ruptures to include in a CyberShake run by calculating the distance between the site and the closest part of the rupture surface.  If that distance is less than 200 km, we then include all ruptures which take place on that surface, including ruptures which may extend much farther away from the site than 200 km.&lt;br /&gt;
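The inclusion rule above can be sketched as follows (a hypothetical sketch: the function and data layout are illustrative, not CyberShake's actual code, and the site-to-surface distances are assumed precomputed):&lt;br /&gt;

```python
# Hypothetical sketch of the rupture-inclusion rule described above.
# Each fault surface carries its precomputed minimum site distance
# (km) and the list of ruptures that take place on that surface.
CUTOFF_KM = 200.0

def ruptures_to_include(surfaces):
    """surfaces: dict mapping surface name -> (min_distance_km, ruptures)."""
    included = []
    for min_dist_km, ruptures in surfaces.values():
        if min_dist_km < CUTOFF_KM:
            # Include every rupture on this surface, even ruptures
            # extending much farther than the cutoff from the site.
            included.extend(ruptures)
    return included
```

Note that the cutoff applies to the closest point of the fault surface, which is why ruptures that extend far beyond 200 km from the site can still be included.&lt;br /&gt;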
&lt;br /&gt;
For Northern California sites, this means that sites in and south of San Jose have southern San Andreas events (events which rupture the northernmost segment of the southern San Andreas) within 200 km.  Since some UCERF2 ruptures extend from the Parkfield segment all the way down to Bombay Beach, the simulation volumes for some of these Northern California sites cover most of the state.  This was the case for Study 18.8 (sample volumes can be seen [[Study_18.5_Velocity_Model_Comparisons#Cross-sections_from_production_velocity_mesh|on this page]]).  This required tiling together three 3D models and a background 1D model.&lt;br /&gt;
&lt;br /&gt;
To simplify the velocity model and reduce the volumes, we are investigating omitting southern San Andreas events from this study.&lt;br /&gt;
&lt;br /&gt;
=== Source Contribution Curves ===&lt;br /&gt;
&lt;br /&gt;
Below are source contribution curves for 3 sites: s3430 (southwest corner of the study region), s3446 (southeast corner of the study region), and SJO (San Jose).  In general, the sSAF events are roughly the third-largest contributor at long periods and medium-to-long return periods.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 2 sec !! 3 sec !! 5 sec !! 10 sec&lt;br /&gt;
|-&lt;br /&gt;
! s3430&lt;br /&gt;
| [[File:s3430_run6408_2sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3430_run6408_3sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3430_run6408_5sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3430_run6408_10sec_contributions.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3446&lt;br /&gt;
| [[File:s3446_run6452_2sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_run6452_3sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_run6452_5sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_run6452_10sec_contributions.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_run6987_2sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_run6987_3sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_run6987_5sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_run6987_10sec_contributions.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
We also looked at the source contributions for these 3 sites from ASK 2014.  In general, the sSAF events play a reduced role compared to the CyberShake results.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 2 sec !! 3 sec !! 5 sec !! 10 sec&lt;br /&gt;
|-&lt;br /&gt;
! s3430&lt;br /&gt;
| [[File:s3430_ASK2014_2sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3430_ASK2014_3sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3430_ASK2014_5sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3430_ASK2014_10sec_contributions.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3446&lt;br /&gt;
| [[File:s3446_ASK2014_2sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_ASK2014_3sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_ASK2014_5sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_ASK2014_10sec_contributions.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_ASK2014_2sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_ASK2014_3sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_ASK2014_5sec_contributions.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_ASK2014_10sec_contributions.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Disaggregations ===&lt;br /&gt;
&lt;br /&gt;
From Study 18.8, we looked at disaggregations for s3430, s3446, and SJO at 1e-3 (1000 yr), 4e-4 (2500 yr), and 1e-4 (10000 yr) probability levels, at 2 and 10 seconds.  We list the top 3 contributing sources from the southern SAF, their magnitude ranges, and their contributing percentages.&lt;br /&gt;
&lt;br /&gt;
The only significant contributions are for site s3446 at 10 second period.  Those come from large events, with median magnitude 7.85 or higher.&lt;br /&gt;
&lt;br /&gt;
==== s3430 ====&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;3&amp;quot;&lt;br /&gt;
! Period !! 1e-3 !! 4e-4 !! 1e-4&lt;br /&gt;
|-&lt;br /&gt;
! 2 sec&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
|-&lt;br /&gt;
! 10 sec&lt;br /&gt;
| 86 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB, M7.65-8.25), 0.01%&amp;lt;br/&amp;gt;89 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG+CO, M7.75-8.45), 0.01%&amp;lt;br/&amp;gt;85 (S. San Andreas;PK+CH+CC+BB+NM+SM, M7.55-8.15), &amp;lt;0.01%&lt;br /&gt;
| 89 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG+CO, M7.75-8.45), &amp;lt;0.01%&amp;lt;br/&amp;gt;80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== s3446 ====&lt;br /&gt;
{| border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;3&amp;quot;&lt;br /&gt;
! Period !! 1e-3 !! 4e-4 !! 1e-4&lt;br /&gt;
|-&lt;br /&gt;
! 2 sec &lt;br /&gt;
| 84 (S. San Andreas;PK+CH+CC+BB+NM, M7.45-7.95), &amp;lt;0.01%&amp;lt;br/&amp;gt;89 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG+CO, M7.75-8.45), &amp;lt;0.01%&amp;lt;br/&amp;gt;80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
|-&lt;br /&gt;
! 10 sec&lt;br /&gt;
| 85 (S. San Andreas;PK+CH+CC+BB+NM+SM, M7.55-8.15), 6.83%&amp;lt;br/&amp;gt;86 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB, M7.65-8.25), 4.18%&amp;lt;br/&amp;gt;84 (S. San Andreas;PK+CH+CC+BB+NM, M7.45-7.95), 1.53%&lt;br /&gt;
| 85 (S. San Andreas;PK+CH+CC+BB+NM+SM, M7.55-8.15), 4.27%&amp;lt;br/&amp;gt;86 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB, M7.65-8.25), 3.09%&amp;lt;br/&amp;gt;89 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG+CO, M7.75-8.45), 1.08%&lt;br /&gt;
| 86 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB, M7.65-8.25), 1.19%&amp;lt;br/&amp;gt;85 (S. San Andreas;PK+CH+CC+BB+NM+SM, M7.55-8.15), 1.19%&amp;lt;br/&amp;gt;89 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG+CO, M7.75-8.45), 0.55%&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== SJO ====&lt;br /&gt;
{|border=&amp;quot;1&amp;quot; cellpadding=&amp;quot;3&amp;quot;&lt;br /&gt;
! Period !! 1e-3 !! 4e-4 !! 1e-4&lt;br /&gt;
|-&lt;br /&gt;
! 2 sec&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
|-&lt;br /&gt;
! 10 sec&lt;br /&gt;
| 85 (S. San Andreas;PK+CH+CC+BB+NM+SM, M7.55-8.15), 0.09%&amp;lt;br/&amp;gt;86 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB, M7.65-8.25), 0.07%&amp;lt;br/&amp;gt;89 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG+CO, M7.75-8.45), 0.05%&lt;br /&gt;
| 89 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG+CO, M7.75-8.45), 0.01%&amp;lt;br/&amp;gt;88 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB+SSB+BG, M7.75-8.35), 0.01%&amp;lt;br/&amp;gt;86 (S. San Andreas;PK+CH+CC+BB+NM+SM+NSB, M7.65-8.25), &amp;lt;0.01%&lt;br /&gt;
| 80 (S. San Andreas;PK, M5.65-6.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;81 (S. San Andreas;PK+CH, M6.75-7.35), &amp;lt;0.01%&amp;lt;br/&amp;gt;82 (S. San Andreas;PK+CH+CC, M7.15-7.65), &amp;lt;0.01%&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Velocity Model ==&lt;br /&gt;
&lt;br /&gt;
We will perform validation of the proposed velocity model using northern California BBP events.&lt;br /&gt;
&lt;br /&gt;
To line up closely with the USGS SF model angle, we will generate volumes using an angle of -36 degrees.&lt;br /&gt;
&lt;br /&gt;
For generating sample meshes, we will use site s3446, a site in the SE corner of the study region with one of the larger volumes.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:s3446_erf64_volume.png|thumb|400px|Simulation region for s3446 is in &amp;lt;span style=&amp;quot;color:#FFBB00&amp;quot;&amp;gt;yellow&amp;lt;/span&amp;gt;.  Also plotted are the extents of the &amp;lt;span style=&amp;quot;color:darkgreen&amp;quot;&amp;gt;USGS SF model&amp;lt;/span&amp;gt;, the &amp;lt;span style=&amp;quot;color:limegreen&amp;quot;&amp;gt;USGS regional model&amp;lt;/span&amp;gt;, &amp;lt;span style=&amp;quot;color:blue&amp;quot;&amp;gt;CCA-06&amp;lt;/span&amp;gt;, and &amp;lt;span style=&amp;quot;color:red&amp;quot;&amp;gt;CVM-S4.26.M01&amp;lt;/span&amp;gt;.]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Primary 3D model ===&lt;br /&gt;
&lt;br /&gt;
Given the extensive low-velocity near-surface regions in the USGS SF CVM, we plan to use a minimum Vs of 400 m/s (and therefore a grid spacing of 80 m).&lt;br /&gt;
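For context, the 400 m/s floor maps to 80 m spacing under the usual rule of thumb of several grid points per minimum wavelength.  The sketch below assumes 5 points per wavelength at the 1 Hz deterministic maximum, which reproduces the stated spacing (an assumption for illustration; the study's exact discretization rule is not given on this page):&lt;br /&gt;

```python
# Sanity check of the Vs-floor-to-grid-spacing relation, assuming
# ~5 points per minimum wavelength at 1 Hz (assumed values).
vs_min = 400.0               # m/s, minimum Vs
f_max = 1.0                  # Hz, deterministic band maximum
points_per_wavelength = 5.0  # assumed discretization rule

# Minimum wavelength is vs_min / f_max; divide by points per wavelength.
h = vs_min / (points_per_wavelength * f_max)
print(h)  # 80.0 (m)
```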
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Sfcvm_h0_2.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Initial slices with the USGS SF CVM are available here: [[UCVM_sfcvm_geomodelgrid]].  We found two sharply defined high-velocity patches visible on the surface slice, one in the East Bay near the mountains, and another near Gilroy.  These are regions where the gabbro type goes to the surface, and so the SF CVM geological rules dictate that the high velocities go to the surface as well.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:eastbay_high_vs_vertical_profile.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
These patches are not present in the Vs30 models - for instance, for the point (37.6827, -122.086) SF CVM gives a surface Vs of about 3500 m/s, but Wills (2015) has Vs30=710 and Thompson (2018) has Vs30=702.&lt;br /&gt;
&lt;br /&gt;
=== Potential modification to gabbro regions ===&lt;br /&gt;
&lt;br /&gt;
A candidate modification to the gabbro regions to reduce the near-surface velocities is to apply the approach used by Arben Pitarka and Rie Nakata, detailed below.&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Component !! At surface !! At 7.75 km depth !! Derivation&lt;br /&gt;
|-&lt;br /&gt;
! Vp&lt;br /&gt;
| 4.2 km/s ||  5.7 km/s || Linear interpolation&lt;br /&gt;
|-&lt;br /&gt;
! Vs&lt;br /&gt;
| 2.44 km/s || 3.4 km/s || Vp/Vs relationship&lt;br /&gt;
|-&lt;br /&gt;
! Density&lt;br /&gt;
| 2.76 g/cm3 || 2.87 g/cm3 || Vp/Density relationship&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
The Vs (km/s) values are derived from Vp (km/s) using the San Leandro Gabbro relationship:&lt;br /&gt;
&amp;lt;pre&amp;gt;Vs = 0.7858 - 1.2344*Vp + 0.7949*Vp^2 - 0.1238*Vp^3 + 0.0064*Vp^4&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
The density (g/cm3) values are derived from Vp (km/s) using the San Leandro Gabbro relationship:&lt;br /&gt;
&amp;lt;pre&amp;gt;density = 2.4372 + 0.0761*Vp&amp;lt;/pre&amp;gt;&lt;br /&gt;
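As a quick check, the two relationships above reproduce the table values (a minimal sketch; the function names are illustrative):&lt;br /&gt;

```python
# Reproduces the gabbro table above from the two San Leandro Gabbro
# relationships quoted on this page.  Vp is in km/s; Vs comes out in
# km/s and density in g/cm3.
def gabbro_vs(vp):
    return (0.7858 - 1.2344 * vp + 0.7949 * vp**2
            - 0.1238 * vp**3 + 0.0064 * vp**4)

def gabbro_density(vp):
    return 2.4372 + 0.0761 * vp

for vp in (4.2, 5.7):  # surface and 7.75 km depth Vp values
    print(vp, round(gabbro_vs(vp), 2), round(gabbro_density(vp), 2))
# 4.2 km/s -> Vs 2.44 km/s, density 2.76 g/cm3
# 5.7 km/s -> Vs 3.4 km/s, density 2.87 g/cm3
```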
&lt;br /&gt;
A sample plot is below.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Nakata_and_Pitarka_modification_slice.png|thumb|800px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Background model ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;For Study 24.8, we decided to go with the [[Sierra Foothills 1D]] model, which extends the Sierra region of SFCVM.&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There are several candidates to use as a background model for the regions outside of the 3D model region.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;1D models&amp;lt;/b&amp;gt;&lt;br /&gt;
#1D Broadband Platform model - either Northern California, Central California, or the Southern Sierras.&lt;br /&gt;
#1D CVM-S4 background model&lt;br /&gt;
#Extend eastern edge of SFCVM model to fill the remaining volume&lt;br /&gt;
#1D CCA model (derived from averaging CCA-06), used in Study 17.3&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;3D models&amp;lt;/b&amp;gt;&lt;br /&gt;
#3D CANVAS long-period tomography model&lt;br /&gt;
#3D National Crustal model&lt;br /&gt;
&lt;br /&gt;
Plots of these options are available below.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! 1D Model !! BBP NorCal !! BBP CenCal !! BBP SouthernSierras !! CVM-S4.26.M01 1D background !! CCA 1D&lt;br /&gt;
|-&lt;br /&gt;
! Plot&lt;br /&gt;
| [[File:nocal500.png|thumb|300px]]&lt;br /&gt;
| [[File:centralcal500.png|thumb|300px]]&lt;br /&gt;
| [[File:ssn2-500.png|thumb|300px]]&lt;br /&gt;
| [[File:cvms426_1dbackground.png|thumb|300px]]&lt;br /&gt;
| [[File:Cs_cca_ucvm_1d_all.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Cross-sections ===&lt;br /&gt;
&lt;br /&gt;
==== Cross-sections, no smoothing, CVM-S4.26.M01 1D background ====&lt;br /&gt;
&lt;br /&gt;
Below are horizontal cross-sections at various depths taken from a model for s3446 generated without smoothing, with the tiling SFCVM, CCA-06, CVM-S4.26.M01.  This model was extracted on 2/28/24, and is one of the largest volumes needed for the study.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! 0m !! 80m !! 800m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_0m_nosmooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_80m_nosmooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_800m_nosmooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_2000m_nosmooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_4000m_nosmooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_10000m_nosmooth_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Below are vertical cross-sections taken from a model for s3446 generated without smoothing, with the tiling SFCVM, CCA-06, CVM-S4.26.M01.  This model was extracted on 2/28/24, and is one of the largest volumes needed for the study.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:s3446_cross_section_locations.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Y=2400 !! Y=4800 !! Y=7200&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_x_2400_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_4800_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_7200_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! X=1400 !! X=2800&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_y_1400_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_y_2800_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Cross-sections, smoothing, CVM-S4.26.M01 1D background ====&lt;br /&gt;
&lt;br /&gt;
Below are horizontal cross-sections at various depths taken from a model for s3446 generated with smoothing, with the tiling SFCVM, CCA-06, CVM-S4.26.M01.  This model was extracted on 2/28/24.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! 0m !! 80m !! 800m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_0m_smooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_80m_smooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_800m_smooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_2000m_smooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_4000m_smooth_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_10000m_smooth_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Below are vertical cross-sections taken from a model for s3446 generated with smoothing, with the tiling SFCVM, CCA-06, CVM-S4.26.M01.  This model was extracted on 2/28/24.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:s3446_smoothed_cross_section_locations.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Y=2400 !! Y=4800 !! Y=7200&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_x_2400_smoothed_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_4800_smoothed_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_7200_smoothed_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! X=1400 !! X=2800&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_y_1400_smoothed_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_y_2800_smoothed_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Cross-sections, no smoothing, Southern Sierra BBP 1D background ====&lt;br /&gt;
&lt;br /&gt;
Below are horizontal cross-sections at various depths taken from a model for s3446 generated without smoothing, with the tiling SFCVM, CCA-06, Southern Sierra BBP1D model.  This model was extracted on 3/13/24.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! 0m !! 80m !! 800m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_0m_nosmooth_ss_bbp1d_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_80m_nosmooth_ss_bbp1d_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_800m_nosmooth_ss_bbp1d_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_2000m_nosmooth_ss_bbp1d_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_4000m_nosmooth_ss_bbp1d_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_10000m_nosmooth_ss_bbp1d_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Below are vertical cross-sections taken from a model for s3446 generated without smoothing, with the tiling SFCVM, CCA-06, Southern Sierra BBP1D model.  This model was extracted on 3/13/24.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Y=2400 !! Y=4800 !! Y=7200&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_x_2400_nosmooth_ss_bbp1d_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_4800_nosmooth_ss_bbp1d_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_7200_nosmooth_ss_bbp1d_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! X=1400 !! X=2800&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_y_1400_nosmooth_ss_bbp1d_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_y_2800_nosmooth_ss_bbp1d_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Cross-sections, no smoothing, extended SFCVM Sierra 1D background, CCA + taper ====&lt;br /&gt;
&lt;br /&gt;
Below are horizontal cross-sections at various depths taken from a model for s3446 generated without smoothing, with the tiling SFCVM, CCA-06 + taper, 1D representation of the Sierra foothills in SFCVM.  This model was extracted on 4/2/24.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! 0m !! 80m !! 800m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_0m_nosmooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_80m_nosmooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_800m_nosmooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_2000m_nosmooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_4000m_nosmooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_10000m_nosmooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Below are vertical cross-sections.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Y=2400 !! Y=4800 !! Y=7200&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_x_2400_nosmooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_4800_nosmooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_7200_nosmooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! X=1400 !! X=2800&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_y_1400_nosmooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_y_2800_nosmooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== Cross-sections, smoothing, extended SFCVM Sierra 1D background, CCA + taper ====&lt;br /&gt;
&lt;br /&gt;
Below are horizontal cross-sections at various depths taken from a model for s3446 generated with smoothing, with the tiling SFCVM, CCA-06 + taper, 1D representation of the Sierra foothills in SFCVM.  This model was extracted on 4/2/24.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! 0m !! 80m !! 800m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_0m_smooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_80m_smooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_800m_smooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_2000m_smooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_4000m_smooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:s3446_10000m_smooth_sfcvm_extended_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Below are vertical cross-sections.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Y=2400 !! Y=4800 !! Y=7200&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_x_2400_smooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_4800_smooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_x_7200_smooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! X=1400 !! X=2800&lt;br /&gt;
|-&lt;br /&gt;
| [[File:s3446_y_1400_smooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_y_2800_smooth_sfcvm_extended_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Cross-sections, no smoothing, extended SFCVM Sierra 1D background with taper, CCA + taper ====&lt;br /&gt;
&lt;br /&gt;
Below are horizontal cross-sections at various depths taken from a model for s3446 generated without smoothing, with the tiling SFCVM, CCA-06 + taper, 1D representation of the Sierra foothills in SFCVM + taper.  These plots follow our typical practice of populating the surface point by querying the models at a depth of grid_spacing/4, or 20m.  This model was extracted on 4/9/24.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Surface (20m) !! 80m !! 800m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:4_9 s3446_0m_nosmooth_sfcvm_extended_taper_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_9_s3446_80m_nosmooth_sfcvm_extended_taper_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_9_s3446_800m_nosmooth_sfcvm_extended_taper_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_9_s3446_2000m_nosmooth_sfcvm_extended_taper_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_9_s3446_4000m_nosmooth_sfcvm_extended_taper_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_9_s3446_10000m_nosmooth_sfcvm_extended_taper_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Y=2400 !! Y=4800 !! Y=7200&lt;br /&gt;
|-&lt;br /&gt;
| [[File:4_9_s3446_x_2400_nosmooth_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:4_9_s3446_x_4800_nosmooth_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:4_9_s3446_x_7200_nosmooth_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! X=1400 !! X=2800&lt;br /&gt;
|-&lt;br /&gt;
| [[File:4_9_s3446_y_1400_nosmooth_vs.png|thumb|400px]]&lt;br /&gt;
| [[File:4_9_s3446_y_2800_nosmooth_vs.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Candidate Model (RC1) ===&lt;br /&gt;
&lt;br /&gt;
Our candidate model is generated using (1) SFCVM with the gabbro modifications; (2) CCA-06 with the merged taper in the top 700m; (3) the [[NC1D]] model with the merged taper in the top 700m.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Surface (20m) !! 80m !! 800m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:4_19 s3446_20m_smooth_RC1_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_19_s3446_80m_smooth_RC1_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_19_s3446_800m_smooth_RC1_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_19_s3446_2000m_smooth_RC1_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_19_s3446_4000m_smooth_RC1_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:4_19_s3446_10000m_smooth_RC1_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Vp/Vs Ratio Adjustment ===&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;For this study, we are modifying how we preserve the Vp/Vs ratio when applying the Vs floor, in order to avoid propagating very large Vp/Vs ratios into Vp.&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Our previous approach for applying the Vs floor is as follows:&lt;br /&gt;
&lt;br /&gt;
#If Vs &amp;lt; Vs_floor (400 m/s):&lt;br /&gt;
##Calculate the Vp/Vs ratio&lt;br /&gt;
##Change Vs to the floor&lt;br /&gt;
##Calculate a new Vp using Vp = Vs_floor * Vp/Vs ratio&lt;br /&gt;
#Apply Vp and density floors&lt;br /&gt;
&lt;br /&gt;
However, some sites have very high Vp/Vs ratios near the surface where Vs is low, so applying this algorithm there produces unexpectedly high Vp values.&lt;br /&gt;
&lt;br /&gt;
Here is a vertical profile at site s3240 (Moffett Field).  You can see the high surface Vp value:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:s3240_vert_profile.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Looking in more detail, here are the Vp and Vs values in the top 90m at s3240:&lt;br /&gt;
&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Depth (m) !! Vp !! Vs !! Vp/Vs ratio !! Adjusted Vs !! Adjusted Vp&lt;br /&gt;
|-&lt;br /&gt;
| 0 || 739 || 81 || 9.12 || 400 || 3649&lt;br /&gt;
|-&lt;br /&gt;
| 10 || 1009 || 82 || 12.30 || 400 || 4922&lt;br /&gt;
|-&lt;br /&gt;
| &amp;lt;b&amp;gt; 20 &amp;lt;/b&amp;gt; || 1309 || 84 || 15.58 || &amp;lt;b&amp;gt; 400 &amp;lt;/b&amp;gt; || &amp;lt;b&amp;gt; 6233 &amp;lt;/b&amp;gt;&lt;br /&gt;
|-&lt;br /&gt;
| 30 || 1502 || 152 || 9.88 || 400 || 3953&lt;br /&gt;
|-&lt;br /&gt;
| 40 || 1590 || 285 || 5.58 || 400 || 2232&lt;br /&gt;
|-&lt;br /&gt;
| 50 || 1678 || 418 || 4.01 &lt;br /&gt;
|-&lt;br /&gt;
| 60 || 1711 || 436 || 3.92 &lt;br /&gt;
|-&lt;br /&gt;
| 70 || 1744 || 453 || 3.85 &lt;br /&gt;
|-&lt;br /&gt;
| &amp;lt;b&amp;gt; 80 &amp;lt;/b&amp;gt; || &amp;lt;b&amp;gt; 1776 &amp;lt;/b&amp;gt; || &amp;lt;b&amp;gt; 471 &amp;lt;/b&amp;gt; || 3.77 &lt;br /&gt;
|-&lt;br /&gt;
| 90 || 1807 || 489 || 3.70 &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Since we are using 80m grid spacing and populating the surface point at a depth of (grid spacing)/4 = 20m, the values at 20m and 80m are used.  The very high Vp/Vs ratio at 20m therefore produces a very high surface Vp value.&lt;br /&gt;
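As a quick arithmetic check of the sampling described above (assuming the 80m grid used in this study):&lt;br /&gt;
&lt;br /&gt;
```python
h = 80.0                     # grid spacing, meters
surface_depth = h / 4.0      # surface grid point is populated at 20 m
# depths sampled at the surface point and the next two grid points down
grid_depths = [surface_depth] + [i * h for i in range(1, 3)]
```
&lt;br /&gt;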
&lt;br /&gt;
These high Vp/Vs ratios occur at a number of locations around San Francisco Bay, illustrated in the surface Vp plot below:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:s3240_surface_vp.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
To solve this problem, we will modify the process for applying the Vs floor.  We will use the Vp/Vs ratio at 80m depth instead of at the surface, and use a maximum ratio of 4.  A similar process was used in the HighF project.&lt;br /&gt;
&lt;br /&gt;
#If Vs &amp;lt; Vs_floor (400 m/s):&lt;br /&gt;
##If surface grid point:&lt;br /&gt;
###Calculate Vp/Vs ratio at 1 grid point depth (80m)&lt;br /&gt;
###If Vp/Vs ratio &amp;gt; 4:&lt;br /&gt;
####Lower Vp/Vs ratio to 4&lt;br /&gt;
##Else:&lt;br /&gt;
###Calculate Vp/Vs ratio at this grid point&lt;br /&gt;
##New Vs = Vs_floor&lt;br /&gt;
##New Vp = Vs * (potentially modified) Vp/Vs ratio.&lt;br /&gt;
#Apply Vp and density floors.&lt;br /&gt;
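The revised procedure can be sketched as follows (a minimal sketch under the same illustrative interface as above; the caller is assumed to supply the Vp/Vs ratio one grid point down for surface points):&lt;br /&gt;
&lt;br /&gt;
```python
MAX_RATIO = 4.0  # cap on the Vp/Vs ratio used for surface points

def apply_vs_floor_revised(vp, vs, vs_floor=400.0,
                           is_surface=False, ratio_at_80m=None):
    """Revised approach: surface points use the Vp/Vs ratio at one
    grid point depth (80 m), capped at MAX_RATIO; all other points
    use their own ratio, as before."""
    if vs < vs_floor:
        if is_surface:
            ratio = min(ratio_at_80m, MAX_RATIO)
        else:
            ratio = vp / vs
        vs = vs_floor
        vp = vs * ratio
    return vp, vs
```
&lt;br /&gt;
For the s3240 surface point (Vp=1309, Vs=84 at 20m), using the 80m ratio of 1776/471 = 3.77 gives an adjusted Vp of about 1508 m/s instead of 6233 m/s.&lt;br /&gt;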
&lt;br /&gt;
Below is a surface Vp plot with the ratio modification.  Note that the areas of previously high Vp have been reduced.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:s3240_modifiedratio_surface_vp.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
=== Candidate Model (RC2) ===&lt;br /&gt;
&lt;br /&gt;
Candidate model RC2 is generated using the following procedure:&lt;br /&gt;
#Tiling with:&lt;br /&gt;
##USGS SFCVM v21.1&lt;br /&gt;
##CCA-06&lt;br /&gt;
##1D background model, derived as an extension of the Sierra section of the SFCVM model and described [[NC1D|here]].&lt;br /&gt;
#Surface points are populated using a depth of (grid spacing)/4, which is 20m for these meshes.&lt;br /&gt;
#An Ely-Jordan taper is applied to the top 700m across all models using Vs30 values from Thompson et al. (2020).&lt;br /&gt;
#Application of a Vs floor of 400 m/s, using the procedure outlined in [[CyberShake_Study_24.8#Vp/Vs Ratio Adjustment]].&lt;br /&gt;
#Smoothing is applied within 20km of a model boundary.&lt;br /&gt;
&lt;br /&gt;
Vs plots:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Surface (20m) !! 80m !! 160m !! 320m !! 640m !! 2000m !! 4000m !! 10000m&lt;br /&gt;
|-&lt;br /&gt;
| [[File:8_5_s3446_20m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:8_5_s3446_80m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:8_5_s3446_160m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:8_5_s3446_320m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:8_5_s3446_640m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:8_5_s3446_2000m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:8_5_s3446_4000m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
| [[File:8_5_s3446_10000m_smooth_RC2_vs.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Surface Vp plot:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:8_5_s3446_20m_smooth_RC2_vp.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Taper impact ===&lt;br /&gt;
&lt;br /&gt;
Below are velocity profiles for the 20 stress test sites, with and without the near-surface taper.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! Without taper !! With taper (top 700m) !! Vs overlay&lt;br /&gt;
|-&lt;br /&gt;
! ALBY&lt;br /&gt;
| [[File:ALBY_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:ALBY_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:ALBY_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! BLMT&lt;br /&gt;
| [[File:BLMT_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:BLMT_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:BLMT_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CFCS&lt;br /&gt;
| [[File:CFCS_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:CFCS_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:CFCS_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSU1&lt;br /&gt;
| [[File:CSU1_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:CSU1_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:CSU1_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSUEB&lt;br /&gt;
| [[File:CSUEB_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:CSUEB_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:CSUEB_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! DALY&lt;br /&gt;
| [[File:DALY_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:DALY_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:DALY_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! HAYW&lt;br /&gt;
| [[File:HAYW_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:HAYW_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:HAYW_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! LICK&lt;br /&gt;
| [[File:LICK_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:LICK_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:LICK_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:LVMR_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:LVMR_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! MSRA&lt;br /&gt;
| [[File:MSRA_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:MSRA_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:MSRA_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! NAPA&lt;br /&gt;
| [[File:NAPA_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:NAPA_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:NAPA_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! PTRY&lt;br /&gt;
| [[File:PTRY_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:PTRY_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:PTRY_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:s3171_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:s3171_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3240&lt;br /&gt;
| [[File:s3240_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:s3240_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:s3240_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3446&lt;br /&gt;
| [[File:s3446_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:s3446_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SFRH&lt;br /&gt;
| [[File:SFRH_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SFRH_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SFRH_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SSFO&lt;br /&gt;
| [[File:SSFO_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SSFO_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SSFO_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SSOL&lt;br /&gt;
| [[File:SSOL_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SSOL_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SSOL_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SRSA&lt;br /&gt;
| [[File:SRSA_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SRSA_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SRSA_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_notaper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SJO_taper_profile.png|thumb|400px]]&lt;br /&gt;
| [[File:SJO_overlay_profile.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Strain Green Tensors ==&lt;br /&gt;
&lt;br /&gt;
We will use the HIP implementation of AWP-ODC-SGT, with the following parameters:&lt;br /&gt;
*Grid spacing: 80 m&lt;br /&gt;
*DT: 0.004 sec&lt;br /&gt;
*NT: 50000 timesteps by default, increased to 75000 for sites with any site-to-hypocenter distance greater than 450 km.&lt;br /&gt;
*Minimum Vs: 400 m/s&lt;br /&gt;
&lt;br /&gt;
SGTs will be saved with a time decimation of 10, so every 0.04 sec.&lt;br /&gt;
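These parameters imply the following timing (a simple arithmetic check, not production code):&lt;br /&gt;
&lt;br /&gt;
```python
dt = 0.004            # simulation timestep, sec
nt_default = 50000    # default number of timesteps
nt_extended = 75000   # for site-to-hypocenter distances over 450 km
decimation = 10       # SGT output time decimation

sim_time_default = nt_default * dt    # 200 s of simulated time
sim_time_extended = nt_extended * dt  # 300 s for distant sites
sgt_output_dt = dt * decimation       # SGTs saved every 0.04 s
```
&lt;br /&gt;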
&lt;br /&gt;
== Vertical Component ==&lt;br /&gt;
&lt;br /&gt;
We will include vertical (Z) component seismograms in this study.  To support this, we will produce Z-component SGTs and include Z-component synthesis in the post-processing and broadband stages.&lt;br /&gt;
&lt;br /&gt;
Verification that the Z component codes are working correctly is documented at [[Vertical component verification]].&lt;br /&gt;
&lt;br /&gt;
== Rupture Generator ==&lt;br /&gt;
&lt;br /&gt;
We will use the same version of the rupture generator that we used for Study 22.12, v5.5.2.&lt;br /&gt;
&lt;br /&gt;
To match the SGT timestep, SRFs will be generated with dt=0.04s, deterministic seismograms will be output with dt=0.04s, and broadband seismograms will use dt=0.01s.&lt;br /&gt;
&lt;br /&gt;
== High-frequency codes ==&lt;br /&gt;
&lt;br /&gt;
We will use the Graves &amp;amp; Pitarka high-frequency codes from the BBP v22.4.&lt;br /&gt;
&lt;br /&gt;
However, since we are using a denser mesh (80m) and a lower minimum Vs (400 m/s), we will not apply site correction to the low-frequency seismograms before combining.  Rob states, &amp;lt;i&amp;gt;You may recall that there is a klugy process we have used to estimate the Vref value based on the Vsmin and grid spacing of the model. But, it has only been applied for the case of h=100 m and Vsmin=500 m/s. What I found here is that estimating Vref using the 80m &amp;amp; 400m/s model is that Vref is almost always less than Vs30 (i.e., Vsite). This means that when the site adjustment is applied, the motions are deamplified. This is why we see such a poor fit for the 3D case when using the estimated Vref values.  My conclusion at this point is that if we run the 3D calculation with 80 m grid spacing and Vsmin of 400 m/s (or lower), then we probably do not need to apply any site adjustments.&amp;lt;/i&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Validation ==&lt;br /&gt;
&lt;br /&gt;
We performed [[Loma Prieta Validation | validation runs for the Loma Prieta BBP event]].&lt;br /&gt;
&lt;br /&gt;
== Hazard Curve Tests ==&lt;br /&gt;
&lt;br /&gt;
We are calculating test hazard curves for the following sites:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_24.6_test_sites_map.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Results using velocity model RC1 ===&lt;br /&gt;
&lt;br /&gt;
==== Low-frequency curves ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 2 sec !! 3 sec !! 5 sec !! 10 sec !! Vertical profile&lt;br /&gt;
|-&lt;br /&gt;
! s3446&lt;br /&gt;
| [[File:s3446_10676_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:s3446_10676_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3240&lt;br /&gt;
| [[File:s3240_10677_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10677_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10677_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10677_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_vert_profile.png|thumb|250px]]&lt;br /&gt;
|- &lt;br /&gt;
|&lt;br /&gt;
| [[File:s3240_10677_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10677_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10677_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10677_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! ALBY&lt;br /&gt;
| [[File:ALBY_10678_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10678_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10678_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10678_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:ALBY_10678_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10678_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10678_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10678_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_10683_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10683_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10683_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10683_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:SJO_10683_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10683_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10683_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10683_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CFCS&lt;br /&gt;
| [[File:CFCS_10684_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10684_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10684_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10684_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:CFCS_10684_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10684_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10684_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10684_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_10685_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10685_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10685_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10685_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:s3171_10685_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10685_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10685_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10685_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSUEB&lt;br /&gt;
| [[File:CSUEB_10687_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10687_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10687_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10687_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:CSUEB_10687_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10687_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10687_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10687_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSU1&lt;br /&gt;
| [[File:CSU1_10686_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10686_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10686_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10686_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:CSU1_10686_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10686_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10686_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10686_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SFRH&lt;br /&gt;
| [[File:SFRH_10681_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10681_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10681_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10681_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:SFRH_10681_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10681_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10681_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10681_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_10682_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10682_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10682_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10682_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:LVMR_10682_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10682_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10682_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10682_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! HAYW&lt;br /&gt;
| [[File:HAYW_10679_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10679_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10679_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10679_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_vert_profile.png|thumb|250px]] &lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:HAYW_10679_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10679_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10679_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10679_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== High-frequency curves ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 1 sec !! 0.5 sec !! 0.2 sec !! 0.1 sec !! Vertical profile&lt;br /&gt;
|-&lt;br /&gt;
! s3446&lt;br /&gt;
| [[File:s3446_10676_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_10676_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3240&lt;br /&gt;
| [[File:s3240_10689_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10689_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10689_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10689_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_vert_profile.png|thumb|250px]]&lt;br /&gt;
|- &lt;br /&gt;
! ALBY&lt;br /&gt;
| [[File:ALBY_10694_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10694_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10694_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10694_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_10695_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10695_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10695_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10695_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CFCS&lt;br /&gt;
| [[File:CFCS_10693_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10693_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10693_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10693_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_10688_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10688_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10688_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10688_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSUEB&lt;br /&gt;
| [[File:CSUEB_10691_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10691_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10691_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10691_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSU1&lt;br /&gt;
| [[File:CSU1_10692_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10692_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10692_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10692_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! SFRH&lt;br /&gt;
| [[File:SFRH_10690_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10690_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10690_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10690_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_10696_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10696_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10696_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10696_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! HAYW&lt;br /&gt;
| [[File:HAYW_10697_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10697_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10697_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10697_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_vert_profile.png|thumb|250px]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Results using velocity model RC2 ===&lt;br /&gt;
&lt;br /&gt;
==== Low-frequency curves ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 2 sec !! 3 sec !! 5 sec !! 10 sec !! Vertical profile&lt;br /&gt;
|-&lt;br /&gt;
! s3240&lt;br /&gt;
| [[File:s3240_10708_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10708_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10708_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10708_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|- &lt;br /&gt;
|&lt;br /&gt;
| [[File:s3240_10708_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10708_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10708_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10708_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! ALBY&lt;br /&gt;
| [[File:ALBY_10709_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10709_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10709_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10709_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:ALBY_10709_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10709_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10709_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10709_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_10710_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10710_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10710_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10710_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:SJO_10710_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10710_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10710_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10710_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CFCS&lt;br /&gt;
| [[File:CFCS_10711_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10711_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10711_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10711_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:CFCS_10711_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10711_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10711_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10711_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_10712_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10712_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10712_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10712_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:s3171_10712_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10712_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10712_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10712_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSUEB&lt;br /&gt;
| [[File:CSUEB_10713_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10713_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10713_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10713_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:CSUEB_10713_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10713_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10713_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10713_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSU1&lt;br /&gt;
| [[File:CSU1_10714_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10714_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10714_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10714_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:CSU1_10714_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10714_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10714_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10714_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SFRH&lt;br /&gt;
| [[File:SFRH_10715_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10715_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10715_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10715_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:SFRH_10715_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10715_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10715_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10715_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_10715_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10715_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10715_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10715_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:LVMR_10715_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10715_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10715_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10715_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! HAYW&lt;br /&gt;
| [[File:HAYW_10717_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10717_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10717_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10717_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_RC2_vert_profile.png|thumb|250px]] &lt;br /&gt;
|-&lt;br /&gt;
|&lt;br /&gt;
| [[File:HAYW_10717_2sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10717_3sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10717_5sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10717_10sec_RotD50_log.png|thumb|300px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Comparisons with Study 18.8 ====&lt;br /&gt;
&lt;br /&gt;
Study 18.8 curves are in red, and new curves are in black.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 2 sec !! 3 sec !! 5 sec !! 10 sec !! Vertical profile&lt;br /&gt;
|-&lt;br /&gt;
! s3240&lt;br /&gt;
| [[File:s3240_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! ALBY&lt;br /&gt;
| [[File:ALBY_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CFCS&lt;br /&gt;
| [[File:CFCS_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSUEB&lt;br /&gt;
| [[File:CSUEB_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSU1&lt;br /&gt;
| [[File:CSU1_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! SFRH&lt;br /&gt;
| [[File:SFRH_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! HAYW&lt;br /&gt;
| [[File:HAYW_Aug9_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_Aug9_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_Aug9_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_Aug9_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_RC2_vert_profile.png|thumb|250px]] &lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== High-frequency curves ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 1 sec !! 0.5 sec !! 0.2 sec !! 0.1 sec !! Vertical profile&lt;br /&gt;
|-&lt;br /&gt;
! s3240&lt;br /&gt;
| [[File:s3240_10724_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10724_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10724_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_10724_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_vert_profile.png|thumb|250px]]&lt;br /&gt;
|- &lt;br /&gt;
! ALBY&lt;br /&gt;
| [[File:ALBY_10725_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10725_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10725_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_10725_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_10723_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10723_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10723_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_10723_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CFCS&lt;br /&gt;
| [[File:CFCS_10718_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10718_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10718_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_10718_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_10719_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10719_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10719_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_10719_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSUEB&lt;br /&gt;
| [[File:CSUEB_10720_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10720_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10720_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_10720_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSU1&lt;br /&gt;
| [[File:CSU1_10721_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10721_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10721_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_10721_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! SFRH&lt;br /&gt;
| [[File:SFRH_10722_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10722_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10722_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_10722_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_10726_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10726_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10726_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_10726_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! HAYW&lt;br /&gt;
| [[File:HAYW_10727_1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10727_0.5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10727_0.2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_10727_0.1sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_vert_profile.png|thumb|250px]] &lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Updates and Enhancements ==&lt;br /&gt;
&lt;br /&gt;
*Used a smaller study region than in Study 18.8.&lt;br /&gt;
*Removed southern San Andreas events and created a new ERF.&lt;br /&gt;
&lt;br /&gt;
== Output Data Products ==&lt;br /&gt;
&lt;br /&gt;
=== File-based data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to produce the following data products, which will be stored at CARC:&lt;br /&gt;
&lt;br /&gt;
==== Deterministic ====&lt;br /&gt;
&lt;br /&gt;
*Seismograms: 3-component seismograms, 10000 timesteps (400 sec, dt=0.04s) each.&lt;br /&gt;
*PSA: We are removing geometric mean PSA calculations from this study.&lt;br /&gt;
*RotD: PGV, and RotD50, RotD100, and the RotD100 azimuth at 27 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1)&lt;br /&gt;
*Vertical response spectra at 27 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1).&lt;br /&gt;
*Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.  Also, period-dependent acceleration 5-75%, 5-95%, and 20-80% for 27 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1).&lt;br /&gt;
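The RotD quantities above can be illustrated with a short sketch that projects the two horizontal components onto each azimuth and takes a percentile of the per-azimuth peaks.  This is a simplification (it operates on the raw time series, whereas the study first computes the oscillator response at each period), and the function and variable names are ours, not CyberShake code:&lt;br /&gt;

```python
import numpy as np

def rotd(ax, ay, percentile):
    """RotD intensity from two orthogonal horizontal time series: project
    the motion onto each azimuth from 0-179 degrees, record the peak
    absolute amplitude, and return the requested percentile of those
    peaks plus the azimuth (degrees) of the largest peak."""
    angles = np.deg2rad(np.arange(180))
    peaks = np.array([np.max(np.abs(ax * np.cos(a) + ay * np.sin(a)))
                      for a in angles])
    return np.percentile(peaks, percentile), int(np.argmax(peaks))

# toy input: purely linear motion along the 30-degree azimuth
t = np.linspace(0, 10, 1001)
ax = np.cos(np.deg2rad(30)) * np.sin(2 * np.pi * t)
ay = np.sin(np.deg2rad(30)) * np.sin(2 * np.pi * t)
rotd50, _ = rotd(ax, ay, 50)          # median over azimuths
rotd100, azimuth = rotd(ax, ay, 100)  # maximum over azimuths
```

For this toy input RotD100 is 1.0 at azimuth 30, and RotD50 is cos(45 degrees) of that, about 0.707.&lt;br /&gt;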
&lt;br /&gt;
==== Broadband ====&lt;br /&gt;
&lt;br /&gt;
*Seismograms: 3-component seismograms, 40000 timesteps (400 sec, dt=0.01s) each.&lt;br /&gt;
*PSA: We are removing geometric mean PSA calculations from this study.&lt;br /&gt;
*RotD: PGA, PGV, and RotD50, RotD100, and the RotD100 azimuth at 68 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1, 0.85, 0.75, 0.65, 0.6, 0.55, 0.5, 0.45, 0.4, 0.35, 0.3, 0.28, 0.26, 0.24, 0.22, 0.2, 0.17, 0.15, 0.13, 0.12, 0.11, 0.1, 0.085, 0.075, 0.065, 0.06, 0.055, 0.05, 0.045, 0.04, 0.035, 0.032, 0.029, 0.025, 0.022, 0.02, 0.017, 0.015, 0.013, 0.012, 0.011, 0.01)&lt;br /&gt;
*Vertical response spectra at 68 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1, 0.85, 0.75, 0.65, 0.6, 0.55, 0.5, 0.45, 0.4, 0.35, 0.3, 0.28, 0.26, 0.24, 0.22, 0.2, 0.17, 0.15, 0.13, 0.12, 0.11, 0.1, 0.085, 0.075, 0.065, 0.06, 0.055, 0.05, 0.045, 0.04, 0.035, 0.032, 0.029, 0.025, 0.022, 0.02, 0.017, 0.015, 0.013, 0.012, 0.011, 0.01).&lt;br /&gt;
*Durations: for X and Y components, energy integral, Arias intensity, cumulative absolute velocity (CAV), and for both velocity and acceleration, 5-75%, 5-95%, and 20-80%.  Also, period-dependent acceleration 5-75%, 5-95%, and 20-80% for 68 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1, 0.85, 0.75, 0.65, 0.6, 0.55, 0.5, 0.45, 0.4, 0.35, 0.3, 0.28, 0.26, 0.24, 0.22, 0.2, 0.17, 0.15, 0.13, 0.12, 0.11, 0.1, 0.085, 0.075, 0.065, 0.06, 0.055, 0.05, 0.045, 0.04, 0.035, 0.032, 0.029, 0.025, 0.022, 0.02, 0.017, 0.015, 0.013, 0.012, 0.011, 0.01)&lt;br /&gt;
&lt;br /&gt;
=== Database data products ===&lt;br /&gt;
&lt;br /&gt;
We plan to store the following data products in the database on moment-carc:&lt;br /&gt;
&lt;br /&gt;
==== Deterministic ====&lt;br /&gt;
&lt;br /&gt;
*RotD50 for 6 periods (10, 7.5, 5, 4, 3, 2).  Note that we are NOT storing RotD100.&lt;br /&gt;
*Duration: acceleration 5-75% and 5-95% for both X and Y&lt;br /&gt;
&lt;br /&gt;
==== Broadband ====&lt;br /&gt;
&lt;br /&gt;
*RotD50 for PGA, PGV, and 19 periods (10, 7.5, 5, 4, 3, 2, 1, 0.75, 0.5, 0.4, 0.3, 0.2, 0.1, 0.075, 0.05, 0.04, 0.03, 0.02, 0.01).  Note that we are NOT storing RotD100.&lt;br /&gt;
&lt;br /&gt;
=== Hazard products ===&lt;br /&gt;
&lt;br /&gt;
For each site, we will produce hazard curves from the deterministic results at 10, 5, 3, and 2 seconds, and from the broadband results at 10, 5, 3, 2, 1, 0.5, 0.2, and 0.1 seconds.&lt;br /&gt;
&lt;br /&gt;
When the study is complete, we will produce maps from the deterministic results at 10, 5, 3, and 2 seconds, and from the broadband results at 10, 5, 3, 2, 1, 0.5, 0.2, and 0.1 seconds.&lt;br /&gt;
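The hazard-curve step can be sketched as follows: at each intensity level, each rupture contributes its annual probability times the fraction of its rupture variations that exceed the level.  This is a simplified illustration with assumed function names and a flat data layout, not the study's actual hazard code:&lt;br /&gt;

```python
import numpy as np

def hazard_curve(im_levels, rup_probs, im_samples):
    """Annual exceedance probability at each intensity measure level.

    rup_probs[r] is rupture r's annual probability; im_samples[r] holds
    the intensity measures of its rupture variations, which are treated
    as equally likely."""
    curve = []
    for x in im_levels:
        p_no_exceed = 1.0
        for p_r, ims in zip(rup_probs, im_samples):
            frac = np.mean(np.asarray(ims) > x)  # P(IM > x | rupture r)
            p_no_exceed *= 1.0 - p_r * frac
        curve.append(1.0 - p_no_exceed)
    return np.array(curve)

# one rupture, annual probability 0.01, half its variations above 2.5
curve = hazard_curve([2.5], [0.01], [[1.0, 2.0, 3.0, 4.0]])
```

For this single-rupture example, the exceedance probability at the 2.5 level is 0.01 x 0.5 = 0.005.&lt;br /&gt;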
&lt;br /&gt;
=== Data products after the study ===&lt;br /&gt;
&lt;br /&gt;
After the study completes, we plan to compute Fourier spectra for all 3 components:&lt;br /&gt;
&lt;br /&gt;
*For deterministic, at 27 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1).&lt;br /&gt;
*For broadband at 68 periods (20, 17, 15, 13, 12, 10, 8.5, 7.5, 6.5, 6, 5.5, 5, 4.4, 4, 3.5, 3, 2.8, 2.6, 2.4, 2.2, 2, 1.7, 1.5, 1.3, 1.2, 1.1, 1, 0.85, 0.75, 0.65, 0.6, 0.55, 0.5, 0.45, 0.4, 0.35, 0.3, 0.28, 0.26, 0.24, 0.22, 0.2, 0.17, 0.15, 0.13, 0.12, 0.11, 0.1, 0.085, 0.075, 0.065, 0.06, 0.055, 0.05, 0.045, 0.04, 0.035, 0.032, 0.029, 0.025, 0.022, 0.02, 0.017, 0.015, 0.013, 0.012, 0.011, 0.01)&lt;br /&gt;
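A minimal sketch of such a Fourier-spectra calculation, assuming amplitudes are interpolated from an FFT of the seismogram at the frequencies corresponding to the listed periods (function names and details are ours; the study's actual post-processing code may differ):&lt;br /&gt;

```python
import numpy as np

def fourier_amplitudes(accel, dt, periods):
    """Fourier amplitude spectrum sampled at the requested periods,
    interpolated from an FFT of the time series."""
    freqs = np.fft.rfftfreq(len(accel), dt)
    amps = np.abs(np.fft.rfft(accel)) * dt
    return np.interp(1.0 / np.asarray(periods), freqs, amps)

# toy check: a 1 Hz sine should show a peak at the 1 s period
dt = 0.01
t = np.arange(4000) * dt                  # 40 s, an integer number of cycles
accel = np.sin(2 * np.pi * 1.0 * t)
amp_1s, amp_10s = fourier_amplitudes(accel, dt, [1.0, 10.0])
```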
&lt;br /&gt;
== Computational and Data Estimates ==&lt;br /&gt;
&lt;br /&gt;
=== Computational Estimates ===&lt;br /&gt;
&lt;br /&gt;
We based these estimates on the test sites.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+SGT calculation&lt;br /&gt;
! !! UCVM runtime !! UCVM nodes !! SGT runtime (3 components) !! SGT nodes !! Other SGT workflow jobs !! SGT Total&lt;br /&gt;
|-&lt;br /&gt;
! Average of 11 test sites&lt;br /&gt;
| 761 sec || 96 || 7134 sec || 100 || 7200 node-sec || 220 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
220 node-hrs/site x 315 sites + 10% overrun = 76,230 node-hours for SGT workflows on Frontier.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+PP calculation&lt;br /&gt;
! !! DirectSynth runtime !! DirectSynth nodes !! Additional runtime for period-dependent calculation !! PP Total&lt;br /&gt;
|-&lt;br /&gt;
! 75th percentile of 10 test sites&lt;br /&gt;
| 13914 sec || 60 || 61938 core-sec || 232.2 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+BB calculation&lt;br /&gt;
! !! BB runtime !! BB nodes !! Additional runtime for period-dependent calculation !! BB Total&lt;br /&gt;
|-&lt;br /&gt;
! 75th percentile of 10 test sites&lt;br /&gt;
| 8019 sec || 40 || 61938 core-sec || 89.4 node-hrs&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
(232.2 + 89.4) node-hrs/site x 315 sites + 10% overrun = approximately 111,400 node-hours for PP and BB calculations on Frontera.&lt;br /&gt;
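The node-hour budgeting above can be reproduced with a small helper (a sketch; the function is ours, with per-site costs taken from the tables above):&lt;br /&gt;

```python
def budget(node_hrs_per_site, n_sites=315, overrun=0.10):
    """Total node-hours for the study: per-site cost times site count,
    plus a contingency fraction for overruns."""
    return node_hrs_per_site * n_sites * (1.0 + overrun)

sgt_total = budget(220)              # Frontier SGT workflows
pp_bb_total = budget(232.2 + 89.4)   # Frontera PP + BB workflows
```

budget(220) reproduces the 76,230 node-hour figure for the SGT workflows.&lt;br /&gt;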
&lt;br /&gt;
=== Data Estimates ===&lt;br /&gt;
&lt;br /&gt;
We use the 75th percentile estimate of 206,500 rupture variations.&lt;br /&gt;
&lt;br /&gt;
We are generating 3-component SGTs and 3-component seismograms, with 10k timesteps (400 sec) for LF and 40k timesteps (400 sec) for BB.&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot;&lt;br /&gt;
|+Data estimates&lt;br /&gt;
! !! Velocity mesh !! SGTs size !! Temp data !! LF Output data !! BB Output data !! Total output data&lt;br /&gt;
|-&lt;br /&gt;
! Per site, derived from 10 site average (GB)&lt;br /&gt;
| 359 || 1131 || 1490 || 23.7 || 93.6 || 117.3&lt;br /&gt;
|-&lt;br /&gt;
! Total for 315 sites (TB)&lt;br /&gt;
| 110 || 348 || 458 || 7.3 || 28.8 || 36.1&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CARC ====&lt;br /&gt;
&lt;br /&gt;
We estimate 117.3 GB/site x 315 sites = 36.1 TB in output data, which will be transferred back to CARC.  We currently have 29 TB free.&lt;br /&gt;
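As a quick check of the totals (assuming binary prefixes, 1 TB = 1024 GB, which matches the rounding in the table above):&lt;br /&gt;

```python
per_site_gb = 23.7 + 93.6            # LF + BB output per site, from the table
total_tb = per_site_gb * 315 / 1024  # total output across all sites
```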
&lt;br /&gt;
==== shock-carc ====&lt;br /&gt;
&lt;br /&gt;
We estimate (3 MB SGT + 32 MB PP + 219 MB BB logs + 3 MB output products) x 315 sites = 79 GB in workflow log space on /home/shock.  This drive has approximately 1.2 TB free.&lt;br /&gt;
&lt;br /&gt;
==== moment-carc database ====&lt;br /&gt;
&lt;br /&gt;
The PeakAmplitudes table uses approximately 183 bytes for data + 179 bytes for index = 362 bytes per entry.&lt;br /&gt;
&lt;br /&gt;
362 bytes/entry * 35 entries/event (10 det + 25 stoch) * 206,500 events/site * 315 sites = 768 GB.  The drive on moment-carc with the mysql database has 6.6 TB free.&lt;br /&gt;
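The same estimate in code form (1 GB taken as 1024^3 bytes to match the rounding above):&lt;br /&gt;

```python
bytes_per_entry = 183 + 179          # data + index, approximately
entries_per_site = 35 * 206_500      # 35 IMs/event (10 det + 25 stoch) x events/site
total_gb = bytes_per_entry * entries_per_site * 315 / 1024**3
```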
&lt;br /&gt;
== Lessons Learned ==&lt;br /&gt;
*The SRFs used in this study weren't quite what was intended; they had subfault relocation on and used a different risetime.  Consider running an SRF check before starting the study.  More details about the subfault perturbation issue are given on the [[CyberShake_Study_22.12#Subfault_Perturbation | Study 22.12 wiki page]].&lt;br /&gt;
*The runtime requested for post-processing jobs was higher than what was needed, increasing queue times for a while.&lt;br /&gt;
*The scottcal account went over quota on /home at Frontera, which caused problems writing rvGAHP files.&lt;br /&gt;
*Due to the lack of support for outgoing connections, there was not an easy way to bundle jobs on Frontier and we ran the SGT jobs individually.&lt;br /&gt;
*Due to an issue with migrating the Study 22.12 changes back to the main Github branch, FP was being overwritten as 0.5 instead of using the 1.0 value from the IN3D file.&lt;br /&gt;
&lt;br /&gt;
== Stress Test ==&lt;br /&gt;
&lt;br /&gt;
For the stress test, we will run the first 20 sites and check for scientific and technical issues.&lt;br /&gt;
&lt;br /&gt;
Usage before the stress test:&lt;br /&gt;
*On Frontier, project GEO156 had used 587,856 of 700,000 node-hours; callag had used 22,166 node-hours.&lt;br /&gt;
*On Frontera, project EAR20006 had used 316,715 of 600,000 node-hours; scottcal had used 11,274 node-hours.&lt;br /&gt;
&lt;br /&gt;
The stress test began on 8/27/24 at 11:25:23 PDT.&lt;br /&gt;
&lt;br /&gt;
To help us finish the stress test before the SCEC AM, TACC granted us a Frontera reservation for 200 nodes for 8 days, beginning on 8/28/24 at 8 am PDT.  We didn't realize we needed to specify the --reservation tag, so we began utilizing it around 9 am PDT.&lt;br /&gt;
&lt;br /&gt;
We gave the reservation back on the morning of 8/30.  After the stress test, callag had used 32,660 node-hours on Frontier and scottcal had used 18,493 node-hours on Frontera.&lt;br /&gt;
&lt;br /&gt;
Stress test computational cost:&lt;br /&gt;
&lt;br /&gt;
*SGTs: 10494/20 sites = 524.7 node-hours per site, about 2.4x what we estimated.  This was mostly due to using meshes with a -55 degree rotation angle, which produced larger meshes requiring more computation time.&lt;br /&gt;
*PP and BB: 370.0 node-hours per site, about 15% more than we estimated.&lt;br /&gt;
&lt;br /&gt;
=== Changes from stress test ===&lt;br /&gt;
&lt;br /&gt;
Based on the stress test, we made the following changes:&lt;br /&gt;
*Reduced ramp-up time for DirectSynth worker processes.&lt;br /&gt;
*Fixed issue with PGV writing to files in the BB codes.&lt;br /&gt;
*Changed mesh angle from -55 to -36 degrees.&lt;br /&gt;
*Changed smoothing zone from 10 km on either side of the boundary to 20 km.&lt;br /&gt;
*Fixed issue with vertical response and period duration files not being automatically transferred.&lt;br /&gt;
*Discovered an issue with accessing moment-carc through the USC VPN; confirmed that neither Xiaofeng nor Kevin can access it.&lt;br /&gt;
&lt;br /&gt;
=== Stress test results ===&lt;br /&gt;
&lt;br /&gt;
Below are hazard curves for the 20 test sites in the stress test.  &lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:study_24_8_stress_test_sites.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
A KML file with the stress test sites is available [[Media:Study_24_8_stress_test_sites.kml|here]].&lt;br /&gt;
&lt;br /&gt;
==== Low-frequency ====&lt;br /&gt;
&lt;br /&gt;
These curves (in black) include comparisons with Study 18.8 (in red).&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! 2 sec !! 3 sec !! 5 sec !! 10 sec &lt;br /&gt;
|-&lt;br /&gt;
! s3240&lt;br /&gt;
| [[File:s3240_r10730_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_r10730_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_r10730_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3240_r10730_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! ALBY&lt;br /&gt;
| [[File:ALBY_r10731_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_r10731_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_r10731_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:ALBY_r10731_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CFCS&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_r10734_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_r10734_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_r10734_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3171_r10734_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSUEB&lt;br /&gt;
| [[File:CSUEB_r10735_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_r10735_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_r10735_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSUEB_r10735_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! CSU1&lt;br /&gt;
| [[File:CSU1_r10736_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_r10736_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_r10736_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CSU1_r10736_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SFRH&lt;br /&gt;
| [[File:SFRH_r10737_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_r10737_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_r10737_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SFRH_r10737_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_r10738_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_r10738_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_r10738_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LVMR_r10738_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! HAYW&lt;br /&gt;
| [[File:HAYW_r10739_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_r10739_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_r10739_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:HAYW_r10739_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3446&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! NAPA&lt;br /&gt;
| [[File:NAPA_r10741_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:NAPA_r10741_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:NAPA_r10741_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:NAPA_r10741_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SRSA&lt;br /&gt;
| [[File:SRSA_r10742_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SRSA_r10742_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SRSA_r10742_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SRSA_r10742_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! MSRA&lt;br /&gt;
| [[File:MSRA_r10743_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:MSRA_r10743_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:MSRA_r10743_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:MSRA_r10743_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! PTRY&lt;br /&gt;
| [[File:PTRY_r10744_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:PTRY_r10744_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:PTRY_r10744_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:PTRY_r10744_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! DALY&lt;br /&gt;
| [[File:DALY_r10745_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:DALY_r10745_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:DALY_r10745_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:DALY_r10745_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SSOL&lt;br /&gt;
| [[File:SSOL_r10746_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SSOL_r10746_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SSOL_r10746_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SSOL_r10746_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! LICK&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SSFO&lt;br /&gt;
| [[File:SSFO_r10748_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SSFO_r10748_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SSFO_r10748_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SSFO_r10748_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! BLMT&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Sites with notable differences ====&lt;br /&gt;
&lt;br /&gt;
We identified 3 sites with notable differences between Study 24.8 and Study 18.8:&lt;br /&gt;
*LVMR is higher at 10 sec&lt;br /&gt;
*CFCS is higher at 2, 3, and 5 sec&lt;br /&gt;
*s3171 is lower at 3, 5, and 10 sec&lt;br /&gt;
&lt;br /&gt;
We believe the difference in LVMR is due to the expanded basin added in SFCVM v21.1, and s3171 can be explained by the change in profile as well:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! Profile, SFCVM v21.1 !! Profile, Cencal model used in Study 18.8&lt;br /&gt;
|-&lt;br /&gt;
! LVMR&lt;br /&gt;
| [[File:LVMR_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
| [[File:LVMR_cencal_vert_profile.png|thumb|250px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3171&lt;br /&gt;
| [[File:s3171_RC2_vert_profile.png|thumb|250px]]&lt;br /&gt;
| [[File:s3171_cencal_vert_profile.png|thumb|250px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
A scatterplot comparing the average 2 sec RotD50 value for each rupture shows that the Study 24.8 results are consistently higher than the 18.8 results for CFCS:&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:CFCS_scatterplot.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
We extracted vertical slices of the top 5000 m from Cencal (floor = 500 m/s) and the Study 24.8 model (taper, floor = 400 m/s) along an east-west line running through CFCS.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Vertical slice, Study 24.8 !! Vertical slice, Cencal&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CFCS_RC2_vert_slice.png|thumb|400px]]&lt;br /&gt;
| [[File:CFCS_cencal_vert_slice.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Corrected Study 18.8 comparisons ====&lt;br /&gt;
&lt;br /&gt;
In an attempt to replicate the Study 18.8 results, we discovered that the Study 18.8 velocity models were tiled in a different order than for Study 24.8: CCA-06, then CenCal, then CVM-S4.26.M01, as described [[CyberShake_Study_18.8#Configuration_for_this_study | here]].  Several of the stress test sites fall within the 20 km smoothing zone surrounding the Study 18.8 CCA-06/Cencal interface (interface in blue, 20km zone in white):&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Study_18_8_20km_smoothing_region_stress_test_sites.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
We compared the velocity profiles of Study 24.8 and the correct Study 18.8 configuration for the 5 stress test sites.  For CFCS, this explains the sharp increase in hazard.  The Study 18.8 profile is in blue, and the Study 24.8 profile is in orange.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! Vs overlay, top 700m !! 2 sec curve !! 3 sec curve !! 5 sec curve !! 10 sec curve&lt;br /&gt;
|-&lt;br /&gt;
! CFCS &lt;br /&gt;
| [[File:CFCS_248_188_overlay.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:CFCS_r10733_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! SJO &lt;br /&gt;
| [[File:SJO_248_188_overlay.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:SJO_r10732_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! s3446 &lt;br /&gt;
| [[File:s3446_248_188_overlay.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:s3446_r10740_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! BLMT&lt;br /&gt;
| [[File:BLMT_248_188_overlay.png|thumb|300px]]&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:BLMT_r10749_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
! LICK &lt;br /&gt;
| [[File:LICK_248_188_overlay.png|thumb|300px]]&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_2sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_3sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_5sec_RotD50.png|thumb|300px]]&lt;br /&gt;
| [[File:LICK_r10747_248_v_188_10sec_RotD50.png|thumb|300px]]&lt;br /&gt;
|-&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Impact of rotation angle and smoothing zone ====&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;Based on the below analysis, we will rerun the stress test sites with the intended angle of -36 degrees and a 20 km smoothing zone so that all Study 24.8 results will be consistent.&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
We discovered that the stress test sites were run with a volume rotation angle of -55 degrees and a smoothing zone width of 10 km on either side of the interface.  This is different from our intended rotation angle of -36 degrees and smoothing zone width of 20 km.  To quantify the impact of this, we ran CFCS with the updated values.  Comparisons are below.  The black curves are the stress test run, the blue curves are the corrected run.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! 2 sec !! 3 sec !! 5 sec !! 10 sec&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CFCS_10733_v_10771_2sec_curve.png|thumb|350px]]&lt;br /&gt;
| [[File:CFCS_10733_v_10771_3sec_curve.png|thumb|350px]]&lt;br /&gt;
| [[File:CFCS_10733_v_10771_5sec_curve.png|thumb|350px]]&lt;br /&gt;
| [[File:CFCS_10733_v_10771_10sec_curve.png|thumb|350px]]&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CFCS_10733_v_10771_2sec_scatter.png|thumb|350px]]&lt;br /&gt;
| [[File:CFCS_10733_v_10771_3sec_scatter.png|thumb|350px]]&lt;br /&gt;
| [[File:CFCS_10733_v_10771_5sec_scatter.png|thumb|350px]]&lt;br /&gt;
| [[File:CFCS_10733_v_10771_10sec_scatter.png|thumb|350px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Events During Study ==&lt;br /&gt;
&lt;br /&gt;
The main study began on 9/24/24 at 11:49:59 PDT.&lt;br /&gt;
&lt;br /&gt;
We began with a maximum of 5 workflows on Frontier and 20 on Frontera.  Later on 9/24 we increased the number of Frontier workflows to 15.&lt;br /&gt;
&lt;br /&gt;
On 9/30, the moment-carc database name wasn't being passed to the Check Duration job when finishing PP workflows.  I updated the dax generator to pass the desired database through to this job.&lt;br /&gt;
&lt;br /&gt;
On 9/30, I ran out of /home quota on Frontera.  This was an issue because some of the rvGAHP-related logs are written here, and it resulted in &amp;quot;Failed to start transfer GAHP&amp;quot; errors.  I cleaned up some space on /home and released the jobs.&lt;br /&gt;
&lt;br /&gt;
On 9/30, the merge_pmc jobs were held because they still included the stress test reservation on Frontera.  I removed the reservation string and released the jobs.&lt;br /&gt;
&lt;br /&gt;
On 10/1, I found that the cronjob on Frontier to monitor job usage was removed from the crontab, so I added it back.&lt;br /&gt;
&lt;br /&gt;
On 10/2, I turned off stochastic calculations on Frontera while we investigated the unexpected BB results from the stress test.&lt;br /&gt;
&lt;br /&gt;
On 10/10, the work2 filesystem on Frontera, where the executables and rupture files are hosted, had a problem, causing all the Frontera jobs to be held.  Additionally, login4, the host which connects to shock, was unavailable.  When login4 returned we restarted the rvGAHP process.&lt;br /&gt;
&lt;br /&gt;
On 10/16, the scottcal account was added to the CyberShake DesignSafe project.  This meant that scottcal was now part of 3 allocations, leading to a Slurm error when jobs were submitted without specifying an account, so Condor jobs were held.  I added the EAR20006 account to all Frontera jobs, and had to manually add it for jobs which were already planned.&lt;br /&gt;
&lt;br /&gt;
On 10/16, I realized that DirectSynth jobs were requesting a wallclock time of 10 hours, but these jobs usually take only a bit more than 4.  I changed the requested job length to 5 hours.&lt;br /&gt;
&lt;br /&gt;
On 10/17, I discovered that the PP Error jobs occurred because the corresponding SGT workflow had finished its Update and Handoff jobs, but the stage-out of the SGTs to Frontera then failed because of Globus consents.  The registration job is dependent on stage-out, so it didn't run either.  Then, when the PP workflow tried to plan, it couldn't find a copy of the SGT files on Frontera and aborted.  I couldn't use the standard CyberShake scripts to restart these workflows, since in the database they were already recorded as SGT Generated, so instead I reran the pegasus-run commands in the log-plan* files.&lt;br /&gt;
&lt;br /&gt;
On 10/18, I got an email from TACC staff saying that our jobs were causing too much load on the scratch filesystem, with a request to reduce the number of simultaneous jobs to 4.  I reduced the number of Frontera slots to 4 in the workflow auto-submission tool, and I am following up to learn the specifics of the problem and whether there's a way to fix it.&lt;br /&gt;
&lt;br /&gt;
SGT calculations on Frontier finished on 10/18/24 at 12:59:43 PDT.&lt;br /&gt;
&lt;br /&gt;
On 10/21, the broadband transfers to Corral for Study 22.12 resulted in going over quota on CARC scratch1 again, interfering with the workflows for this study.  I cleaned up scratch1 and fixed the issue.&lt;br /&gt;
&lt;br /&gt;
On 10/28-30, the work2 filesystem and scheduler on Frontera had a number of issues, requiring restarts of the rvGAHP daemon and some work to recover the job state.&lt;br /&gt;
&lt;br /&gt;
=== Renewal of Globus consents ===&lt;br /&gt;
&lt;br /&gt;
Here we track the dates on which we renewed the Globus consents.&lt;br /&gt;
&lt;br /&gt;
10/6&lt;br /&gt;
&lt;br /&gt;
10/9&lt;br /&gt;
&lt;br /&gt;
10/13&lt;br /&gt;
&lt;br /&gt;
10/17&lt;br /&gt;
&lt;br /&gt;
10/24&lt;br /&gt;
&lt;br /&gt;
10/29&lt;br /&gt;
&lt;br /&gt;
11/5&lt;br /&gt;
&lt;br /&gt;
=== Restarts of the rvgahp daemons ===&lt;br /&gt;
&lt;br /&gt;
Here we track the dates on which we restarted the rvgahp daemons on login10@frontier and login4@frontera.&lt;br /&gt;
&lt;br /&gt;
10/9 on Frontera&lt;br /&gt;
&lt;br /&gt;
10/10 on Frontera (node crashed)&lt;br /&gt;
&lt;br /&gt;
10/29 on Frontera (filesystem issues)&lt;br /&gt;
&lt;br /&gt;
10/30 on Frontera (continuing filesystem issues)&lt;br /&gt;
&lt;br /&gt;
11/7 on Frontera&lt;br /&gt;
&lt;br /&gt;
== Performance Metrics ==&lt;br /&gt;
&lt;br /&gt;
At the start of the main study, project geo156 on Frontier had used 609,101 node-hours (of 700,000), and user callag had used 33,398 node-hours.  Project EAR20006 on Frontera had used 12,925.147 node-hours (of 676,000), and user scottcal had used 503.796 node-hours.&lt;br /&gt;
&lt;br /&gt;
At the end of the main study, project geo156 had used 671,656 node-hours, and user callag had used 88,881.2 node-hours.  Project EAR20006 had 511,480 node-hours remaining (out of 676,000), and user scottcal had used 125,411.21 node-hours.&lt;br /&gt;
&lt;br /&gt;
In total, the study used 55483 node-hours on Frontier and 124907 node-hours on Frontera.  This is difficult to translate into core-hours.  Each Frontier node has 64 cores plus 8 GPUs.  Each GPU has 110 compute units, so we will assume each node has 64 + 110*8 = 944 'cores'.  Each Frontera node has 56 cores.  We translate this to 52.4M core-hours on Frontier and 7.0M core-hours on Frontera.&lt;br /&gt;
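The conversion above can be sketched as a back-of-the-envelope check (not part of the workflow code), using the node-hour totals and per-node core counts quoted in this log:&lt;br /&gt;

```python
# Node-hour to core-hour conversion described above, using the
# figures from the log (110 compute units per Frontier GPU).
frontier_node_hours = 55483
frontera_node_hours = 124907

frontier_cores_per_node = 64 + 110 * 8   # CPU cores + GPU compute units = 944
frontera_cores_per_node = 56

print(round(frontier_node_hours * frontier_cores_per_node / 1e6, 1))  # 52.4
print(round(frontera_node_hours * frontera_cores_per_node / 1e6, 1))  # 7.0
```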
&lt;br /&gt;
The study ran from 9/24/24 11:49:59 PDT to 11/8/24 6:15:15 PST, a duration of 1075.4 hrs (44.8 days).&lt;br /&gt;
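The duration above can be reproduced with a quick sketch, assuming fixed UTC offsets for the two timestamps (PDT is UTC-7, PST is UTC-8):&lt;br /&gt;

```python
# Quick check of the study duration reported above, assuming fixed
# UTC offsets (PDT = UTC-7, PST = UTC-8).
from datetime import datetime, timedelta, timezone

pdt = timezone(timedelta(hours=-7))
pst = timezone(timedelta(hours=-8))

start = datetime(2024, 9, 24, 11, 49, 59, tzinfo=pdt)
end = datetime(2024, 11, 8, 6, 15, 15, tzinfo=pst)

hours = (end - start).total_seconds() / 3600
print(round(hours, 1), round(hours / 24, 1))  # 1075.4 44.8
```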
&lt;br /&gt;
Our high-water marks were 4,130 nodes on Frontier (44%) and 1,029 nodes on Frontera (12%).  Neither used a reservation or priority bump.&lt;br /&gt;
&lt;br /&gt;
Pegasus managed about 1 PB of data.&lt;br /&gt;
&lt;br /&gt;
We staged about 9 million output files totaling 36 TB back to long-term storage at USC CARC.&lt;br /&gt;
&lt;br /&gt;
We ran 945 top-level workflows, with an average of 15 running at a time.&lt;br /&gt;
&lt;br /&gt;
We ran approximately 27,720 jobs.  Of these, 7,245 were remote, 8,190 were local, 5,985 were directory creation, 5,355 were staging, and 945 were registration.&lt;br /&gt;
&lt;br /&gt;
We calculated 126,818,608 three-component seismograms (half low-frequency, half broadband), 11.5 billion low-frequency intensity measures (IMs), and 22.8 billion broadband IMs.&lt;br /&gt;
&lt;br /&gt;
The total number of rupture variations was 63,409,304.&lt;br /&gt;
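The seismogram and rupture-variation counts above are consistent with each other, since each rupture variation yields one low-frequency and one broadband three-component seismogram:&lt;br /&gt;

```python
# Consistency check for the counts above: each rupture variation
# produces one low-frequency and one broadband seismogram.
rupture_variations = 63409304
seismograms = 2 * rupture_variations
print(seismograms)  # 126818608
```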
&lt;br /&gt;
== Production Checklist ==&lt;br /&gt;
&lt;br /&gt;
*&amp;lt;s&amp;gt;Run test workflow against moment-carc.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Investigate velocity profile differences with Mei's plots (s3171 and HAYW)&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Wrap period-dependent duration code and test in PP.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Wrap period-dependent duration code and test in BB.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Integrate period-dependent duration code into workflows.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Update data and compute estimates.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Determine where to copy output data.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify workflows to remove double-staging of SGTs to Frontera.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Schedule and hold readiness reviews with TACC and OLCF.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Modify workflows to only insert RotD50, not RotD100.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Science readiness review&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Technical readiness review&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Verify that vertical component response is being calculated correctly.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Set up cronjobs to monitor usage.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Solve issue with putting velocity parameters into the database from Frontier.&amp;lt;/s&amp;gt;&lt;br /&gt;
*&amp;lt;s&amp;gt;Tag code in github repository&amp;lt;/s&amp;gt;&lt;br /&gt;
&lt;br /&gt;
== Presentations, Posters, and Papers ==&lt;br /&gt;
&lt;br /&gt;
== References ==&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-dist.png&amp;diff=30453</id>
		<title>File:CS-gof-LOMAP-2025050731-dist.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-dist.png&amp;diff=30453"/>
		<updated>2025-09-26T16:36:11Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: Scottcal uploaded a new version of File:CS-gof-LOMAP-2025050731-dist.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-dist.png&amp;diff=30452</id>
		<title>File:CS-gof-LOMAP-2025050731-dist.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-dist.png&amp;diff=30452"/>
		<updated>2025-09-26T16:34:11Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: Scottcal uploaded a new version of File:CS-gof-LOMAP-2025050731-dist.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-map.png&amp;diff=30451</id>
		<title>File:CS-gof-LOMAP-2025050731-map.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-map.png&amp;diff=30451"/>
		<updated>2025-09-26T16:32:56Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-dist.png&amp;diff=30450</id>
		<title>File:CS-gof-LOMAP-2025050731-dist.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731-dist.png&amp;diff=30450"/>
		<updated>2025-09-26T16:32:49Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731.png&amp;diff=30449</id>
		<title>File:CS-gof-LOMAP-2025050731.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-2025050731.png&amp;diff=30449"/>
		<updated>2025-09-26T16:32:12Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Loma_Prieta_Validation&amp;diff=30448</id>
		<title>Loma Prieta Validation</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Loma_Prieta_Validation&amp;diff=30448"/>
		<updated>2025-09-26T16:28:43Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;As part of the Study 24.8 validation process, we performed runs for the Loma Prieta event with both the BBP and CyberShake.&lt;br /&gt;
&lt;br /&gt;
== BBP 1D ==&lt;br /&gt;
&lt;br /&gt;
{| &lt;br /&gt;
! Overall GoF&lt;br /&gt;
! GoF, best realization (#7)&lt;br /&gt;
! Residuals by distance (#7)&lt;br /&gt;
! Residual map (#7)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-gof-LOMAP-20250424-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-gof-LOMAP-2025042407.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-gof-LOMAP-2025042407-dist.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-gof-LOMAP-2025042407-map.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== CyberShake 3D ==&lt;br /&gt;
&lt;br /&gt;
{| &lt;br /&gt;
! Overall GoF&lt;br /&gt;
! GoF, best realization (#31)&lt;br /&gt;
! Residuals by distance (#31)&lt;br /&gt;
! Residual map (#31)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-gof-LOMAP-20250507-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-gof-LOMAP-2025050731.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-gof-LOMAP-2025050731-dist.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-gof-LOMAP-2025050731-map.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== GMPEs ==&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:GMPE-gof-LOMAP.png|thumb|400px]]&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-20250507-combined.png&amp;diff=30447</id>
		<title>File:CS-gof-LOMAP-20250507-combined.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:CS-gof-LOMAP-20250507-combined.png&amp;diff=30447"/>
		<updated>2025-09-26T16:27:55Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: Scottcal uploaded a new version of File:CS-gof-LOMAP-20250507-combined.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30446</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30446"/>
		<updated>2025-09-26T06:31:55Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2025_IHPCSS_workflows.pptx | PPTX]], [[:File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[:File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[:File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx |  HPC tutorial (PPTX)]]  [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]  [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:2025_IHPCSS_workflows.pdf&amp;diff=30445</id>
		<title>File:2025 IHPCSS workflows.pdf</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:2025_IHPCSS_workflows.pdf&amp;diff=30445"/>
		<updated>2025-09-26T06:31:23Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:2025_IHPCSS_workflows.pptx&amp;diff=30444</id>
		<title>File:2025 IHPCSS workflows.pptx</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:2025_IHPCSS_workflows.pptx&amp;diff=30444"/>
		<updated>2025-09-26T06:31:12Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:2025_USGS_NorCal_CyberShake.pdf&amp;diff=30443</id>
		<title>File:2025 USGS NorCal CyberShake.pdf</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:2025_USGS_NorCal_CyberShake.pdf&amp;diff=30443"/>
		<updated>2025-09-26T06:29:01Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:2025_USGS_NorCal_CyberShake.pptx&amp;diff=30442</id>
		<title>File:2025 USGS NorCal CyberShake.pptx</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:2025_USGS_NorCal_CyberShake.pptx&amp;diff=30442"/>
		<updated>2025-09-26T06:28:16Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30441</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30441"/>
		<updated>2025-09-26T06:27:52Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*IHPCSS workflow presentation: [[File:2025_IHPCSS_workflows.pptx | PPTX]], [[File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx |  HPC tutorial (PPTX)]]  [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]  [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30440</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30440"/>
		<updated>2025-09-26T06:25:43Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2025_IHPCSS_workflows.pptx | PPTX]], [[:File:2025_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation: [[:File:2025_USGS_NorCal_CyberShake.pptx | PPTX]], [[:File:2025_USGS_NorCal_CyberShake.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx | HPC tutorial (PPTX)]], [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]], and [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30439</id>
		<title>Callaghan Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Callaghan_Presentations&amp;diff=30439"/>
		<updated>2025-09-26T06:04:02Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to presentations and related resources given by Scott Callaghan.&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
*IHPCSS workflow presentation:&lt;br /&gt;
*USGS NorCal Earthquake Hazards workshop CyberShake presentation:&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
*AGU24 and December staff meeting presentation: [[:File:AGU24_CyberShake_24_8_presentation.pptx | PPTX]], [[:File:AGU24_CyberShake_24_8_presentation.pdf | PDF]]&lt;br /&gt;
*SC24 presentation at OSU booth: [[:File:SC24_OSU_MPI_compression.pptx | PPTX]], [[:File:SC24_OSU_MPI_compression.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake Study 24.8 overview: [[:File:Study_24.8_overview_for_NGAW3.odp | ODP]], [[:File:Study_24.8_overview_for_NGAW3.pdf | PDF]]&lt;br /&gt;
*Geo-INQUIRE Data Lake workshop CyberShake presentation: [[:File:CyberShake_Data_Lake_workshop.pptx | PPTX]], [[:File:CyberShake_Data_Lake_workshop.pdf | PDF]]&lt;br /&gt;
*IHPCSS Workflow talk: [[:File:2024_IHPCSS_workflows.pptx | PPTX]], [[:File:2024_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2023 ==&lt;br /&gt;
*December Staff Meeting CyberShake updates: [[:File:Dec_Staffmtg_CyberShake_update.pptx | PPTX]], [[:File:Dec_Staffmtg_CyberShake_update.pdf | PDF]]&lt;br /&gt;
*NGA-West3 CyberShake presentation: [[:File:2023_NGA_West3.pptx | PPTX]], [[:File:2023_NGA_West3.pdf | PDF]]&lt;br /&gt;
*SC23 early career talk: [[:File:SC23_ECP_career_talk.pptx | PPTX]], [[:File:SC23_ECP_career_talk.pdf | PDF]]&lt;br /&gt;
*IHPCSS seismology presentation: [[:File:2023_IHPCSS_seismology.pptx | PPTX]], [[:File:2023_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
*IHPCSS workflow presentation: [[:File:2023_IHPCSS_workflows.pptx | PPTX]], [[:File:2023_IHPCSS_workflows.pdf | PDF]]&lt;br /&gt;
*GC11 conference (Solid Earth and Geohazards in the Exascale Era): [[:File:GC11_Callaghan_workflows.pptx | PPTX]]&lt;br /&gt;
*CyberTraining for Seismology talk on CyberShake Data Access tool: [[:File:CyberShake_tutorial_for_2023_CyberTraining.pptx | PPTX]]&lt;br /&gt;
*SSA CyberShake Study 22.12 talk: [[:File:2023_SSA_CyberShake_22_12.pptx | PPTX]], [[:File:2023_SSA_CyberShake_22_12.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2022 ==&lt;br /&gt;
&lt;br /&gt;
*SC22 Early Career talk (also given to ECP Work/Life balance group, and Sandia Parents Group): [[:File:2022_parental_balance.ppt | PPT]], [[:File:2022_parental_balance.pdf | PDF]]&lt;br /&gt;
*SC22 SIGHPC Education Chapter overview: [[:File:SC22_SIGHPC_Edu_overview.pptx | PPTX]], [[:File:SC22_SIGHPC_Edu_overview.pdf | PDF]]&lt;br /&gt;
*SOURCES career talk: [[:File:2022_Sources_career_talk.odp | ODP ]]&lt;br /&gt;
*IHPCSS workflow talk: [[:File:2022_IHPCSS_talk.pptx | PPTX]]&lt;br /&gt;
*SSA Broadband CyberShake Validation talk: [[:File:2022_SSA_Broadband_CyberShake.pptx | PPTX]], [[:File:2022_SSA_Broadband_CyberShake.pdf | PDF]]&lt;br /&gt;
*SSA CyberShake Study 21.12 talk: [[:File:2022_SSA_CyberShake_21_12.pptx | PPTX]], [[:File:2022_SSA_CyberShake_21_12.pdf | PDF]]&lt;br /&gt;
*SCEC staff meeting talk: [[:File:Feb_2022_staff_meeting.pptx | PPTX]], [[:File:Feb_2022_staff_meeting.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2021 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2021_CyberShake.pptx | PPTX full]], [[:File:AGU_2021_CyberShake_lighting.pptx | PPTX lightning]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2021_IHPCSS_workflow.pptx | PPTX]], [[:File:2021_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2020 ==&lt;br /&gt;
* Polytechnic talk resources:  [[Poly 2020 outreach discussion]]&lt;br /&gt;
* AGU CyberShake talk: [[File:AGU_2020_CyberShake.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2019 ==&lt;br /&gt;
* AGU CyberShake talk: [[:File:AGU_2019_CyberShake.pptx | PPTX]]&lt;br /&gt;
* SC19 USC booth talk: [[:File:SC19_Callaghan_USC_booth.pptx | PPTX]] or [[:File:SC19_Callaghan_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SCEC Research Computing workshop lightning talk: [[:File:2019_SCEC_Research_Computing_CyberShake_lightning.pdf | PDF]]&lt;br /&gt;
* IHPCSS Workflow talk: [[:File:2019_IHPCSS_workflow.pptx | PPTX]]&lt;br /&gt;
* UseIT talk about HPC at SCEC: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202019.pptx slides (PPTX), external link]&lt;br /&gt;
* SSA CyberShake science talk: [[:File:2019_SSA_CyberShake_Science_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Science_Presentation.pdf | PDF]].  Here are links to the [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/point_src_gtl_v2.wmv 10 km smoothing movie] and [http://hypocenter.usc.edu/research/cybershake/study_18_5/fwd_sims/pt_src_gtl_20km_v2.wmv 20 km smoothing movie]&lt;br /&gt;
* SSA CyberShake technical talk: [[:File:2019_SSA_CyberShake_Technical_Presentation.pptx | PPTX]] or [[:File:2019_SSA_CyberShake_Technical_Presentation.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2018 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2018_IHPCSS_workflow.pptx | PPTX]] or [[:File:2018_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* Blue Waters Symposium talk: [[:File:Callaghan_Blue_Waters_Symposium.pptx | PPTX]]&lt;br /&gt;
* QuakeCore GMS&amp;amp;V talk: [[:File:CyberShake_QuakeCore_Presentation.pptx | PPTX]] or [[:File:CyberShake_QuakeCore_Presentation.pdf | PDF]]&lt;br /&gt;
* Machine learning with Keras overview: [[:File:Machine_Learning_with_Keras.odp | ODP]]&lt;br /&gt;
&lt;br /&gt;
== 2017 ==&lt;br /&gt;
&lt;br /&gt;
* AGU CyberShake talk: [[:File:2017_AGU_CyberShake.pptx | PPTX]]&lt;br /&gt;
* PG&amp;amp;E CyberShake update: [[:File:PGE_CyberShake_update.pptx | PPTX]] or [[:File:PGE_CyberShake_update.pdf | PDF]]&lt;br /&gt;
* SC17 USC booth talk: [[:File:SC17_USC_booth.pptx | PPTX]] or [[:File:SC17_USC_Booth.pdf | PDF]]&lt;br /&gt;
* SC17 WORKS'17 talk on rvGAHP: [[:File:WORKS17_rvGAHP.pptx | PPTX]] or [[:File:WORKS17_rvGAHP.pdf | PDF]]&lt;br /&gt;
* SC17 Women in HPC Mentoring talk: [[:File:2017_WHPC_Workshop.pptx | PPTX]] or [[:File:2017_WHPC_Workshop.pdf | PDF]]&lt;br /&gt;
* SCEC Annual Meeting plenary CyberShake presentation: [[:File:2017_SCEC_AM_CyberShake.pptx | PPTX]] or [[:File:2017_SCEC_AM_CyberShake.pdf | PDF]]&lt;br /&gt;
* SCEC Nonlinear Workshop presentation: [[:File:2017_SCEC_Nonlinear_Workshop.pptx | PPTX]] or [[:File:2017_SCEC_Nonlinear_Workshop.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2017_IHPCSS_workflow.pptx | PPTX]] or [[:File:2017_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT Blue Waters tutorial: [[:File:2017_UseIT_HPC_tutorial.odt | tutorial text (ODT)]] and [[:File:2017_UseIT_Linux_commands.doc | Linux Command guide (DOC)]]&lt;br /&gt;
* UseIT HPC at SCEC talk: [http://hypocenter.usc.edu/research/presentations/SCEC%20HPC%202017.pptx slides (PPTX), external link] and [[:File:2017_UseIT_HPC_spreadsheet.xlsx | supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* SSA CyberShake talk: [[:File:Callaghan_2017_SSA_CyberShake.pptx | PPTX]] or [[:File:Callaghan_2017_SSA_CyberShake.pdf | PDF]]&lt;br /&gt;
* Blue Waters workflow seminar: [[:File:Blue_Waters_Workflow_Seminar_Overview.pptx | PPTX]] or [[:File:Blue_Waters_Workflow_Seminar_Overview.pdf | PDF]]&lt;br /&gt;
&lt;br /&gt;
== 2016 ==&lt;br /&gt;
&lt;br /&gt;
* SC16 USC booth talk: [[:File:SC16_RSQSim_UseIT_USC_booth.pdf | PDF]] or [[:File:SC16_RSQSim_UseIT_USC_booth.odp | ODP]]&lt;br /&gt;
* SCEC Annual Meeting: [[:File:SCEC_2016_AM_CyberShake_CISM.pptx | PPTX]] or [[:File:SCEC_2016_AM_CyberShake_CISM.pdf | PDF]]&lt;br /&gt;
* XSEDE Workflow overview talk: [[File:2016_Callaghan_overview_of_workflows.pptx | PPTX]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2016_IHPCSS_workflow.pptx | PPTX]] or [[:File:2016_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC.pptx | HPC talk (PPTX)]] and [[:File:2016_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]]&lt;br /&gt;
* UseIT [[:File:2016_UseIT_HPC_tutorial.odt | HPC tutorial (ODT)]] and [[:File:2016_UseIT_Linux_commands.odt | Sample Linux Commands (ODT)]]&lt;br /&gt;
* CyberShake [[:File:2016_CCSP.odp | CCSP presentation (ODP)]]&lt;br /&gt;
* CyberShake [[:File:2016_UCERF3_downsampling.odp | UCERF3 downsampling presentation (ODP)]]&lt;br /&gt;
&lt;br /&gt;
== 2015 ==&lt;br /&gt;
&lt;br /&gt;
* SC15 USC booth talk:  [[:File:SC15_CyberShake_USC_booth.pptx | PPTX]]&lt;br /&gt;
&lt;br /&gt;
== 2014 ==&lt;br /&gt;
&lt;br /&gt;
* IHPCSS seismology talk: [[:File:2014_IHPCSS_seismology.pptx | PPTX]] or [[:File:2014_IHPCSS_seismology.pdf | PDF]]&lt;br /&gt;
* IHPCSS workflow talk: [[:File:2014_IHPCSS_workflow.pptx | PPTX]] or [[:File:2014_IHPCSS_workflow.pdf | PDF]]&lt;br /&gt;
* UseIT [[:File:2014_UseIT_HPC.pptx | HPC tutorial (PPTX)]], [[:File:2014_UseIT_HPC_spreadsheet.xlsx | Supplemental spreadsheet (XLSX)]], and [[:File:2014_UseIT_HPC_matrix_mult.docx | Supplemental matrix multiplication (DOCX)]]&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
*[[SC16]]&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Broadband_CyberShake_Validation&amp;diff=30435</id>
		<title>Broadband CyberShake Validation</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Broadband_CyberShake_Validation&amp;diff=30435"/>
		<updated>2025-09-16T00:12:02Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Multi-fault GMPEs */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page follows on [[CyberShake BBP Verification]], moving from 1D comparisons to 3D CyberShake comparisons with both the BBP and observations.&lt;br /&gt;
&lt;br /&gt;
== Process ==&lt;br /&gt;
&lt;br /&gt;
These are the steps involved in comparing CyberShake and BBP results to observations for a verification event.&lt;br /&gt;
&lt;br /&gt;
=== BBP ===&lt;br /&gt;
&lt;br /&gt;
#Create a working directory.&lt;br /&gt;
#Create an src_files directory inside it.&lt;br /&gt;
#Copy the src file from the BBP validation directory into the src_files directory.&lt;br /&gt;
#Copy the station list into the working directory.&lt;br /&gt;
#Run run_bbp.py with the '--expert -g' arguments to create an XML file description of the run.  Instead of using the defaults for the validation event, point to the station list file in the working directory and the src file in the src_files directory.  Include the GP GOF, but not FAS.&lt;br /&gt;
#Copy the created XML file into the working directory.&lt;br /&gt;
#Run helper_scripts/create_srcs.sh, which will create 64 random seeds and then create 64 realizations by copying over the initial src file and changing the random seed.&lt;br /&gt;
#Create 64 directories, one for each realization.&lt;br /&gt;
#Run the helper_scripts/prepare_lsf_files.sh script.  This will copy both the XML file and the run_bbp_src.lsf script into each realization directory, and make the needed changes to each for that realization.&lt;br /&gt;
#Submit all 64 src_*/run_bbp_src*.lsf files.&lt;br /&gt;
#Once all 64 run successfully, make a purple_plot/Sims directory, and logs, indata, tmpdata, and outdata directories inside it.&lt;br /&gt;
#Edit stage_purple_plot.sh to have the right sim_id.&lt;br /&gt;
#Run stage_purple_plot.sh.&lt;br /&gt;
#Run utils/batch/combine_gof_gen.py -d purple_plot -o . -c gp&lt;br /&gt;
#Examine the .png file produced to make sure it looks OK.&lt;br /&gt;
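The realization-generation step above can be sketched in Python.  This is a minimal stand-in for helper_scripts/create_srcs.sh, not the script itself; in particular, the 'SEED = n' line format assumed for the .src file is a guess and should be matched to the real files.&lt;br /&gt;

```python
import random
import re
from pathlib import Path

def create_realizations(base_src, out_dir, n=64):
    """Copy a base .src file n times, replacing the random seed each time.

    Sketch of what helper_scripts/create_srcs.sh is described as doing;
    the 'SEED = <value>' line format is an assumption.
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    text = Path(base_src).read_text()
    seeds = []
    for i in range(n):
        seed = random.randint(1, 2**31 - 1)
        seeds.append(seed)
        # Swap in the new seed on the SEED line of the src file
        new_text = re.sub(r"(?m)^SEED\s*=.*$", "SEED = %d" % seed, text)
        (out / ("src_%d.src" % i)).write_text(new_text)
    # Record the seeds so CyberShake can reuse them later (random_seeds.txt)
    (out / "random_seeds.txt").write_text("\n".join(str(s) for s in seeds) + "\n")
    return seeds
```
The saved random_seeds.txt mirrors the file that the CyberShake setup steps later copy over, so both codes draw the same seeds.&lt;br /&gt;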
&lt;br /&gt;
=== CyberShake ===&lt;br /&gt;
&lt;br /&gt;
==== Set up and run BBP validation event ====&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;b&amp;gt;Setup&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create working directory on local system.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy in the following scripts:&lt;br /&gt;
*insert_sites_from_list.py&lt;br /&gt;
*pop_seeds.py&lt;br /&gt;
*update_dist.py&lt;br /&gt;
*pop_rvs.py&lt;br /&gt;
*pop_site_ruptures.py&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;&amp;lt;br/&amp;gt;Create the sites.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy over *.stl list from BBP validation event.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy stl list into a bbp_to_cs version of the stl list.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run the script  ./check_for_duplicate_stations.py to see if any stations used by this event are already in the CyberShake site list.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;For the stations not in the CyberShake site list, determine their full name from https://www.strongmotioncenter.org/wserv/stations/builder/ .  If no full name exists, look at Google Maps and create a suitable name.  Add the abbreviation to the bbp_to_cs file, and add a full entry (lon, lat, short name, long name, 3) for each new site into a sites_to_insert.txt file.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Check each new site short name to make sure one doesn’t already exist in the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run insert_sites_from_list.py on the sites_to_insert.txt file.&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;&amp;lt;br/&amp;gt;Set up the rupture variations remotely.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;On the remote system, create a directory inside the Ruptures_erf60 directory corresponding to the source_id of the run.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Inside the source_id directory, create one directory for each realization, which will each be its own rupture.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the realizations from the BBP tmpdirs in which the realizations were created, into the corresponding source directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a symlink to each SRF with the name &amp;lt;src_id&amp;gt;_&amp;lt;rup_id&amp;gt;_event&amp;lt;rup_id&amp;gt;.srf .&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create rupture geometries, &amp;lt;src_id&amp;gt;_&amp;lt;rup_id&amp;gt;.txt, for each file.  Note that each of these rupture geometries will be slightly different due to the subfaults shifting.  To create the rupture geometry files, run RuptureCodes/utils/extract_rup_geom on each rupture.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine the distances from each station to each rupture variation by running calc_distance.py for each variation on a site file in lon, lat, short name format.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the rupture geometry files and the distance files back locally.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the random_seeds.txt file from the src_files directory of the BBP validation run to the local system.&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;&amp;lt;br/&amp;gt;Insert rupture variations into database.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pop_rvs.py, pointing to the rupture geometry files, to insert the rupture variations into the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pop_site_ruptures.py, pointing to a site list file, the distance files, and rupture geometry files.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pop_seeds.py on the random_seeds.txt file to populate the seeds into the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a new RLS file on shock with the LFNs for both the rupture geometries and the SRFs.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pegasus-rc-client to insert the LFNs into the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
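The SRF symlink-naming convention above can be scripted.  This sketch assumes a mapping from rupture ID to the SRF file produced for that realization, following the directory layout described in the steps; it is illustrative, not an existing CyberShake utility.&lt;br /&gt;

```python
import os

def link_srfs(source_dir, src_id, srf_by_rup):
    """Create src_id/rup_id directories and the
    src_id_rup_id_event-rup_id.srf symlinks described above.

    srf_by_rup maps each rupture ID to the SRF produced for that
    realization; the layout here follows the wiki steps and is an
    assumption, not a fixed API.
    """
    for rup_id, srf_path in srf_by_rup.items():
        rup_dir = os.path.join(source_dir, str(rup_id))
        os.makedirs(rup_dir, exist_ok=True)
        link = os.path.join(rup_dir, "%d_%d_event%d.srf" % (src_id, rup_id, rup_id))
        if not os.path.lexists(link):
            # Point the canonical name at the copied-in realization SRF
            os.symlink(os.path.abspath(srf_path), link)
        yield link
```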
&lt;br /&gt;
==== Perform GoF analysis on results ====&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a working directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the src_files directory from the Broadband validation run directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy in the *.stl file from the BBP for this validation event.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create site_list.txt, with one CyberShake site per line.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;From another CyberShake validation directory, copy in the following scripts:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;import.lsf&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;prep.lsf&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;prep_stat.sh&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;rename.py&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;run_gof.sh&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;stage_purple_plot.sh&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;run_cs_src0.lsf&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a .tgz file on shock with the seismograms for all the stations and all the events.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Unzip this .tgz file into the working directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Submit prep.lsf.  This will copy the seismograms from the station directories into the src directories, and then process them into BBP-friendly format.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit rename.py to point to the BB-to-CS STL file, and then run it to rename the seismograms to include the full BBP site names.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit import.lsf to use the correct Run ID prefix, src_id, directory name, and BBP STL file, then submit it.  This script will import the seismograms into BBP working directories, so that the BBP can run GoF on them.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run the BBP to create an XML file which includes running the GoF.  To do this, run &amp;lt;pre&amp;gt;run_bbp.py --expert -g&amp;lt;/pre&amp;gt; When prompted, select&lt;br /&gt;
&amp;lt;ul&amp;gt;&amp;lt;li&amp;gt;'validation simulation'&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;the validation simulation created in the first part from the list&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;'GP'&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes custom source&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;point to BBP-provided source installed in src_files directory&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes rupture generator&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;custom list&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;point to list in the working directory&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes site response&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes velocity seismogram plots&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes acceleration seismogram plots&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;no GMPE comparisons&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes goodness-of-fit&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;'GP'&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;no additional metrics&amp;lt;/li&amp;gt;&amp;lt;/ul&amp;gt;&lt;br /&gt;
Copy the created XML file into the working directory.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit run_cs_src0.lsf and set SIM_ID_PREFIX, DIR_NAME, and XML_PREFIX to the correct values for this event.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit run_gof.sh to set DIR_NAME and XML_PREFIX to the same values, and modify the paths to the *.src and *.srf files used in the sed commands.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run run_gof.sh, which will submit a job to calculate GoF for each of the 64 realizations.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create the directory purple_plot/Sims.  Inside there, create 4 directories: logs, indata, tmpdata, and outdata.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit stage_purple_plot.sh to use the correct sim id.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run stage_purple_plot.sh.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create the directory purple_plot_output.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run bbp/utils/batch/combine_gof_gen.py -d purple_plot -o purple_plot_output -c gp to produce a purple plot, combining the GoF results from all stations and events.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run bbp/utils/batch/compare_bbp_runs.py -d purple_plot -o purple_plot_output/ -c gp to generate the bias values.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
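The purple_plot staging in the steps above reduces to a small directory scaffold, sketched here with paths relative to the working directory:&lt;br /&gt;

```python
from pathlib import Path

def stage_purple_plot_dirs(workdir):
    """Create purple_plot/Sims with its logs, indata, tmpdata, and outdata
    subdirectories, plus purple_plot_output, as described in the GoF steps."""
    sims = Path(workdir) / "purple_plot" / "Sims"
    for sub in ("logs", "indata", "tmpdata", "outdata"):
        (sims / sub).mkdir(parents=True, exist_ok=True)
    (Path(workdir) / "purple_plot_output").mkdir(exist_ok=True)
    return sims
```
With the scaffold in place, stage_purple_plot.sh and the combine_gof_gen.py / compare_bbp_runs.py commands above can run unchanged.&lt;br /&gt;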
&lt;br /&gt;
=== Multi-fault GMPEs ===&lt;br /&gt;
&lt;br /&gt;
The BBP, as of v22.4, isn't set up to produce realistic GMPE estimates for multi-fault ruptures.  Therefore, producing meaningful GMPE comparisons requires generating the GMPE estimates outside of the BBP and then importing them.  The steps for this are detailed below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ask someone (like Kevin) to produce GMPE estimates at a variety of periods for the event.  The station list and event information must be provided.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;This information is delivered in a series of CSV files, one per period.  Convert the CSV files into .ri50 files.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the .ri50 files into a working directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;To get the observation files in *.rd50 format, you can either copy them in from a previous run, or you can generate them.  To generate them, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;$&amp;gt; mkdir $BBP_DATA_DIR/logs/&amp;lt;arbitrary run ID&amp;gt;&lt;br /&gt;
$&amp;gt; $BBP_DIR/comps/obs_seismograms.py $BBP_VAL_DIR/path/to/site_list.stl $BBP_VAL_DIR/path/to/observations/Acc acc_peer &amp;quot;&amp;quot; &amp;lt;arbitrary run ID&amp;gt;&lt;br /&gt;
$&amp;gt; cp -r $BBP_DATA_DIR/&amp;lt;arbitrary run id&amp;gt;/obs_seis_&amp;lt;event&amp;gt; .&amp;lt;/pre&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Next, run gmpe_gof.py to generate the comparison metrics:&lt;br /&gt;
&amp;lt;pre&amp;gt;$GMSVTOOLKIT_DIR/stats/gmpe_gof.py --gmpe-dir . --comp-dir obs_seis_&amp;lt;event&amp;gt; --station-list /path/to/site_list.stl --src-file /path/to/first/segment/src_file.src --comp-label &amp;lt;label&amp;gt; --gmpe-group nga-west2&amp;lt;/pre&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Finally, plot the results using plot_gmpe_gof.py:&lt;br /&gt;
&amp;lt;pre&amp;gt;$GMSVTOOLKIT_DIR/plots/plot_gmpe_gof.py --input-dir . --output-dir . --comp-label &amp;lt;label&amp;gt; --gmpe-group nga-west2 --station-list /path/to/site_list.stl&amp;lt;/pre&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
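The CSV-to-.ri50 conversion above can be sketched as a pivot from per-period files to per-station files.  Both the 'station,value' CSV layout and the simple 'period value' output lines assumed here are guesses; match them to the actual delivered CSVs and to the .ri50 format the GoF tools expect.&lt;br /&gt;

```python
import csv
from collections import defaultdict
from pathlib import Path

def csvs_to_ri50(csv_by_period, out_dir):
    """Pivot per-period GMPE CSVs into per-station .ri50 files.

    csv_by_period maps a period (float, seconds) to a CSV path with
    'station,value' rows; both that layout and the 'period value'
    output lines written here are assumptions.
    """
    per_station = defaultdict(list)
    for period in sorted(csv_by_period):
        with open(csv_by_period[period], newline="") as fp:
            for row in csv.DictReader(fp):
                per_station[row["station"]].append((period, float(row["value"])))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for station, rows in per_station.items():
        lines = ["%.5f %.6e" % (p, v) for p, v in rows]
        (out / (station + ".ri50")).write_text("\n".join(lines) + "\n")
    return sorted(per_station)
```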
&lt;br /&gt;
== Northridge ==&lt;br /&gt;
&lt;br /&gt;
=== 1D BBP comparisons ===&lt;br /&gt;
&lt;br /&gt;
We calculated 64 realizations for Northridge for these [[Media:northridge_stations.kml | 38 stations]].&lt;br /&gt;
&lt;br /&gt;
Some of the differences between the 1D BBP and 3D CyberShake results can be attributed to differences in site response, which is calculated based on the reference velocity ('vref') and the vs30 of the site.  A spreadsheet with vref and Vs30 for both the BBP and CyberShake is available [[File:BBP_v_CyberShake_Northridge_vs30.xls | here.]]&lt;br /&gt;
&lt;br /&gt;
==== V1 (2/14/22) ====&lt;br /&gt;
&lt;br /&gt;
Initially, we used vref=500 for both the high-frequency and low-frequency site response.  However, this is incorrect; vref for the low-frequency should vary depending on the properties of the 3D velocity mesh.&lt;br /&gt;
&lt;br /&gt;
==== V2 (2/25/22) ====&lt;br /&gt;
&lt;br /&gt;
We continued to use vref=500 for the high-frequency site response.  For the low-frequency site response, we are now using the same vref we used in Study 15.12:&lt;br /&gt;
&lt;br /&gt;
vref_LF_eff = Vs30 * [ VsD5H / Vs5H ]&lt;br /&gt;
&lt;br /&gt;
Vs30 = 30 / Sum (1/(Vs sampled from [0.5,29.5] in 1 meter increments))&lt;br /&gt;
&lt;br /&gt;
H = grid spacing&lt;br /&gt;
&lt;br /&gt;
Vs5H = travel time averaged Vs, computed from the CVM in 1 meter increments down to a depth of 5*H&lt;br /&gt;
&lt;br /&gt;
VsD5H = discrete travel time averaged Vs computed from 3D velocity mesh used in the SGT calculation over the upper 5 grid points&lt;br /&gt;
&lt;br /&gt;
So, for H=100 m, Vs5H would be:&lt;br /&gt;
&lt;br /&gt;
Vs500 = 500 / ( Sum ( 1 / Vs sampled from [0.5,499.5] in 1 meter increments ))&lt;br /&gt;
&lt;br /&gt;
And then VsD5H is given by:&lt;br /&gt;
&lt;br /&gt;
VsD500 = 5/{ 0.5/Vs(Z=0m) + 1/Vs(Z=100m) + 1/Vs(Z=200m) + 1/Vs(Z=300m) + 1/Vs(Z=400m) + 0.5/Vs(Z=500m) } &lt;br /&gt;
&lt;br /&gt;
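As a concrete check of these definitions, the vref calculation can be sketched in Python.  This is an illustration only: vs_at(z) is a stand-in for sampling the velocity model at depth z (in meters), and h is the grid spacing H in meters.&lt;br /&gt;

```python
def vs30(vs_at):
    # 30 m over the summed slownesses sampled at [0.5, 29.5] m in 1 m steps
    return 30.0 / sum(1.0 / vs_at(z + 0.5) for z in range(30))

def vs5h(vs_at, h):
    # Travel-time-averaged Vs from the CVM in 1 m steps down to 5*H
    n = int(5 * h)
    return n / sum(1.0 / vs_at(z + 0.5) for z in range(n))

def vsd5h(vs_at, h):
    # Discrete (trapezoidal) travel-time average over the upper 5 grid points,
    # with half weights at Z=0 and Z=5H, matching the VsD500 formula above
    slowness = 0.5 / vs_at(0.0) + 0.5 / vs_at(5.0 * h)
    slowness += sum(1.0 / vs_at(i * h) for i in range(1, 5))
    return 5.0 / slowness

def vref_lf_eff(vs_at, h):
    # vref_LF_eff = Vs30 * [ VsD5H / Vs5H ]
    return vs30(vs_at) * vsd5h(vs_at, h) / vs5h(vs_at, h)
```

For a constant 500 m/s profile all three averages reduce to 500, so vref_LF_eff equals Vs30, as expected.&lt;br /&gt;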
Below are the ts_process plots for a subset of 10 stations, comparing the 3D CyberShake with 1D BBP results.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! TS Process plot&lt;br /&gt;
|-&lt;br /&gt;
! SCE || [[File:ts_process_SCE_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SYL || [[File:ts_process_SYL_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! LDM || [[File:ts_process_LDM_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PAC || [[File:ts_process_PAC_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PKC || [[File:ts_process_PKC_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SPV || [[File:ts_process_SPV_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! WON || [[File:ts_process_WON_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! KAT || [[File:ts_process_KAT_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! RO3 || [[File:ts_process_RO3_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! ANA || [[File:ts_process_ANA_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== V3 (3/2/22) ====&lt;br /&gt;
&lt;br /&gt;
Next, we recalculated the CyberShake results for 10 sites, using the BBP Vs30 values for both the low-frequency and high-frequency elements.  Note that the low-frequency vref for CyberShake is still derived from the velocity model.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! TS Process plot&lt;br /&gt;
|-&lt;br /&gt;
! SCE || [[File:ts_process_SCE_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SYL || [[File:ts_process_SYL_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! LDM || [[File:ts_process_LDM_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PAC || [[File:ts_process_PAC_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PKC || [[File:ts_process_PKC_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SPV || [[File:ts_process_SPV_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! WON || [[File:ts_process_WON_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! KAT || [[File:ts_process_KAT_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! RO3 || [[File:ts_process_RO3_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! ANA || [[File:ts_process_ANA_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Additionally, here are vertical cross-sections through CVM-S4.26.M01 with 1 km and 10 km depth.  LDM is at the left-hand edge of the plot, and SYL is precisely in the middle.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:LDM-SYL_cross_section_1km.png|thumb|400px|1 km depth]] || [[File:LDM-SYL_cross_section_10km.png|thumb|400px|10 km depth]]&lt;br /&gt;
|} --&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Observational Comparisons ===&lt;br /&gt;
&lt;br /&gt;
We calculated goodness-of-fit results for both Broadband CyberShake and the BBP against observations for Northridge, using the 64 realizations and 38 stations.&lt;br /&gt;
&lt;br /&gt;
TS process plots comparing the 3 results are available here: [https://g-c662a6.a78b8.36fe.data.globus.org/cybershake/Broadband_CyberShake/validation/Northridge/ts_process_nr.tgz v1 ts process plots].&lt;br /&gt;
&lt;br /&gt;
==== Broadband Platform ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#7) !! GoF, worst realization (#15)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-NR-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-NR-real7.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-NR-real15.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_8 ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#53) !! GoF, worst realization (#54)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_real53.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_real54.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Broadband Platform, updated frequency bands ====&lt;br /&gt;
&lt;br /&gt;
We realized that our GoF comparisons were run using hard-coded frequency bands of [0.05, 50] Hz.  This was appropriate for CS-to-BBP comparisons, but not for comparisons against observations.  We updated the frequency bands and reran the GoF.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best BBP realization (#47) !! GoF, worst BBP realization (#25) !! GoF, best CS realization (#40) !! GoF, worst CS realization (#37)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real47.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real25.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real40.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real37.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_8, updated frequency bands ====&lt;br /&gt;
&lt;br /&gt;
As above, we reran the GoF with the updated frequency bands.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#37) !! GoF, best BBP realization (#47) !! GoF, worst BBP realization (#25)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real37.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real47.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real25.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_28, BBP Vs30 ====&lt;br /&gt;
&lt;br /&gt;
We recalculated the CyberShake GoF results, using the BBP Vs30 values.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#51) !! GoF, best BBP realization (#47) !! GoF, worst BBP realization (#25)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real51.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real47.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real25.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== BBP, FAS ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:CyberShake-GoF-NR-BBP_FAS.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_7, FAS ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_7_FAS.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Observational Comparisons, BBP v22.4 ===&lt;br /&gt;
&lt;br /&gt;
We repeated the comparisons, using BBP v22.4 for both the 1D BBP and 3D CyberShake calculations.&lt;br /&gt;
&lt;br /&gt;
==== BBP ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51) !! GoF, best CS realization (#40) !! GoF, worst CS realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real6.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real51.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real40.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#51) !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real51.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, updated velocity model ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#33) !! GoF, worst CS realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-11_28_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-11_28_real33.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-11_28_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Study 22.12 configuration ===&lt;br /&gt;
&lt;br /&gt;
These are the validation results using the Study 22.12 configuration, with risetime_coef=2.3 and hb_high_v6.1.1.&lt;br /&gt;
&lt;br /&gt;
==== BBP (12/9/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#9) !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (12/11/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#9) !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== SRF plots ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Best CS realization (#40)&lt;br /&gt;
| [[File:NR_v14_02_1_40.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! Best BBP realization (#6)&lt;br /&gt;
| [[File:NR_v14_02_1_6.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Chino Hills ==&lt;br /&gt;
&lt;br /&gt;
For our second event, we selected Chino Hills.&lt;br /&gt;
&lt;br /&gt;
=== Sites ===&lt;br /&gt;
&lt;br /&gt;
The BBP validation event for Chino Hills has 40 stations.  A KML file with the stations is available [[:File:chino_hills_stations.kml|here]].&lt;br /&gt;
&lt;br /&gt;
=== BBP ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#35) !! GoF, worst CS realization (#34)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-CH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#34) !! GoF, worst BBP realization (#35)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-CH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP Vs30 values ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#34) !! GoF, worst BBP realization (#35)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4 ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#34) !! GoF, best BBP realization (#24) !! GoF, worst BBP realization (#35)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4, BBP Vs30 ===&lt;br /&gt;
&lt;br /&gt;
== Whittier ==&lt;br /&gt;
&lt;br /&gt;
Our third event is Whittier.&lt;br /&gt;
&lt;br /&gt;
=== Sites ===&lt;br /&gt;
&lt;br /&gt;
The BBP validation event for Whittier has 39 stations, one of which is a duplicate.  A KML file with the 38 stations used in these tests is available [[:File:whittier_stations.kml|here]].&lt;br /&gt;
&lt;br /&gt;
=== BBP ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#15) !! GoF, worst realization (#7) !! GoF, best CS realization (#62) !! GoF, worst CS realization (#3)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052315.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052307.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052362.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052303.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-WH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake ===&lt;br /&gt;
&lt;br /&gt;
These results are incomplete; they omit 2 of the stations (VER and PMN).&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#62) !! GoF, worst realization (#3) !! GoF, best BBP realization (#15) !! GoF, worst BBP realization (#7)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061362.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061303.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061315.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061307.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-WH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP Vs30 values ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-WH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4 ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#62) !! GoF, worst realization (#39) !! GoF, best BBP realization (#15) !! GoF, worst BBP realization (#7)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real62.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real39.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real15.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real7.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4, BBP Vs30 values ===&lt;br /&gt;
&lt;br /&gt;
== Landers ==&lt;br /&gt;
&lt;br /&gt;
=== Single fault realization ===&lt;br /&gt;
&lt;br /&gt;
==== CS (2/15/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#6) !! GoF, best BBP realization (#13)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-Landers-2_15_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-2_15_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-2_15_real13.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== BBP (12/10/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#9) !! GoF, best BBP realization (#13) &lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-Landers-12_10_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-12_10_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-12_10_real13.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (Study 22.12 configuration, 12/12/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#9) !! GoF, worst CS realization (#22) !! GoF, best BBP realization (#13) !! GoF, worst BBP realization (#22)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real22.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real13.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real22.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== GMPE ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:GMPE-GoF-Landers-single.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Multi fault realization ===&lt;br /&gt;
&lt;br /&gt;
==== BBP ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#54) !! GoF, worst CS realization (#11) !! GoF, best BBP realization (#55) !! GoF, worst BBP realization (#11)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real54.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real11.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real55.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real11.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#54) !! GoF, worst CS realization (#11) !! GoF, best BBP realization (#55) !! GoF, worst BBP realization (#11)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real54.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real11.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real55.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real11.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== GMPE ====&lt;br /&gt;
&lt;br /&gt;
Fabio notes that the GMPE codes we have on the BBP are not set up for multi-segment events, and probably only use the first segment in the rupture.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:GMPE-GoF-Landers-multi.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Ridgecrest ==&lt;br /&gt;
&lt;br /&gt;
We looked at both the smaller Ridgecrest (7/4/19, M6.4) 'A' event and the larger (7/6/19, M7.1) multi-fault 'C' event.&lt;br /&gt;
&lt;br /&gt;
=== Ridgecrest A ===&lt;br /&gt;
&lt;br /&gt;
==== BBP (6/19/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#6) !! GoF, worst CS realization (#16) !! GoF, best BBP realization (#27) !! GoF, worst BBP realization (#16)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real06.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real16.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real27.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real16.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (9/29/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#6) !! GoF, worst CS realization (#16) !! GoF, best BBP realization (#27) !! GoF, worst BBP realization (#16)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real06.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real16.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real27.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real16.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== GMPE ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:GMPE-GoF-RidgecrestA.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Ridgecrest C multi-fault ===&lt;br /&gt;
&lt;br /&gt;
==== BBP (10/18/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#32) !! GoF, best BBP realization (#49)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestC-10_18_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestC-10_18_real32.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestC-10_18_real49.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (11/29/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#32) !! GoF, best BBP realization (#49) &lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestC-11_29_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestC-11_29_real32.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestC-11_29_real49.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== North Palm Springs ==&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Broadband_CyberShake_Validation&amp;diff=30434</id>
		<title>Broadband CyberShake Validation</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Broadband_CyberShake_Validation&amp;diff=30434"/>
		<updated>2025-09-15T21:39:26Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: /* Process */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page follows on [[CyberShake BBP Verification]], moving from 1D comparisons to 3D CyberShake comparisons with both the BBP and observations.&lt;br /&gt;
&lt;br /&gt;
== Process ==&lt;br /&gt;
&lt;br /&gt;
These are the steps involved in comparing CyberShake and BBP results to observations for a verification event.&lt;br /&gt;
&lt;br /&gt;
=== BBP ===&lt;br /&gt;
&lt;br /&gt;
#Create a working directory.&lt;br /&gt;
#Create an src_files directory inside it.&lt;br /&gt;
#Copy the src file from the BBP validation directory into the src_files directory.&lt;br /&gt;
#Copy the station list into the working directory.&lt;br /&gt;
#Run run_bbp.py with the '--expert -g' arguments to create an XML file description of the run.  Instead of using the defaults for the validation event, point to the station list file in the working directory and the src file in the src_files directory.  Include the GP GOF, but not FAS.&lt;br /&gt;
#Copy the created XML file into the working directory.&lt;br /&gt;
#Run helper_scripts/create_srcs.sh, which will create 64 random seeds and then create 64 realizations by copying the initial src file and changing the random seed.&lt;br /&gt;
#Create 64 directories, one for each source.&lt;br /&gt;
#Run the helper_scripts/prepare_lsf_files.sh script.  This will copy both the XML file and the run_bbp_src.lsf script into each realization directory, and make the needed changes to each for that realization.&lt;br /&gt;
#Submit all 64 src_*/run_bbp_src*.lsf files.&lt;br /&gt;
#Once all 64 run successfully, make a purple_plot/Sims directory, and logs, indata, tmpdata, and outdata directories inside it.&lt;br /&gt;
#Edit stage_purple_plot.sh to have the right sim_id.&lt;br /&gt;
#Run stage_purple_plot.sh.&lt;br /&gt;
#Run utils/batch/combine_gof_gen.py -d purple_plot -o . -c gp&lt;br /&gt;
#Examine the .png file produced to make sure it looks OK.&lt;br /&gt;
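The batch submission in step 10 can be done with a small shell loop.  This is a sketch: it assumes LSF's bsub and the src_* layout created by the helper scripts; setting SUBMIT=echo gives a dry run.&lt;br /&gt;

```shell
# Submit every realization's LSF script (step 10 above).
# SUBMIT defaults to LSF's bsub; set SUBMIT=echo for a dry run.
SUBMIT="${SUBMIT:-bsub}"
submitted=0
for lsf in src_*/run_bbp_src*.lsf; do
    [ -e "$lsf" ] || continue   # glob matched nothing
    "$SUBMIT" < "$lsf"
    submitted=$((submitted + 1))
done
echo "submitted $submitted jobs"
```

&lt;br /&gt;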
&lt;br /&gt;
=== CyberShake ===&lt;br /&gt;
&lt;br /&gt;
==== Set up and run BBP validation event ====&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;b&amp;gt;Setup&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create working directory on local system.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy in the following scripts:&lt;br /&gt;
*insert_sites_from_list.py&lt;br /&gt;
*pop_seeds.py&lt;br /&gt;
*update_dist.py&lt;br /&gt;
*pop_rvs.py&lt;br /&gt;
*pop_site_ruptures.py&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;&amp;lt;br/&amp;gt;Create the sites.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy over *.stl list from BBP validation event.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy stl list into a bbp_to_cs version of the stl list.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run the script  ./check_for_duplicate_stations.py to see if any stations used by this event are already in the CyberShake site list.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;For the stations not in the CyberShake site list, determine their full name from https://www.strongmotioncenter.org/wserv/stations/builder/ .  If no full name exists, look at Google maps and make one up.  Add the abbreviation to the bbp_to_cs file, and add a full entry (lon, lat, short name, long name, 3) for each new site into a sites_to_insert.txt file.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Check each new site short name to make sure one doesn’t already exist in the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run insert_sites_from_list.py on the sites_to_insert.txt file.&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;&amp;lt;br/&amp;gt;Set up the rupture variations remotely.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;On the remote system, create a directory inside the Ruptures_erf60 directory corresponding to the source_id of the run.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Inside the source_id directory, create one directory for each realization, which will each be its own rupture.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the realizations from the BBP tmpdirs in which the realizations were created, into the corresponding source directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a symlink to each SRF with the name &amp;lt;src_id&amp;gt;_&amp;lt;rup_id&amp;gt;_event&amp;lt;rup_id&amp;gt;.srf .&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create rupture geometries, &amp;lt;src_id&amp;gt;_&amp;lt;rup_id&amp;gt;.txt, for each file.  Note that each of these rupture geometries will be slightly different due to the subfaults shifting.  To create the rupture geometry files, run RuptureCodes/utils/extract_rup_geom on each rupture.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine the distances from each station to each rupture variation by running calc_distance.py for each variation on a site file in lon, lat, short name format.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the rupture geometry files and the distance files back locally.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the random_seeds.txt file from the BBP validation dir/src_files locally.&amp;lt;/li&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;&amp;lt;br/&amp;gt;Insert rupture variations into database.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pop_rvs.py, pointing to the rupture geometry files, to insert the rupture variations into the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pop_site_ruptures.py, pointing to a site list file, the distance files, and rupture geometry files.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pop_seeds.py on the random_seeds.txt file to populate the seeds into the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a new RLS file on shock with the LFNs for both the rupture geometries and the SRFs.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run pegasus-rc-client to insert the LFNs into the DB.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
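The SRF symlinking step above can be sketched as a shell loop.  The demo layout here is hypothetical; in practice the realization directories are populated from the BBP tmpdirs.&lt;br /&gt;

```shell
# Demo layout (hypothetical): one source_id with two realizations.
SRC_ID=128
mkdir -p demo/Ruptures_erf60/$SRC_ID/0 demo/Ruptures_erf60/$SRC_ID/1
touch demo/Ruptures_erf60/$SRC_ID/0/r0.srf demo/Ruptures_erf60/$SRC_ID/1/r1.srf

# Link each realization's SRF as <src_id>_<rup_id>_event<rup_id>.srf .
for rup_dir in demo/Ruptures_erf60/$SRC_ID/*/; do
    rup_id=$(basename "$rup_dir")
    srf=$(ls "$rup_dir"*.srf | head -n 1)
    ln -sf "$(basename "$srf")" "${rup_dir}${SRC_ID}_${rup_id}_event${rup_id}.srf"
done
ls demo/Ruptures_erf60/$SRC_ID/0
```

&lt;br /&gt;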
&lt;br /&gt;
==== Perform GoF analysis on results ====&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a working directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the src_files directory from the Broadband validation run directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy in the *.stl file from the BBP for this validation event.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create site_list.txt, with one CyberShake site per line.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;From another CyberShake validation directory, copy in the following scripts:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;import.lsf&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;prep.lsf&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;prep_stat.sh&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;rename.py&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;run_gof.sh&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;stage_purple_plot.sh&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;run_cs_src0.lsf&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a .tgz file on shock with the seismograms for all the stations and all the events.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Unzip this .tgz file into the working directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Submit prep.lsf.  This will copy the seismograms from the station directories into the src directories, and then process them into BBP-friendly format.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit rename.py to point to the BB-to-CS STL file, and then run it to rename the seismograms to include the full BBP site names.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit import.lsf to use the correct Run ID prefix, src_id, directory name, and BBP STL file, then submit it.  This script will import the seismograms into BBP working directories, so that the BBP can run GoF on them.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run the BBP to create a simulation XML file that includes a GoF stage.  To do this, run &amp;lt;pre&amp;gt;run_bbp.py --expert -g&amp;lt;/pre&amp;gt; When prompted, select&lt;br /&gt;
&amp;lt;ul&amp;gt;&amp;lt;li&amp;gt;'validation simulation'&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;the validation simulation created in the first part, from the list&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;'GP'&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes custom source&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;point to BBP-provided source installed in src_files directory&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes rupture generator&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;custom list&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;point to list in the working directory&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes site response&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes velocity seismogram plots&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes acceleration seismogram plots&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;no GMPE comparisons&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;yes goodness-of-fit&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;'GP'&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;no additional metrics&amp;lt;/li&amp;gt;&amp;lt;/ul&amp;gt;&lt;br /&gt;
Copy the created XML file into the working directory.&lt;br /&gt;
&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit run_cs_src0.lsf and set SIM_ID_PREFIX, DIR_NAME, and XML_PREFIX to the correct values for this event.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit run_gof.sh to set DIR_NAME and XML_PREFIX to the same values, and modify the paths to the *.src and *.srf files used in the sed commands.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run run_gof.sh, which will submit a job to calculate GoF for each of the 64 realizations.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create the directory purple_plot/Sims.  Inside there, create 4 directories: logs, indata, tmpdata, and outdata.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Edit stage_purple_plot.sh to use the correct sim id.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run stage_purple_plot.sh.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create the directory purple_plot_output.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run bbp/utils/batch/combine_gof_gen.py -d purple_plot -o purple_plot_output -c gp to produce a purple plot, combining the GoF results from all stations and events.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Run bbp/utils/batch/compare_bbp_runs.py -d purple_plot -o purple_plot_output/ -c gp to generate the bias values.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Multi-fault GMPEs ===&lt;br /&gt;
&lt;br /&gt;
The BBP, as of v22.4, isn't set up to produce realistic GMPE estimates for multi-fault ruptures.  Therefore, producing meaningful GMPE comparisons requires generating the GMPE estimates outside of the BBP and then importing them.  The steps for this are detailed below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Ask someone (like Kevin) to produce GMPE estimates at a variety of periods for the event.  The station list and event information must be provided.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;This information is delivered in a series of CSV files, one per period.  Convert the CSV files into .ri50 files.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the .ri50 files into a working directory.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;To get the observation files in *.rd50 format, you can either copy them in from a previous run, or you can generate them.  To generate them, run:&lt;br /&gt;
&amp;lt;pre&amp;gt;$&amp;gt; mkdir $BBP_DATA_DIR/logs/&amp;lt;arbitrary run ID&amp;gt;&lt;br /&gt;
$&amp;gt; $BBP_DIR/comps/obs_seismograms.py $BBP_VAL_DIR/path/to/site_list.stl $BBP_VAL_DIR/path/to/observations/Acc acc_peer &amp;quot;&amp;quot; &amp;lt;arbitrary run ID&amp;gt;&lt;br /&gt;
$&amp;gt; cp -r $BBP_DATA_DIR/&amp;lt;arbitrary run id&amp;gt;/obs_seis_&amp;lt;event&amp;gt; .&amp;lt;/pre&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Next, run gmpe_gof.py to generate the comparison metrics:&lt;br /&gt;
&amp;lt;pre&amp;gt;$GMSVTOOLKIT_DIR/stats/gmpe_gof.py --gmpe-dir . --comp-dir obs_seis_&amp;lt;event&amp;gt; --station-list /path/to/site_list.stl --src-file /path/to/first/segment/src_file.src --comp-label &amp;lt;label&amp;gt; --gmpe-group nga-west2&amp;lt;/pre&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Finally, plot the results using plot_gmpe_gof.py:&lt;br /&gt;
&amp;lt;pre&amp;gt;$GMSVTOOLKIT_DIR/plots/plot_gmpe_gof.py --input-dir . --output-dir . --comp-label &amp;lt;label&amp;gt; --gmpe-group nga-west2 --station-list /path/to/site_list.stl&amp;lt;/pre&amp;gt;&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&lt;br /&gt;
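The CSV-to-.ri50 conversion in step 2 above is essentially a pivot from per-period files to per-station files. The sketch below illustrates that pivot under assumed layouts (it is not the actual conversion script): each input CSV is assumed to hold simple `station,value` rows for one period, and each output `&lt;station&gt;.ri50` is assumed to hold `period value` rows. Check an existing .ri50 file for the real column order before relying on this.

```python
import csv
import os

def csv_to_ri50(csv_paths_by_period, out_dir):
    """Pivot per-period GMPE CSVs (assumed 'station,value' rows) into
    per-station .ri50 files (assumed 'period value' rows, sorted by period)."""
    per_station = {}  # station name -> list of (period, value)
    for period, path in sorted(csv_paths_by_period.items()):
        with open(path, newline="") as f:
            for station, value in csv.reader(f):
                per_station.setdefault(station, []).append((period, float(value)))
    os.makedirs(out_dir, exist_ok=True)
    for station, rows in per_station.items():
        with open(os.path.join(out_dir, "%s.ri50" % station), "w") as f:
            for period, value in rows:
                f.write("%10.5f %12.6e\n" % (period, value))
```

Passing a dict such as `{0.5: "psa_0.5s.csv", 1.0: "psa_1.0s.csv"}` would then produce one .ri50 file per station listed in the CSVs.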
== Northridge ==&lt;br /&gt;
&lt;br /&gt;
=== 1D BBP comparisons ===&lt;br /&gt;
&lt;br /&gt;
We calculated 64 realizations for Northridge for these [[Media:northridge_stations.kml | 38 stations]].&lt;br /&gt;
&lt;br /&gt;
Some of the differences between the 1D BBP and 3D CyberShake results can be attributed to differences in site response, which is calculated based on the reference velocity ('vref') and the Vs30 of the site.  A spreadsheet with vref and Vs30 for both the BBP and CyberShake is available [[Media:BBP_v_CyberShake_Northridge_vs30.xls | here]].&lt;br /&gt;
&lt;br /&gt;
==== V1 (2/14/22) ====&lt;br /&gt;
&lt;br /&gt;
Initially, we used vref=500 for both the high-frequency and low-frequency site response.  However, this is incorrect; vref for the low-frequency should vary depending on the properties of the 3D velocity mesh.&lt;br /&gt;
&lt;br /&gt;
==== V2 (2/25/22) ====&lt;br /&gt;
&lt;br /&gt;
We continued to use vref=500 for the high-frequency site response.  For the low-frequency site response, we are now using the same vref we used in Study 15.12:&lt;br /&gt;
&lt;br /&gt;
vref_LF_eff = Vs30 * [ VsD5H / Vs5H ]&lt;br /&gt;
&lt;br /&gt;
Vs30 = 30 / Sum (1/(Vs sampled from [0.5,29.5] in 1 meter increments))&lt;br /&gt;
&lt;br /&gt;
H = grid spacing&lt;br /&gt;
&lt;br /&gt;
Vs5H = travel time averaged Vs, computed from the CVM in 1 meter increments down to a depth of 5*H&lt;br /&gt;
&lt;br /&gt;
VsD5H = discrete travel time averaged Vs computed from 3D velocity mesh used in the SGT calculation over the upper 5 grid points&lt;br /&gt;
&lt;br /&gt;
So, for H=100m, Vs5H would be:&lt;br /&gt;
&lt;br /&gt;
Vs500 = 500 / ( Sum ( 1 / Vs sampled from [0.5,499.5] in 1 meter increments ))&lt;br /&gt;
&lt;br /&gt;
And then VsD5H is given by:&lt;br /&gt;
&lt;br /&gt;
VsD500 = 5/{ 0.5/Vs(Z=0m) + 1/Vs(Z=100m) + 1/Vs(Z=200m) + 1/Vs(Z=300m) + 1/Vs(Z=400m) + 0.5/Vs(Z=500m) } &lt;br /&gt;
&lt;br /&gt;
Below are the ts_process plots for a subset of 10 stations, comparing the 3D CyberShake with 1D BBP results.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! TS Process plot&lt;br /&gt;
|-&lt;br /&gt;
! SCE || [[File:ts_process_SCE_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SYL || [[File:ts_process_SYL_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! LDM || [[File:ts_process_LDM_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PAC || [[File:ts_process_PAC_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PKC || [[File:ts_process_PKC_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SPV || [[File:ts_process_SPV_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! WON || [[File:ts_process_WON_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! KAT || [[File:ts_process_KAT_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! RO3 || [[File:ts_process_RO3_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! ANA || [[File:ts_process_ANA_src0_2_25.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== V3 (3/2/22) ====&lt;br /&gt;
&lt;br /&gt;
Next, we recalculated the CyberShake results for 10 sites, using the BBP vs30 values for both the low-frequency and high-frequency elements.  Note that the vref low-frequency value for CyberShake is still being derived from the velocity model.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Site !! TS Process plot&lt;br /&gt;
|-&lt;br /&gt;
! SCE || [[File:ts_process_SCE_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SYL || [[File:ts_process_SYL_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! LDM || [[File:ts_process_LDM_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PAC || [[File:ts_process_PAC_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! PKC || [[File:ts_process_PKC_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! SPV || [[File:ts_process_SPV_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! WON || [[File:ts_process_WON_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! KAT || [[File:ts_process_KAT_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! RO3 || [[File:ts_process_RO3_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! ANA || [[File:ts_process_ANA_src0_3_2.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&amp;lt;!-- Additionally, here are vertical cross-sections through CVM-S4.26.M01 with 1 km and 10 km depth.  LDM is at the left-hand edge of the plot, and SYL is precisely in the middle.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:LDM-SYL_cross_section_1km.png|thumb|400px|1 km depth]] || [[File:LDM-SYL_cross_section_10km.png|thumb|400px|10 km depth]]&lt;br /&gt;
|} //--&amp;gt;&lt;br /&gt;
&lt;br /&gt;
=== Observational Comparisons ===&lt;br /&gt;
&lt;br /&gt;
We calculated goodness-of-fit results for both Broadband CyberShake and the BBP against observations for Northridge, using the 64 realizations and 38 stations.&lt;br /&gt;
&lt;br /&gt;
TS process plots comparing the 3 results are available here: [https://g-c662a6.a78b8.36fe.data.globus.org/cybershake/Broadband_CyberShake/validation/Northridge/ts_process_nr.tgz v1 ts process plots].&lt;br /&gt;
&lt;br /&gt;
==== Broadband Platform ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#7) !! GoF, worst realization (#15)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-NR-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-NR-real7.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-NR-real15.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_8 ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#53) !! GoF, worst realization (#54)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_real53.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_real54.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== Broadband Platform, updated frequency bands ====&lt;br /&gt;
&lt;br /&gt;
We realized that our GoF comparisons were run using hard-coded frequency bands of [0.05, 50] Hz.  This was correct when doing CS-to-BBP comparisons, but not correct when using observations.  We updated the frequency bands and reran the GoF.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best BBP realization (#47) !! GoF, worst BBP realization (#25) !! GoF, best CS realization (#40) !! GoF, worst CS realization (#37)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real47.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real25.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real40.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-freqbands-GoF-NR-real37.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_8, updated frequency bands ====&lt;br /&gt;
&lt;br /&gt;
As above, we reran the GoF with the updated frequency bands.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#37) !! GoF, best BBP realization (#47) !! GoF, worst BBP realization (#25)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real37.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real47.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_8_freqbands_real25.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_28, BBP Vs30 ====&lt;br /&gt;
&lt;br /&gt;
We recalculated the CyberShake GoF results, using the BBP Vs30 values.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#51) !! GoF, best BBP realization (#47) !! GoF, worst BBP realization (#25)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real51.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real47.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_28_real25.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== BBP, FAS ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:CyberShake-GoF-NR-BBP_FAS.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake v4_7, FAS ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:CyberShake-GoF-NR-4_7_FAS.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Observational Comparisons, BBP v22.4 ===&lt;br /&gt;
&lt;br /&gt;
We repeated the comparisons, using BBP v22.4 for both the 1D BBP and 3D CyberShake calculations.&lt;br /&gt;
&lt;br /&gt;
==== BBP ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51) !! GoF, best CS realization (#40) !! GoF, worst CS realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real6.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real51.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real40.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-v22.4-GoF-NR-real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#51) !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real51.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-9_1_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, updated velocity model ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#33) !! GoF, worst CS realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-11_28_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-11_28_real33.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-11_28_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Study 22.12 configuration ===&lt;br /&gt;
&lt;br /&gt;
These are the validation results using the Study 22.12 configuration, with risetime_coef=2.3 and hb_high_v6.1.1.&lt;br /&gt;
&lt;br /&gt;
==== BBP (12/9/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#9) !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_9_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (12/11/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#40) !! GoF, worst CS realization (#9) !! GoF, best BBP realization (#6) !! GoF, worst BBP realization (#51)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real40.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-NR-12_11_real51.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== SRF plots ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Best CS realization (#40)&lt;br /&gt;
| [[File:NR_v14_02_1_40.png|thumb|400px]]&lt;br /&gt;
|-&lt;br /&gt;
! Best BBP realization (#6)&lt;br /&gt;
| [[File:NR_v14_02_1_6.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Chino Hills ==&lt;br /&gt;
&lt;br /&gt;
For our second event, we selected Chino Hills.&lt;br /&gt;
&lt;br /&gt;
=== Sites ===&lt;br /&gt;
&lt;br /&gt;
The BBP validation event for Chino Hills has 40 stations.  A KML file with the stations is available [[:File:chino_hills_stations.kml|here]].&lt;br /&gt;
&lt;br /&gt;
=== BBP ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#35) !! GoF, worst CS realization (#34)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-CH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#34) !! GoF, worst BBP realization (#35)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-CH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP Vs30 values ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#34) !! GoF, worst BBP realization (#35)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-CH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4 ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#24) !! GoF, worst realization (#34) !! GoF, best BBP realization (#24) !! GoF, worst BBP realization (#35)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real34.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real24.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_2-GoF-CH-real35.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4, BBP Vs30 ===&lt;br /&gt;
&lt;br /&gt;
== Whittier ==&lt;br /&gt;
&lt;br /&gt;
Our third event is Whittier.&lt;br /&gt;
&lt;br /&gt;
=== Sites ===&lt;br /&gt;
&lt;br /&gt;
The BBP validation event for Whittier has 39 stations, one of which is a duplicate.  A KML file with the 38 stations used in these tests is available [[:File:whittier_stations.kml|here]].&lt;br /&gt;
&lt;br /&gt;
=== BBP ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#15) !! GoF, worst realization (#7) !! GoF, best CS realization (#62) !! GoF, worst CS realization (#3)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052315.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052307.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052362.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-WH-2022052303.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-WH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake ===&lt;br /&gt;
&lt;br /&gt;
These results are incomplete: 2 of the stations (VER and PMN) are missing.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#62) !! GoF, worst realization (#3) !! GoF, best BBP realization (#15) !! GoF, worst BBP realization (#7)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061362.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061303.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061315.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-WH-2022061307.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-WH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP Vs30 values ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization !! GoF, worst realization !! GoF, best BBP realization !! GoF, worst BBP realization&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall FAS&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-BBPvs30-GoF-WH-FAS-combined.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4 ===&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best realization (#62) !! GoF, worst realization (#39) !! GoF, best BBP realization (#15) !! GoF, worst BBP realization (#7)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real62.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real39.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real15.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-9_4-GoF-WH-real7.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== CyberShake, BBP v22.4, BBP Vs30 values ===&lt;br /&gt;
&lt;br /&gt;
== Landers ==&lt;br /&gt;
&lt;br /&gt;
=== Single fault realization ===&lt;br /&gt;
&lt;br /&gt;
==== CS (2/15/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#6) !! GoF, best BBP realization (#13)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-Landers-2_15_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-2_15_real6.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-2_15_real13.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== BBP (12/10/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#9) !! GoF, best BBP realization (#13) &lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-Landers-12_10_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-12_10_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-12_10_real13.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (Study 22.12 configuration, 12/12/22) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#9) !! GoF, worst CS realization (#22) !! GoF, best BBP realization (#13) !! GoF, worst BBP realization (#22)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real9.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real22.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real13.png|thumb|400px]]&lt;br /&gt;
| [[File:CS-GoF-Landers-12_12_real22.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== GMPE ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:GMPE-GoF-Landers-single.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Multi fault realization ===&lt;br /&gt;
&lt;br /&gt;
==== BBP ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#54) !! GoF, worst CS realization (#11) !! GoF, best BBP realization (#55) !! GoF, worst BBP realization (#11)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real54.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real11.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real55.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-Landers-9_22_real11.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#54) !! GoF, worst CS realization (#11) !! GoF, best BBP realization (#55) !! GoF, worst BBP realization (#11)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real54.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real11.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real55.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-Landers-8_25_real11.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== GMPE ====&lt;br /&gt;
&lt;br /&gt;
Fabio notes that the GMPE codes we have on the BBP are not set up for multi-segment events, and probably only use the first segment in the rupture.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:GMPE-GoF-Landers-multi.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Ridgecrest ==&lt;br /&gt;
&lt;br /&gt;
We looked at both the smaller Ridgecrest (7/4/19, M6.4) 'A' event and the larger (7/6/19, M7.1) multi-fault 'C' event.&lt;br /&gt;
&lt;br /&gt;
=== Ridgecrest A ===&lt;br /&gt;
&lt;br /&gt;
==== BBP (6/19/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#6) !! GoF, worst CS realization (#16) !! GoF, best BBP realization (#27) !! GoF, worst BBP realization (#16)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real06.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real16.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real27.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestA-6_19_real16.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (9/29/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#6) !! GoF, worst CS realization (#16) !! GoF, best BBP realization (#27) !! GoF, worst BBP realization (#16)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real06.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real16.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real27.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestA-9_29_real16.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== GMPE ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:GMPE-GoF-RidgecrestA.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
=== Ridgecrest C multi-fault ===&lt;br /&gt;
&lt;br /&gt;
==== BBP (10/18/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#32) !! GoF, best BBP realization (#49)&lt;br /&gt;
|-&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestC-10_18_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestC-10_18_real32.png|thumb|400px]]&lt;br /&gt;
| [[File:BBP-GoF-RidgecrestC-10_18_real49.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
==== CyberShake (11/29/23) ====&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
! Overall GoF !! GoF, best CS realization (#32) !! GoF, best BBP realization (#49) &lt;br /&gt;
|-&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestC-11_29_combined.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestC-11_29_real32.png|thumb|400px]]&lt;br /&gt;
| [[File:CyberShake-GoF-RidgecrestC-11_29_real49.png|thumb|400px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== North Palm Springs ==&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_Database_Archiving&amp;diff=30432</id>
		<title>CyberShake Study Database Archiving</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=CyberShake_Study_Database_Archiving&amp;diff=30432"/>
		<updated>2025-09-07T04:39:47Z</updated>

		<summary type="html">&lt;p&gt;Scottcal: Added --skip-add-drop-table option so preexisting tables aren't dropped.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== SQLite Archiving ==&lt;br /&gt;
&lt;br /&gt;
The following procedure should be used to archive a study from the production or data access database to SQLite on disk, to free up room in the database.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;b&amp;gt;First, dump the old database contents into a new directory by running the following commands.&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Note that if you don't want to lock the tables, you can replace '--lock-all-tables' with '--single-transaction=TRUE' in the following commands.&lt;br /&gt;
&lt;br /&gt;
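Since the target is an SQLite archive, the MySQL dump files produced below eventually have to be loaded into SQLite, which rejects MySQL-specific syntax. The sketch below shows a minimal filtering pass; it is not the project's actual tooling, the skip rules are assumptions, and real dumps need more translation than this (e.g. ENGINE clauses, KEY definitions, and AUTO_INCREMENT inside CREATE TABLE statements).

```python
import sqlite3

def load_mysqldump_into_sqlite(dump_path, sqlite_path):
    """Rough filter: drop MySQL session directives so the remaining
    CREATE/INSERT statements can be executed by SQLite."""
    kept = []
    with open(dump_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            s = line.lstrip()
            # LOCK/UNLOCK, conditional /*!...*/ comments, and SET directives
            # are MySQL-only; SQLite has no equivalent and errors on them.
            if s.startswith(("LOCK TABLES", "UNLOCK TABLES", "/*!", "SET ")):
                continue
            kept.append(line.replace("`", '"'))  # backticks -> standard quotes
    conn = sqlite3.connect(sqlite_path)
    try:
        conn.executescript("".join(kept))
        conn.commit()
    finally:
        conn.close()
```

In practice a tool like mysql2sqlite handles the remaining syntax differences; the point here is only that the .sql files below are not directly ingestible by SQLite.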
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Determine the Study ID you want to archive, from looking at the Studies table in the DB:&amp;lt;/li&amp;gt;&lt;br /&gt;
  select * from Studies;&lt;br /&gt;
&amp;lt;li&amp;gt;Dump the PeakAmplitudes, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Run_ID in (select R.Run_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake PeakAmplitudes &amp;gt; peak_amps.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump Hazard_Datasets, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Study_ID=&amp;lt;study id to archive&amp;gt;' CyberShake Hazard_Datasets &amp;gt; hazard_datasets.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump Hazard_Curves, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Hazard_Dataset_ID in (select D.Hazard_Dataset_ID from Hazard_Datasets D where D.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Hazard_Curves &amp;gt; hazard_curves.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump Hazard_Curve_Points, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Hazard_Curve_ID in (select C.Hazard_Curve_ID from Hazard_Curves C, Hazard_Datasets D where C.Hazard_Dataset_ID=D.Hazard_Dataset_ID and D.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Hazard_Curve_Points &amp;gt; hazard_curve_points.sql&lt;br /&gt;
&amp;lt;li&amp;gt;Dump CyberShake_Runs, using:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'Study_ID=&amp;lt;study id to archive&amp;gt;' CyberShake CyberShake_Runs &amp;gt; runs.sql&lt;br /&gt;
&amp;lt;li&amp;gt;You now need the rest of the input tables.  They used to be pretty small, but now they're up to ~70GB with indices.  As of the completion of Study 22.12, here are their approximate sizes (in the DB, not as dump files):&lt;br /&gt;
{| border=&amp;quot;1&amp;quot;&lt;br /&gt;
! Table !! Size (GB)&lt;br /&gt;
|-&lt;br /&gt;
| AR_Hazard_Curve_Points || 44.6&lt;br /&gt;
|-&lt;br /&gt;
| CyberShake_Site_Ruptures || 10.3&lt;br /&gt;
|-&lt;br /&gt;
| Ruptures || 2.3&lt;br /&gt;
|-&lt;br /&gt;
| Rupture_Variations || 2.6&lt;br /&gt;
|-&lt;br /&gt;
| AR_Hazard_Curves || 1.4&lt;br /&gt;
|-&lt;br /&gt;
| Other tables || 0.5&lt;br /&gt;
|}&lt;br /&gt;
You can either:&lt;br /&gt;
&amp;lt;ul&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Dump all of the input tables -- they are small enough that we don't mind capturing data which wasn't directly used in this study.&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysqldump --skip-add-drop-table -u cybershk -p CyberShake AR_Hazard_Curve_Points AR_Hazard_Curves AR_Hazard_Datasets Atten_Rel_Metadata Atten_Rels CyberShake_Site_Regions CyberShake_Site_Ruptures CyberShake_Site_Types CyberShake_Sites ERF_IDs ERF_Metadata ERF_Probability_Models IM_Types Points Rupture_Variation_Probability_Modifier Rupture_Variation_Scenario_IDs Rupture_Variation_Scenario_Metadata Rupture_Variations Ruptures Rup_Var_Seeds SGT_Variation_IDs SGT_Variation_Metadata Studies Time_Spans Velocity_Models &amp;gt; input_tables.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Dump only the sections of the large input tables that you need.  Note that the Points table isn't used anymore, so don't bother to dump it.&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'AR_Hazard_Curve_ID in (select distinct C.AR_Hazard_Curve_ID from AR_Hazard_Curves C where C.AR_Hazard_Dataset_ID in (select distinct D.AR_Hazard_Dataset_ID from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;gt;=(select distinct D.Min_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;lt;=(select distinct D.Max_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;gt;=(select distinct D.Min_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;lt;=(select distinct D.Max_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) )' CyberShake AR_Hazard_Curve_Points &amp;gt; ar_hazard_curve_points.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'CS_Site_ID in (select R.Site_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;) and ERF_ID in (select R.ERF_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake CyberShake_Site_Ruptures &amp;gt; cs_site_ruptures.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'ERF_ID in (select R.ERF_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;) and (Source_ID, Rupture_ID) in (select distinct SR.Source_ID, SR.Rupture_ID from CyberShake_Site_Ruptures SR, CyberShake_Runs R where SR.CS_Site_ID=R.Site_ID and SR.ERF_ID=R.ERF_ID and R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Ruptures &amp;gt; ruptures.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where '(Rup_Var_Scenario_ID, ERF_ID) in (select R.Rup_Var_Scenario_ID, R.ERF_ID from CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt;) and (Source_ID, Rupture_ID) in (select distinct SR.Source_ID, SR.Rupture_ID from CyberShake_Site_Ruptures SR, CyberShake_Runs R where SR.CS_Site_ID=R.Site_ID and SR.ERF_ID=R.ERF_ID and R.Study_ID=&amp;lt;study id to archive&amp;gt;)' CyberShake Rupture_Variations &amp;gt; rupture_variations.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p --where 'AR_Hazard_Dataset_ID in (select distinct D.AR_Hazard_Dataset_ID from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;gt;=(select distinct D.Min_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lat&amp;lt;=(select distinct D.Max_Lat from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;gt;=(select distinct D.Min_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID) and Lon&amp;lt;=(select distinct D.Max_Lon from AR_Hazard_Datasets D, CyberShake_Runs R where R.Study_ID=&amp;lt;study id to archive&amp;gt; and D.Time_Span_ID=1 and D.Prob_Model_ID=1 and D.ERF_ID=R.ERF_ID and D.Velocity_Model_ID=R.Velocity_Model_ID)' CyberShake AR_Hazard_Curves &amp;gt; ar_hazard_curves.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;mysqldump --lock-all-tables --skip-add-drop-table -u cybershk -p CyberShake AR_Hazard_Datasets Atten_Rel_Metadata Atten_Rels CyberShake_Site_Regions CyberShake_Site_Types CyberShake_Sites ERF_IDs ERF_Metadata ERF_Probability_Models IM_Types Rupture_Variation_Probability_Modifier Rupture_Variation_Scenario_IDs Rupture_Variation_Scenario_Metadata Rup_Var_Seeds SGT_Variation_IDs SGT_Variation_Metadata Studies Time_Spans Velocity_Models &amp;gt; input_tables.sql&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
&amp;lt;/ul&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
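The per-study dump commands above differ only in the table name and the WHERE clause, so they are easy to generate from a small script. A minimal sketch (the `build_dump_cmd` helper and the `FILTERS` mapping are hypothetical, not part of CyberShake tooling; only three tables are shown):

```python
# Sketch: generate the per-study mysqldump commands used above.
# build_dump_cmd and FILTERS are illustrative helpers, not CyberShake tools.

FILTERS = {
    "PeakAmplitudes": (
        "Run_ID in (select R.Run_ID from CyberShake_Runs R "
        "where R.Study_ID={sid})"
    ),
    "Hazard_Datasets": "Study_ID={sid}",
    "CyberShake_Runs": "Study_ID={sid}",
}

def build_dump_cmd(table, study_id, db="CyberShake", user="cybershk"):
    """Return a mysqldump command string for one table, filtered to one study."""
    where = FILTERS[table].format(sid=study_id)
    return (
        f"mysqldump --lock-all-tables --skip-add-drop-table "
        f"-u {user} -p --where '{where}' {db} {table}"
    )

cmd = build_dump_cmd("PeakAmplitudes", 10)
```

Generating the commands this way keeps the study ID substitution in one place instead of hand-editing each `--where` clause.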
&lt;br /&gt;
&amp;lt;b&amp;gt;Next, convert the SQL dumps into SQLite format using mysql2sqlite:&amp;lt;/b&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;For each of the dump files, run:&amp;lt;/li&amp;gt;&lt;br /&gt;
  mysql2sqlite &amp;lt;SQL dump file&amp;gt; &amp;gt; &amp;lt;SQLite dump file&amp;gt;&lt;br /&gt;
  Example:  ./mysql2sqlite peak_amps.sql &amp;gt; peak_amps.sqlite&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
For large tables this may take an hour or two.&lt;br /&gt;
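mysql2sqlite's job is to turn MySQL dump SQL into SQL that SQLite will accept. As a toy illustration of the kind of rewriting involved (this is not a substitute for the real script, which handles many more cases), consider stripping MySQL-only table options and backtick quoting:

```python
import re

def toy_mysql_to_sqlite(stmt):
    """Toy illustration of the kind of rewriting mysql2sqlite performs:
    drop MySQL-only table options and replace backtick quoting.
    Not a substitute for the real script."""
    stmt = re.sub(r"\)\s*ENGINE=\w+[^;]*;", ");", stmt)   # strip ENGINE=... options
    stmt = re.sub(r"\bAUTO_INCREMENT\b", "", stmt)        # SQLite manages rowids itself
    return stmt.replace("`", '"')                         # backticks -> double quotes

out = toy_mysql_to_sqlite(
    "CREATE TABLE `Runs` (`Run_ID` int AUTO_INCREMENT) ENGINE=MyISAM DEFAULT CHARSET=latin1;"
)
```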
&lt;br /&gt;
&amp;lt;b&amp;gt;Create a SQLite database and import the tables.  Use sqlite3 3.7.11 or later; older versions do not support the multi-row INSERT statements in the dump files.  If you get errors about too many entries, upgrade to a more recent version and try again.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Create a database for this study:&amp;lt;/li&amp;gt;&lt;br /&gt;
  sqlite3 &amp;lt;study name&amp;gt;.sqlite&lt;br /&gt;
&amp;lt;li&amp;gt;For each dump file, run the following command:&amp;lt;/li&amp;gt;&lt;br /&gt;
  .read &amp;lt;path/to/dump/file.sqlite&amp;gt;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
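The create-and-.read steps can equally be scripted. A minimal sketch using Python's sqlite3 module, with a tiny inline dump standing in for a real converted file:

```python
import sqlite3

# Tiny stand-in for a converted dump file (a real one comes from mysql2sqlite).
dump_sql = """
CREATE TABLE CyberShake_Runs (Run_ID INTEGER, Study_ID INTEGER);
INSERT INTO CyberShake_Runs VALUES (1, 10), (2, 10);
"""

conn = sqlite3.connect(":memory:")   # use '<study name>.sqlite' for a real archive
conn.executescript(dump_sql)         # equivalent of the sqlite3 CLI's .read command
n_runs = conn.execute("SELECT count(*) FROM CyberShake_Runs").fetchone()[0]
```

Note the multi-row INSERT in the stand-in dump: this is the syntax that requires SQLite 3.7.11 or later.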
&lt;br /&gt;
&amp;lt;b&amp;gt;Run a few queries against the original tables in the MySQL database and against the SQLite database, and check that the row counts match.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Check the number of PeakAmplitudes:&amp;lt;/li&amp;gt;&lt;br /&gt;
  select count(*) from PeakAmplitudes P, CyberShake_Runs R where R.Study_ID=&amp;lt;study_id&amp;gt; and P.Run_ID=R.Run_ID;&lt;br /&gt;
&amp;lt;li&amp;gt;Check the number of rupture variations for each site:&amp;lt;/li&amp;gt;&lt;br /&gt;
  select count(*) from CyberShake_Runs R, CyberShake_Site_Ruptures SR, Rupture_Variations V where R.Site_ID=SR.CS_Site_ID and SR.Source_ID=V.Source_ID and SR.Rupture_ID=V.Rupture_ID and R.ERF_ID=V.ERF_ID and R.ERF_ID=SR.ERF_ID and V.Rup_Var_Scenario_ID=R.Rup_Var_Scenario_ID and R.Study_ID=&amp;lt;study id&amp;gt;;&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
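A simple way to do this check is to run the same aggregate query against both databases and compare the results. A sketch, with two in-memory SQLite databases standing in for the MySQL source and the SQLite archive:

```python
import sqlite3

def table_count(conn, query):
    """Run a count(*) query and return the single integer result."""
    return conn.execute(query).fetchone()[0]

# Two toy databases standing in for the source DB and the SQLite archive.
src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.executescript(
        "CREATE TABLE PeakAmplitudes (Run_ID INTEGER, IM_Type_ID INTEGER);"
        "INSERT INTO PeakAmplitudes VALUES (1, 21), (1, 21), (2, 21);"
    )

query = "SELECT count(*) FROM PeakAmplitudes"
counts_match = table_count(src, query) == table_count(dst, query)
```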
&lt;br /&gt;
&amp;lt;b&amp;gt;Create an index on Run_ID and IM_Type_ID in the PeakAmplitudes table to speed up access.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Copy the original database file to a new file.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Set the SQLITE_TMPDIR variable to point to a filesystem with lots of free space.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Using a current version of sqlite3, open the new database file and run:&amp;lt;/li&amp;gt;&lt;br /&gt;
  CREATE INDEX &amp;quot;idx_PeakAmplitudes_Run_ID_IM_Type_ID&amp;quot; ON &amp;quot;PeakAmplitudes&amp;quot; (`Run_ID`, `IM_Type_ID`);&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
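The index-creation step can also be scripted. A minimal sketch with a toy PeakAmplitudes table (for a real, large database, open the copied file instead of `:memory:` and set SQLITE_TMPDIR in the environment first):

```python
import sqlite3

conn = sqlite3.connect(":memory:")   # open the copied database file in practice
conn.execute(
    "CREATE TABLE PeakAmplitudes (Run_ID INTEGER, IM_Type_ID INTEGER, IM_Value REAL)"
)
conn.execute(
    'CREATE INDEX "idx_PeakAmplitudes_Run_ID_IM_Type_ID" '
    'ON "PeakAmplitudes" (Run_ID, IM_Type_ID)'
)
# Confirm the index now exists in the schema catalog.
names = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='index'"
)]
```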
&lt;br /&gt;
&amp;lt;b&amp;gt;Move the files to the archive location on project.&amp;lt;/b&amp;gt;&lt;br /&gt;
&amp;lt;ol&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Tar up the sqlite files:&amp;lt;/li&amp;gt;&lt;br /&gt;
  tar czvf &amp;lt;study_name&amp;gt;.tgz *.sqlite&lt;br /&gt;
&amp;lt;li&amp;gt;SFTP the tgz file to the study sqlite archive location on project at CARC (/project/scec_608/cybershake/results/sqlite_studies/&amp;lt;study name&amp;gt;), using hpc-transfer1.&amp;lt;/li&amp;gt;&lt;br /&gt;
&amp;lt;li&amp;gt;Extract the database to /project/scec_608/cybershake/results/sqlite_studies/&amp;lt;study_name&amp;gt;:&amp;lt;/li&amp;gt;&lt;br /&gt;
  tar xzvf sqlite_dumps/&amp;lt;study_name&amp;gt;.tgz &amp;lt;study_name&amp;gt;.sqlite&lt;br /&gt;
&amp;lt;/ol&amp;gt;&lt;br /&gt;
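The tar-and-extract steps can be sketched with Python's tarfile module. Paths and the study name here are temporary stand-ins for the CARC project directory:

```python
import pathlib
import tarfile
import tempfile

workdir = pathlib.Path(tempfile.mkdtemp())
db_file = workdir / "study_example.sqlite"   # hypothetical study name
db_file.write_bytes(b"placeholder")

# Equivalent of: tar czvf <study_name>.tgz *.sqlite
archive = workdir / "study_example.tgz"
with tarfile.open(archive, "w:gz") as tgz:
    for f in workdir.glob("*.sqlite"):
        tgz.add(f, arcname=f.name)

# Equivalent of: tar xzvf <study_name>.tgz <study_name>.sqlite
extract_dir = workdir / "extracted"
with tarfile.open(archive, "r:gz") as tgz:
    tgz.extract("study_example.sqlite", path=extract_dir)

restored = (extract_dir / "study_example.sqlite").read_bytes()
```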
&lt;br /&gt;
== Move study from production DB to data access DB ==&lt;/div&gt;</summary>
		<author><name>Scottcal</name></author>
		
	</entry>
</feed>