<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://strike.scec.org/scecwiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Bhatthal</id>
	<title>SCECpedia - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://strike.scec.org/scecwiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Bhatthal"/>
	<link rel="alternate" type="text/html" href="https://strike.scec.org/scecpedia/Special:Contributions/Bhatthal"/>
	<updated>2026-04-14T23:41:41Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.34.2</generator>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Publishing_UCERF3-ETAS_Event_Reports&amp;diff=30743</id>
		<title>Publishing UCERF3-ETAS Event Reports</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Publishing_UCERF3-ETAS_Event_Reports&amp;diff=30743"/>
		<updated>2026-04-08T20:48:03Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Publishing the UCERF3-ETAS Forecast */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;After successfully running a UCERF3-ETAS simulation and generating plots, the resulting UCERF3-ETAS forecast can be published as part of a larger SCEC Event Page. This page outlines the process by which results are published.&lt;br /&gt;
&lt;br /&gt;
Refer to [[UCERF3-ETAS Measurements]] for detailed instructions on how to run simulations and generate plots across HPC systems. The following examples publish results directly from the HPC system on which the computation ran, although results can be published from any system with an internet connection.&lt;br /&gt;
&lt;br /&gt;
== Creating and Updating SCEC Event Pages ==&lt;br /&gt;
SCEC Event Pages detail earthquake events. For a given mainshock, the recorded magnitude, time, location, and aftershock sequence are recorded. Identify an event by its USGS ID and use it to generate an event page on the SCEC.org website at https://central.scec.org/earthquakes/eventpage/generate. (See Figure 1).&lt;br /&gt;
&lt;br /&gt;
After generating an event page, it should be populated and available to view from the SCEC Event Pages list. At this point the page is not yet published and isn't available to the public. Navigate to the &amp;quot;View&amp;quot; link to view the event page. (See Figure 2).&lt;br /&gt;
Under the Table of Contents, you shouldn't see a &amp;quot;UCERF3-ETAS Forecast&amp;quot; section yet. After generating your results and making them available, you can update the event page by selecting the &amp;quot;Regenerate Page with Latest Data&amp;quot; button. (See Figure 3)&lt;br /&gt;
&lt;br /&gt;
[[File:SCEC Event Page Generator.png|400px|thumb|Fig 1. SCEC Event Page Generator available at scec.org]]&lt;br /&gt;
[[File:Malibu EQ.png|400px|thumb|Fig 2. SCEC Event Page for M4 earthquake near Malibu, CA]]&lt;br /&gt;
[[File:UCERF3-ETAS Forecast in Event Page.png|400px|thumb|Fig 3. UCERF3-ETAS Forecast as seen on the Event Page]]&lt;br /&gt;
&lt;br /&gt;
For any errors generating SCEC Event Pages, refer to [[SCEC Event Page Troubleshooting]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Git Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
Pushing your changes upstream requires an SSH key on Frontera that is added to your GitHub account. You must also use a GitHub account that is authorized to push changes directly to the ucerf3-etas-results/master branch.&lt;br /&gt;
&lt;br /&gt;
You can request edit permissions from the repository owner, Akash Bhatthal &amp;lt;[mailto:bhatthal@usc.edu bhatthal@usc.edu]&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Consider the following brief instructions on getting set up with SSH.&lt;br /&gt;
&lt;br /&gt;
1. Check if you already have an SSH key generated.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;ls ~/.ssh/id_rsa.pub&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If not, generate one with &amp;lt;code&amp;gt;ssh-keygen&amp;lt;/code&amp;gt;. You may have a non-RSA public key, so check for any file ending in &amp;lt;code&amp;gt;.pub&amp;lt;/code&amp;gt;.&lt;br /&gt;
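The check in step 1 can be sketched as a small shell snippet; the paths below are the usual OpenSSH defaults, so adjust them if your keys live elsewhere:

```shell
# Report whether any public key already exists under ~/.ssh
if ls ~/.ssh/*.pub 1>/dev/null 2>/dev/null; then
  msg="Existing public key found; reuse it"
else
  msg="No key found; generate one with ssh-keygen"
fi
echo "$msg"
```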
&lt;br /&gt;
&lt;br /&gt;
2. Copy the contents of this public SSH key to your clipboard.&amp;lt;br/&amp;gt;&lt;br /&gt;
You can either select it directly from your Terminal, or copy the file over SSH.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. Add as an authenticated key on your GitHub account.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Go to GitHub Settings -&amp;gt; SSH and GPG Keys -&amp;gt; New SSH key&lt;br /&gt;
and paste the public key you copied. It should start with &amp;lt;code&amp;gt;ssh-rsa&amp;lt;/code&amp;gt;.&lt;br /&gt;
Be careful not to paste the private key; the public key is in the file&lt;br /&gt;
ending in &amp;lt;code&amp;gt;.pub&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
4. If you cloned the repository with HTTPS instead of SSH, then you need to update your remote accordingly.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;git remote set-url origin git@github.com:(your_user_name)/ucerf3-etas-results.git&amp;lt;/code&amp;gt;&lt;br /&gt;
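The remote switch in step 4 can be tried safely in a throwaway repository; the URLs below use a placeholder user name, not the real remote:

```shell
# Demonstrate switching a remote from HTTPS to SSH in a temporary repo
repo=$(mktemp -d)
cd "$repo"
git init -q .
git remote add origin https://github.com/example/ucerf3-etas-results.git
git remote set-url origin git@github.com:example/ucerf3-etas-results.git
git remote -v   # both fetch and push now use the SSH form
```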
&lt;br /&gt;
&lt;br /&gt;
5. If your account has sufficient permissions, you should be able to push directly upstream to ucerf3-etas-results/master.&lt;br /&gt;
&lt;br /&gt;
== Publishing the UCERF3-ETAS Forecast ==&lt;br /&gt;
&lt;br /&gt;
Generate your UCERF3-ETAS forecast for a given earthquake event following instructions available at [[UCERF3-ETAS Measurements]].&lt;br /&gt;
Our generated results are pushed to a private GitHub repository (SCECcode/ucerf3-etas-results) that is directly read by the SCEC Event Page. You must be granted permission to view and contribute to this repository to continue publishing UCERF3-ETAS results for use in SCEC Event Pages.&lt;br /&gt;
&lt;br /&gt;
1) Clone this repository to your home directory with the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;cd $HOME &amp;amp;&amp;amp; git clone git@github.com:SCECcode/ucerf3-etas-results.git&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
2) Get the scripts&lt;br /&gt;
&lt;br /&gt;
We can't directly copy results into this repository. The data is prepared using the u3etas_jar_wrapper shell script. You should already have these scripts downloaded and on your PATH. If not, clone the repository:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;git clone https://github.com/opensha/ucerf3-etas-launcher.git&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
3) Ensure that you have selected the correct Event ID and directories. See the ComcatReportPageGen Usage section below for an explanation of the parameters used.&lt;br /&gt;
&lt;br /&gt;
* Set environment variables in bash config&lt;br /&gt;
* Ensure you're on an interactive node&lt;br /&gt;
* Update your bash config with:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
module load sdsc&lt;br /&gt;
module load cpu/0.15.4&lt;br /&gt;
module use /cm/shared/apps/spack/0.21.2/cpu/a/share/spack/lmod/linux-rocky8-x86_64/Core&lt;br /&gt;
module load openjdk/17.0.8.1_1&lt;br /&gt;
export PROJFS=/expanse/lustre/projects/usc143/$USER&lt;br /&gt;
export ETAS_LAUNCHER=$PROJFS/ucerf3/ucerf3-etas-launcher&lt;br /&gt;
export ETAS_SIM_DIR=$PROJFS/ucerf3/u3etas_sim&lt;br /&gt;
export ETAS_MEM_GB=5&lt;br /&gt;
export MPJ_HOME=$PROJFS/mpj-express&lt;br /&gt;
export PATH=$ETAS_LAUNCHER/parallel/slurm_sbin/:$ETAS_LAUNCHER/sbin/:$MPJ_HOME/bin:$PATH&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
4) Connect to an interactive node&lt;br /&gt;
&amp;lt;code&amp;gt;srun --partition=debug --pty --account=usc143 --nodes=1 --ntasks-per-node=4 --mem=32G -t 00:30:00 --wait=0 --export=ALL /bin/bash&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
5) Ensure that you have sufficient memory for execution on the interactive node you reserved by setting &amp;lt;code&amp;gt;ETAS_MEM_GB=32&amp;lt;/code&amp;gt;:&lt;br /&gt;
* export ETAS_MEM_GB=32&lt;br /&gt;
This doesn't have to be in your bash config; just execute it directly. Increase the memory if you encounter an OutOfMemoryError during execution.&lt;br /&gt;
&lt;br /&gt;
6) Run ComcatReportPageGen&lt;br /&gt;
The following example assumes the ucerf3-etas-results repository is cloned in our home directory on Expanse. Execution updates the local repository, after which you can &amp;lt;code&amp;gt;git push&amp;lt;/code&amp;gt; the changes upstream, given sufficient permissions. Note that pushed changes don't update an already generated SCEC Event Page; the page will have to be regenerated.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
u3etas_jar_wrapper.sh org.opensha.commons.data.comcat.plot.ComcatReportPageGen --event-id ci41075584 \&lt;br /&gt;
&lt;br /&gt;
--min-mag 0d --radius 50 --output-parent-dir /home1/10177/bhatthal/ucerf3-etas-results \&lt;br /&gt;
&lt;br /&gt;
--etas-dir $ETAS_SIM_DIR/frontera-comcat-malibu-m3.9-n14-s100000 \&lt;br /&gt;
&lt;br /&gt;
--etas-output-dir /home1/10177/bhatthal/ucerf3-etas-results/ucerf3-etas&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You will need to update the above command with the relevant absolute paths on your system. If you encounter issues with the specified outputDir, remove the conflicting &amp;quot;outputDir&amp;quot; value in your simulation's `config.json` and try again.&lt;br /&gt;
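If you need to locate the conflicting value before removing it by hand, a quick check works; the config contents below are hypothetical, standing in for a real simulation's config.json:

```shell
# Find any "outputDir" entry in a simulation config
cfg=$(mktemp)
printf '{"outputDir": "/old/path", "duration": 10.0}\n' > "$cfg"
grep -n '"outputDir"' "$cfg"
```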
&lt;br /&gt;
If you are unable to identify the event-id, try passing it with the shortened parameter and ensure there is no space, i.e. `-eci41075584`.&lt;br /&gt;
&lt;br /&gt;
7) Commit the changes in `ucerf3-etas-results` and push upstream to origin/master.&lt;br /&gt;
See the Git Troubleshooting section if you're unable to do this.&lt;br /&gt;
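The commit flow in step 7 can be sketched as follows. The snippet runs in a throwaway repository with a placeholder file name, and omits the final push, since that requires the real remote and permissions:

```shell
# Stage and commit new results (demonstrated in a temporary repo)
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email "demo@example.com"
git config user.name "Demo User"
echo "forecast placeholder" > ci41075584.md
git add ci41075584.md
git commit -q -m "Add UCERF3-ETAS forecast for ci41075584"
# In the real repository you would follow with: git push origin master
```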
&lt;br /&gt;
== ComcatReportPageGen Usage ==&lt;br /&gt;
The [https://github.com/opensha/ucerf3-etas-launcher/blob/master/sbin/u3etas_jar_wrapper.sh u3etas_jar_wrapper.sh] shell script is used to execute any Java application in the provided OpenSHA Jar. In this case, we're executing the [https://github.com/opensha/opensha/blob/master/src/main/java/org/opensha/commons/data/comcat/plot/ComcatReportPageGen.java ComcatReportPageGen] application.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
usage: ComcatReportPageGen [-?] [-d &amp;lt;arg&amp;gt;] -e &amp;lt;arg&amp;gt; [-eod &amp;lt;arg&amp;gt;] [-etas &amp;lt;arg&amp;gt;] [-m &amp;lt;arg&amp;gt;] [-o &amp;lt;arg&amp;gt;] [-opd &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                        Display this message&lt;br /&gt;
 -d,--days-before &amp;lt;arg&amp;gt;           Number of days of events before the mainshock to fetch (default: 3)&lt;br /&gt;
 -e,--event-id &amp;lt;arg&amp;gt;              ComCat event id, e.g. 'ci39126079'&lt;br /&gt;
 -eod,--etas-output-dir &amp;lt;arg&amp;gt;     If supplied, ETAS only results will also be written to &amp;lt;path&amp;gt;/&amp;lt;event-id&amp;gt;&lt;br /&gt;
 -etas,--etas-dir &amp;lt;arg&amp;gt;           Path to a UCERF3-ETAS simulation directory&lt;br /&gt;
 -m,--min-mag &amp;lt;arg&amp;gt;               Minimum magnitude of events to fetch (default: 0.0)&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;            Output directory. Must supply either this or --output-parent-dir&lt;br /&gt;
 -opd,--output-parent-dir &amp;lt;arg&amp;gt;   Output parent directory. The directory name will be generated automatically from the&lt;br /&gt;
                                  event name, date, and magnitude. Must supply either this or --output-dir&lt;br /&gt;
 -r,--radius &amp;lt;arg&amp;gt;                Search radius around mainshock for aftershocks. Default is the greater of 10.0 km and&lt;br /&gt;
                                  twice the Wells &amp;amp; Coppersmith (1994) median rupture length for the mainshock magnitude&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In our example, we specify the absolute path to where we cloned the ucerf3-etas-results repository, as well as the absolute path to a UCERF3-ETAS results directory within the ucerf3-etas-results repository. Our results are written into two folders to allow us to filter by either date or Event ID.&lt;br /&gt;
&lt;br /&gt;
If running on Expanse, copy any plots from scratch into your own account, as you can't run in someone else's folder. You can create a tarball and copy the whole folder into your account. The same logic applies to any other HPC system. Ensure you execute the script on an interactive node, not on the head node.&lt;br /&gt;
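The tarball copy can be sketched as follows, using temporary directories in place of the real scratch and home paths:

```shell
# Bundle a results directory and unpack a copy elsewhere
src=$(mktemp -d)/comcat-example
mkdir -p "$src"
echo "demo plot" > "$src/plot.txt"
dest=$(mktemp -d)
tar -czf "$dest/results.tar.gz" -C "$(dirname "$src")" "$(basename "$src")"
tar -xzf "$dest/results.tar.gz" -C "$dest"
ls "$dest/comcat-example"
```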
&lt;br /&gt;
&lt;br /&gt;
== Workflow Automation and Potential Challenges ==&lt;br /&gt;
This process of getting our UCERF3-ETAS forecasts into SCEC Event Pages could be automated for jobs run on Quakeworx.&lt;br /&gt;
&lt;br /&gt;
We face the following challenges in doing so:&lt;br /&gt;
* Not all Quakeworx users should have permission to publish results for SCEC Event Pages&lt;br /&gt;
* If there's an existing result, does the latest run overwrite it? Or is a new UI needed for selecting which result to use?&lt;br /&gt;
* We would have to start tagging commits to track overwritten events and update the web service to checkout accordingly&lt;br /&gt;
* SCEC Event Page regeneration would need to be triggered from Quakeworx. &lt;br /&gt;
&lt;br /&gt;
Handling of permissions for a Quakeworx GitHub account, structural changes for storing and reading results from the event-reports repository, and implementing a &amp;quot;Write to Event Page&amp;quot; boolean field on job submission are all feasible. Implementing a UI to retroactively change the selected forecast and trigger page regeneration would require further investigation into the capabilities of the Quakeworx framework and whether there's an API for externally triggering the generator. These changes could improve the user experience and make publishing UCERF3-ETAS forecasts easier, without requiring knowledge of a Linux terminal.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Quakeworx ==&lt;br /&gt;
Results can be published even when generated on Quakeworx, not just via the command line.&lt;br /&gt;
Results are stored inside the Quakeworx development (qwxdev) scratch directory, where each user has a unique ID under which their jobs are written. Below is a list of known Drupal user IDs (drupaluid) for users who frequently run UCERF3-ETAS on Quakeworx.&lt;br /&gt;
* Phil Maechling: 6&lt;br /&gt;
* Fabio Silva: 7&lt;br /&gt;
* Akash Bhatthal: 20&lt;br /&gt;
* Scott Callaghan: 22&lt;br /&gt;
&lt;br /&gt;
For example, a job Phil ran on Quakeworx labelled &amp;quot;DeepSprings_EQ&amp;quot; would be found at &amp;lt;code&amp;gt;/expanse/lustre/scratch/qwxdev/temp_project/qwx1/users/drupaluid_6/jobs/DeepSprings_EQ/&amp;lt;/code&amp;gt;.&lt;br /&gt;
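The job path follows a fixed pattern, so it can be built from the Drupal UID and job label; the values below come from Phil's example above:

```shell
# Construct a Quakeworx job directory path from a Drupal UID and job name
drupal_uid=6
job_name="DeepSprings_EQ"
job_dir="/expanse/lustre/scratch/qwxdev/temp_project/qwx1/users/drupaluid_${drupal_uid}/jobs/${job_name}/"
echo "$job_dir"
```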
&lt;br /&gt;
To run ComcatReportPageGen, copy the event directory from scratch into your own home directory. There you can modify the config.json configuration as necessary and execute the generator (see step 6) with an updated etas-dir.&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30742</id>
		<title>SCEC Event Page Troubleshooting</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30742"/>
		<updated>2026-04-02T22:35:20Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page outlines all encountered bugs with the process of generating SCEC Event Pages. Each bug should have a header with a succinct summary and the date it was first encountered. Bugs that have been resolved will have &amp;quot;'''(Resolved)'''&amp;quot; appended to its header.&lt;br /&gt;
&lt;br /&gt;
== Apr 2 2026 - Failure to generate &amp;quot;M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
&lt;br /&gt;
[[Image:error-generating-m4.6-boulder-creek-20260402.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Boulder Creek event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The bug was identified in the ComcatReportPageGen tool. Merging pager images fails due to mismatched image widths. Updating the bounds resolves the bug. (See [https://github.com/opensha/opensha/pull/223 #223] for details)&lt;br /&gt;
&lt;br /&gt;
After testing locally with Docker eclipse-temurin:17-jdk using the OpenSHA fat jar on opensha/opensha:master at commit c2e5c3da8fa74d9cc46b73d9f0d6d9671b34285c, we are able to generate the page successfully. The fat jar in repository opensha-event-reports is updated accordingly and changes are pulled on the server for production.&lt;br /&gt;
&lt;br /&gt;
[[Image:boulder-creek-generated-20260402.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 2. Boulder Creek event page generated successfully'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Nov 4 2025 - Failure to generate &amp;quot;M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
The error is ambiguous: “error generating page” and simply “error” using the “Enter USGS ID to Generate Page” and “Recent Earthquakes” Generate buttons. (See Fig 1.)&lt;br /&gt;
The bug is only encountered with this specific event: we were able to generate an event page for 10/23/2025 Woodside, but not 10/23/2025 Yucaipa.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-error-msg.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Yucaipa event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Chrome Developer Tools allow for analysis of the JSON error message. (See Fig. 2)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-err-json-response.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 2. Resolve ambiguous error message with Chrome Developer Tools'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
A match for the error detail (See Fig 3.) is found in the &amp;lt;code&amp;gt;earthquake_event_page.php&amp;lt;/code&amp;gt; script at function &amp;lt;code&amp;gt;scec_earthquake_event_page_generate($event_id)&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-eq-event-page-php-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 3. Error detail match found in code'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This error is returned due to a non-zero return value from the execution of &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt;. Upon analysis of the Docker logs, we see the failure occurs in the Docker execution of &amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt;. (See Fig. 4)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-update-eq-event-report-sh-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 4. update_eq_event_report.sh invokes Docker Java 8 for generate_report.sh'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; invokes the OpenSHA &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; Java command-line tool with appropriate parameters.&lt;br /&gt;
&lt;br /&gt;
See logs below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
Array(&lt;br /&gt;
    [0] =&amp;gt; cd /data/scectmp&lt;br /&gt;
    [1] =&amp;gt; run docker&lt;br /&gt;
    [2] =&amp;gt; https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ci41113519.geojson&lt;br /&gt;
    [3] =&amp;gt; Count of events received = 1&lt;br /&gt;
    [4] =&amp;gt; WC 1994 Radius: 0.08038963&lt;br /&gt;
    [5] =&amp;gt; Reverting to min radius of 10.0&lt;br /&gt;
    [6] =&amp;gt; Mainshock is a M3.08&lt;br /&gt;
    [7] =&amp;gt;         Hypocenter: 34.11083, -116.98350, 13.42000&lt;br /&gt;
    [8] =&amp;gt; Place name: 10 km NNE of Yucaipa, CA&lt;br /&gt;
    [9] =&amp;gt; Fetching 3.0 days of foreshocks&lt;br /&gt;
    [10] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-10-24T03:12:37.610Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-21T03:12:37.610Z&lt;br /&gt;
    [11] =&amp;gt; Count of events received = 0&lt;br /&gt;
    [12] =&amp;gt; Count of events after filtering = 0&lt;br /&gt;
    [13] =&amp;gt; Total number of events returned = 0&lt;br /&gt;
    [14] =&amp;gt; Found 0 foreshocks, maxMag=-Infinity&lt;br /&gt;
    [15] =&amp;gt; Fetching aftershocks&lt;br /&gt;
    [16] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-11-05T17:03:42.497Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-24T03:12:37.610Z&lt;br /&gt;
    [17] =&amp;gt; Count of events received = 4&lt;br /&gt;
    [18] =&amp;gt; Count of events after filtering = 2&lt;br /&gt;
    [19] =&amp;gt; Events filtered due to conversion = 0, location = 1, id = 1&lt;br /&gt;
    [20] =&amp;gt; Total number of events returned = 2&lt;br /&gt;
    [21] =&amp;gt; Found 2 aftershocks, maxMag=1.2&lt;br /&gt;
    [22] =&amp;gt; Output dir: /reports/ci41113519&lt;br /&gt;
    [23] =&amp;gt; URL: https://earthquake.usgs.gov/earthquakes/eventpage/ci41113519&lt;br /&gt;
    [24] =&amp;gt; Shakemap image: https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg&lt;br /&gt;
    [25] =&amp;gt; DYFI image: https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg&lt;br /&gt;
    [26] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg to /reports/ci41113519/resources/ci41113519_shakemap.jpg&lt;br /&gt;
    [27] =&amp;gt; DONE&lt;br /&gt;
    [28] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg to /reports/ci41113519/resources/ci41113519_dyfi.jpg&lt;br /&gt;
    [29] =&amp;gt; DONE&lt;br /&gt;
    [30] =&amp;gt; Loading FM from cached file: FM3_1.xml&lt;br /&gt;
    [31] =&amp;gt; Loading FM from cached file: FM3_2.xml&lt;br /&gt;
    [32] =&amp;gt; java.lang.IllegalStateException: Min data mag is non-finite: NaN&lt;br /&gt;
    [33] =&amp;gt;     at com.google.common.base.Preconditions.checkState(Preconditions.java:588)&lt;br /&gt;
    [34] =&amp;gt;  at org.opensha.commons.data.comcat.plot.ComcatDataPlotter.plotMagTimeFunc(ComcatDataPlotter.java:873)&lt;br /&gt;
    [35] =&amp;gt;    at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.generateReport(ComcatReportPageGen.java:384)&lt;br /&gt;
    [36] =&amp;gt;   at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.main(ComcatReportPageGen.java:1418)&lt;br /&gt;
    [37] =&amp;gt; quitting with error)&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here is a snippet of &amp;lt;code&amp;gt;ComcatDataPlotter&amp;lt;/code&amp;gt;, where the error is thrown. (See Fig. 5)&lt;br /&gt;
&lt;br /&gt;
[[Image:Comcatdataplotter-fail-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 5. Failed Precondition in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
It resolves to NaN because minDataMag is initialized to infinity and we failed to find a finite value from any of mainshock.getMag, foreshocksFunc.getMinY, or aftershocksFunc.getMinY.&lt;br /&gt;
&lt;br /&gt;
For this specific event, we appear to be failing to retrieve any foreshocks, aftershocks, or even the mainshock. Evidently there was a mainshock, so this must either be an error in our geodata-parsing code or a malformed GeoJSON.&lt;br /&gt;
&lt;br /&gt;
The mainshock was retrieved successfully. We didn’t find any foreshocks 3 days before the mainshock, and we’ve observed 2 aftershocks in total between the date of the mainshock (2025-10-24) and the date of the attempted event page generation (2025-11-05).&lt;br /&gt;
&lt;br /&gt;
Clearly we have this data, so minDataMag should be finite.&lt;br /&gt;
&lt;br /&gt;
After debugging ComcatReportPageGen locally, I’ve determined the issue.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
aftershocksFunc.getMinY() is returning NaN. The minimum of NaN and infinity evaluates to NaN, and later, even though mainshock.getMag() evaluates to 3.08, Math.min(NaN, 3.08) also resolves to NaN. I don’t believe we should see NaN summary statistics for our aftershocksFunc, but adding checks for this allows pages to build.&lt;br /&gt;
&lt;br /&gt;
See the modified code that we now use on the Central SCEC Server for page generation: https://github.com/abhatthal/opensha-fork/tree/bugfix/comcat-report-page-gen/nonfinite-min-data-mag&lt;br /&gt;
This code will be merged into the OpenSHA codebase with a PR after review. Once merged, a new Jar will need to be built and deployed to the Central SCEC Server.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucapia-comcatdataplotter-fix.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 6. Add NaN checks in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
After we deployed the new Jar, we updated the docker image invoked in &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt; from &amp;lt;code&amp;gt;openjdk:8&amp;lt;/code&amp;gt; to &amp;lt;code&amp;gt;eclipse-temurin:11-jdk&amp;lt;/code&amp;gt;.&lt;br /&gt;
This executes our updated &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; successfully.&lt;br /&gt;
&lt;br /&gt;
The generated event page for Yucaipa is available at https://central.scec.org/earthquakes/eventpage/ci41113519.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
* [[Publishing_UCERF3-ETAS_Event_Reports]]&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Boulder-creek-generated-20260402.png&amp;diff=30741</id>
		<title>File:Boulder-creek-generated-20260402.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Boulder-creek-generated-20260402.png&amp;diff=30741"/>
		<updated>2026-04-02T22:34:02Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30740</id>
		<title>SCEC Event Page Troubleshooting</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30740"/>
		<updated>2026-04-02T21:06:11Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Apr 2 2026 - Failure to generate &amp;quot;M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)&amp;quot; (Resolved) */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page outlines all encountered bugs with the process of generating SCEC Event Pages. Each bug should have a header with a succinct summary and the date it was first encountered. Bugs that have been resolved will have &amp;quot;'''(Resolved)'''&amp;quot; appended to its header.&lt;br /&gt;
&lt;br /&gt;
== Apr 2 2026 - Failure to generate &amp;quot;M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
&lt;br /&gt;
[[Image:error-generating-m4.6-boulder-creek-20260402.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Boulder Creek event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The bug was identified in the ComcatReportPageGen tool. Merging pager images fails due to mismatched image widths. Updating the bounds resolves the bug. (See [https://github.com/opensha/opensha/pull/223 #223] for details)&lt;br /&gt;
&lt;br /&gt;
After testing locally with Docker eclipse-temurin:17-jdk using the OpenSHA fat jar on opensha/opensha:master at commit c2e5c3da8fa74d9cc46b73d9f0d6d9671b34285c, we are able to generate the page successfully. The fat jar in repository opensha-event-reports is updated accordingly and changes are pulled on the server for production.&lt;br /&gt;
&lt;br /&gt;
== Nov 4 2025 - Failure to generate &amp;quot;M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
The error is ambiguous: “error generating page” and simply “error” using the “Enter USGS ID to Generate Page” and “Recent Earthquakes” Generate buttons. (See Fig 1.)&lt;br /&gt;
The bug is only encountered with this specific event: we were able to generate an event page for 10/23/2025 Woodside, but not 10/23/2025 Yucaipa.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-error-msg.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Yucaipa event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Chrome Developer Tools allow for analysis of the JSON error message. (See Fig. 2)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-err-json-response.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 2. Resolve ambiguous error message with Chrome Developer Tools'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
A match for the error detail (See Fig 3.) is found in the &amp;lt;code&amp;gt;earthquake_event_page.php&amp;lt;/code&amp;gt; script at function &amp;lt;code&amp;gt;scec_earthquake_event_page_generate($event_id)&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-eq-event-page-php-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 3. Error detail match found in code'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This error is returned due to a non-zero return value from the execution of &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt;. Upon analysis of Docker logs we see this failure is in the Docker execution of &amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; (See Fig. 4)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-update-eq-event-report-sh-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 4. update_eq_event_report.sh invokes Docker Java 8 for generate_report.sh'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; invokes the Java OpenSHA &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; CLT with appropriate parameters.&lt;br /&gt;
&lt;br /&gt;
See logs below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
Array(&lt;br /&gt;
    [0] =&amp;gt; cd /data/scectmp&lt;br /&gt;
    [1] =&amp;gt; run docker&lt;br /&gt;
    [2] =&amp;gt; https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ci41113519.geojson&lt;br /&gt;
    [3] =&amp;gt; Count of events received = 1&lt;br /&gt;
    [4] =&amp;gt; WC 1994 Radius: 0.08038963&lt;br /&gt;
    [5] =&amp;gt; Reverting to min radius of 10.0&lt;br /&gt;
    [6] =&amp;gt; Mainshock is a M3.08&lt;br /&gt;
    [7] =&amp;gt;         Hypocenter: 34.11083, -116.98350, 13.42000&lt;br /&gt;
    [8] =&amp;gt; Place name: 10 km NNE of Yucaipa, CA&lt;br /&gt;
    [9] =&amp;gt; Fetching 3.0 days of foreshocks&lt;br /&gt;
    [10] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-10-24T03:12:37.610Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-21T03:12:37.610Z&lt;br /&gt;
    [11] =&amp;gt; Count of events received = 0&lt;br /&gt;
    [12] =&amp;gt; Count of events after filtering = 0&lt;br /&gt;
    [13] =&amp;gt; Total number of events returned = 0&lt;br /&gt;
    [14] =&amp;gt; Found 0 foreshocks, maxMag=-Infinity&lt;br /&gt;
    [15] =&amp;gt; Fetching aftershocks&lt;br /&gt;
    [16] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-11-05T17:03:42.497Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-24T03:12:37.610Z&lt;br /&gt;
    [17] =&amp;gt; Count of events received = 4&lt;br /&gt;
    [18] =&amp;gt; Count of events after filtering = 2&lt;br /&gt;
    [19] =&amp;gt; Events filtered due to conversion = 0, location = 1, id = 1&lt;br /&gt;
    [20] =&amp;gt; Total number of events returned = 2&lt;br /&gt;
    [21] =&amp;gt; Found 2 aftershocks, maxMag=1.2&lt;br /&gt;
    [22] =&amp;gt; Output dir: /reports/ci41113519&lt;br /&gt;
    [23] =&amp;gt; URL: https://earthquake.usgs.gov/earthquakes/eventpage/ci41113519&lt;br /&gt;
    [24] =&amp;gt; Shakemap image: https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg&lt;br /&gt;
    [25] =&amp;gt; DYFI image: https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg&lt;br /&gt;
    [26] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg to /reports/ci41113519/resources/ci41113519_shakemap.jpg&lt;br /&gt;
    [27] =&amp;gt; DONE&lt;br /&gt;
    [28] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg to /reports/ci41113519/resources/ci41113519_dyfi.jpg&lt;br /&gt;
    [29] =&amp;gt; DONE&lt;br /&gt;
    [30] =&amp;gt; Loading FM from cached file: FM3_1.xml&lt;br /&gt;
    [31] =&amp;gt; Loading FM from cached file: FM3_2.xml&lt;br /&gt;
    [32] =&amp;gt; java.lang.IllegalStateException: Min data mag is non-finite: NaN&lt;br /&gt;
    [33] =&amp;gt;     at com.google.common.base.Preconditions.checkState(Preconditions.java:588)&lt;br /&gt;
    [34] =&amp;gt;  at org.opensha.commons.data.comcat.plot.ComcatDataPlotter.plotMagTimeFunc(ComcatDataPlotter.java:873)&lt;br /&gt;
    [35] =&amp;gt;    at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.generateReport(ComcatReportPageGen.java:384)&lt;br /&gt;
    [36] =&amp;gt;   at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.main(ComcatReportPageGen.java:1418)&lt;br /&gt;
    [37] =&amp;gt; quitting with error)&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here is a snippet of &amp;lt;code&amp;gt;ComcatDataPlotter&amp;lt;/code&amp;gt;, where the error is thrown. (See Fig. 5)&lt;br /&gt;
&lt;br /&gt;
[[Image:Comcatdataplotter-fail-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 5. Failed Precondition in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The check fails because minDataMag resolves to NaN: it is initialized to positive infinity and only lowered by mainshock.getMag(), foreshocksFunc.getMinY(), and aftershocksFunc.getMinY(), so our first hypothesis was that none of these produced a value.&lt;br /&gt;
&lt;br /&gt;
For this specific event, that would mean we failed to retrieve any foreshocks, aftershocks, or even the mainshock. Evidently there was a mainshock, so this would have to be either an error in our code parsing the GeoJSON or a malformed GeoJSON response.&lt;br /&gt;
&lt;br /&gt;
The logs, however, show the mainshock was retrieved successfully. No foreshocks were found in the 3 days before the mainshock, and 2 aftershocks were observed between the date of the mainshock (2025-10-24) and the date of the attempted event page generation (2025-11-05).&lt;br /&gt;
&lt;br /&gt;
Clearly we have this data, so minDataMag should be finite.&lt;br /&gt;
&lt;br /&gt;
After debugging ComcatReportPageGen locally, I identified the root cause.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
aftershocksFunc.getMinY() returns NaN, and the minimum of NaN and positive infinity is NaN. Later, even though mainshock.getMag() evaluates to 3.08, Math.min(NaN, 3.08) also evaluates to NaN. We shouldn’t see NaN summary statistics from aftershocksFunc in the first place, but adding checks for non-finite values allows pages to build.&lt;br /&gt;
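This propagation is easy to reproduce: Java's Math.min returns NaN whenever either argument is NaN, so a single non-finite summary statistic poisons the running minimum. The sketch below shows the failure mode and the style of guard that fixes it; the method and variable names here are illustrative, not the actual OpenSHA code.

```java
public class MinMagDemo {
    /**
     * Running-minimum update that skips non-finite candidates,
     * in the spirit of the NaN checks added to ComcatDataPlotter.
     * (Illustrative sketch; not the actual OpenSHA implementation.)
     */
    static double updateMin(double current, double candidate) {
        if (!Double.isFinite(candidate))
            return current; // ignore NaN/infinite summary statistics
        return Math.min(current, candidate);
    }

    public static void main(String[] args) {
        // Unguarded: one NaN absorbs every later comparison.
        double unguarded = Math.min(Double.POSITIVE_INFINITY, Double.NaN); // NaN
        unguarded = Math.min(unguarded, 3.08); // still NaN -> precondition fails
        System.out.println("unguarded minDataMag = " + unguarded);

        // Guarded: the NaN aftershock minimum is skipped entirely.
        double minDataMag = Double.POSITIVE_INFINITY;
        minDataMag = updateMin(minDataMag, Double.NaN); // still +Infinity
        minDataMag = updateMin(minDataMag, 3.08);       // mainshock magnitude
        System.out.println("guarded minDataMag = " + minDataMag); // 3.08
    }
}
```

With the guard in place, minDataMag stays finite whenever at least one finite magnitude (here the mainshock's) is available.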
&lt;br /&gt;
See the modified code that we now use on the Central SCEC Server for page generation: https://github.com/abhatthal/opensha-fork/tree/bugfix/comcat-report-page-gen/nonfinite-min-data-mag&lt;br /&gt;
This code will be merged into the OpenSHA codebase via a PR after review. Once merged, a new JAR will need to be built and deployed to the Central SCEC Server.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucapia-comcatdataplotter-fix.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 6. Add NaN checks in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
After deploying the new JAR, we updated the Docker image invoked in &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt; from &amp;lt;code&amp;gt;openjdk:8&amp;lt;/code&amp;gt; to &amp;lt;code&amp;gt;eclipse-temurin:11-jdk&amp;lt;/code&amp;gt;.&lt;br /&gt;
This executes our updated &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; successfully.&lt;br /&gt;
&lt;br /&gt;
The generated event page for Yucaipa is available at https://central.scec.org/earthquakes/eventpage/ci41113519.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
* [[Publishing_UCERF3-ETAS_Event_Reports]]&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30739</id>
		<title>SCEC Event Page Troubleshooting</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30739"/>
		<updated>2026-04-02T20:57:33Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page outlines all bugs encountered in the process of generating SCEC Event Pages. Each bug should have a header with a succinct summary and the date it was first encountered. Bugs that have been resolved will have &amp;quot;'''(Resolved)'''&amp;quot; appended to their headers.&lt;br /&gt;
&lt;br /&gt;
== Apr 2 2026 - Failure to generate &amp;quot;M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)&amp;quot; '''(Resolved)'''==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
&lt;br /&gt;
[[Image:error-generating-m4.6-boulder-creek-20260402.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Boulder Creek event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The bug was identified in the ComcatReportPageGen tool. Merging pager images fails due to mismatched image widths. Updating the bounds resolves the bug. (See [https://github.com/opensha/opensha/pull/223 #223] for details)&lt;br /&gt;
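The underlying issue is a common one when compositing images: the merged canvas must be sized to the bounds of both inputs, or a width mismatch clips or overflows one of the panels. Below is a minimal sketch of width-safe merging with plain java.awt; it is illustrative only, not the actual ComcatReportPageGen code.

```java
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class MergeImagesDemo {
    /**
     * Stack two images vertically, sizing the canvas to the wider
     * of the two so mismatched widths cannot clip either panel.
     * (Illustrative sketch; not the actual ComcatReportPageGen code.)
     */
    static BufferedImage mergeVertical(BufferedImage top, BufferedImage bottom) {
        int width = Math.max(top.getWidth(), bottom.getWidth());
        int height = top.getHeight() + bottom.getHeight();
        BufferedImage merged = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = merged.createGraphics();
        g.drawImage(top, 0, 0, null);
        g.drawImage(bottom, 0, top.getHeight(), null);
        g.dispose();
        return merged;
    }

    public static void main(String[] args) {
        // Two pager-style panels with mismatched widths (500 px vs 480 px).
        BufferedImage a = new BufferedImage(500, 200, BufferedImage.TYPE_INT_RGB);
        BufferedImage b = new BufferedImage(480, 300, BufferedImage.TYPE_INT_RGB);
        BufferedImage merged = mergeVertical(a, b);
        System.out.println(merged.getWidth() + "x" + merged.getHeight()); // 500x500
    }
}
```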
&lt;br /&gt;
After testing locally with the Docker image eclipse-temurin:17-jdk and the OpenSHA fat JAR built from opensha/opensha master at commit c2e5c3da8fa74d9cc46b73d9f0d6d9671b34285c, we are able to generate the page successfully. The fat JAR in the opensha-event-reports repository is updated accordingly, and the changes are pulled on the server for production.&lt;br /&gt;
&lt;br /&gt;
TODO: Test if the prod deployment generates the event page successfully.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Nov 4 2025 - Failure to generate &amp;quot;M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
The error is ambiguous: “error generating page” and simply “error” when using the “Enter USGS ID to Generate Page” and “Recent Earthquakes” Generate buttons. (See Fig 1.)&lt;br /&gt;
The bug is only encountered with this specific event: we are able to generate an event page for the 10/23/2025 Woodside event, but not for the 10/23/2025 Yucaipa event.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-error-msg.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Yucaipa event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Chrome Developer Tools allow for analysis of the JSON error message. (See Fig. 2)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-err-json-response.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 2. Resolve ambiguous error message with Chrome Developer Tools'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
A match for the error detail (See Fig 3.) is found in the &amp;lt;code&amp;gt;earthquake_event_page.php&amp;lt;/code&amp;gt; script at function &amp;lt;code&amp;gt;scec_earthquake_event_page_generate($event_id)&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-eq-event-page-php-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 3. Error detail match found in code'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This error is returned due to a non-zero exit code from the execution of &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt;. Upon analysis of the Docker logs, we see the failure occurs in the Docker execution of &amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt;. (See Fig. 4)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-update-eq-event-report-sh-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 4. update_eq_event_report.sh invokes Docker Java 8 for generate_report.sh'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; invokes the Java OpenSHA &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; CLT with appropriate parameters.&lt;br /&gt;
&lt;br /&gt;
See logs below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
Array(&lt;br /&gt;
    [0] =&amp;gt; cd /data/scectmp&lt;br /&gt;
    [1] =&amp;gt; run docker&lt;br /&gt;
    [2] =&amp;gt; https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ci41113519.geojson&lt;br /&gt;
    [3] =&amp;gt; Count of events received = 1&lt;br /&gt;
    [4] =&amp;gt; WC 1994 Radius: 0.08038963&lt;br /&gt;
    [5] =&amp;gt; Reverting to min radius of 10.0&lt;br /&gt;
    [6] =&amp;gt; Mainshock is a M3.08&lt;br /&gt;
    [7] =&amp;gt;         Hypocenter: 34.11083, -116.98350, 13.42000&lt;br /&gt;
    [8] =&amp;gt; Place name: 10 km NNE of Yucaipa, CA&lt;br /&gt;
    [9] =&amp;gt; Fetching 3.0 days of foreshocks&lt;br /&gt;
    [10] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-10-24T03:12:37.610Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-21T03:12:37.610Z&lt;br /&gt;
    [11] =&amp;gt; Count of events received = 0&lt;br /&gt;
    [12] =&amp;gt; Count of events after filtering = 0&lt;br /&gt;
    [13] =&amp;gt; Total number of events returned = 0&lt;br /&gt;
    [14] =&amp;gt; Found 0 foreshocks, maxMag=-Infinity&lt;br /&gt;
    [15] =&amp;gt; Fetching aftershocks&lt;br /&gt;
    [16] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-11-05T17:03:42.497Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-24T03:12:37.610Z&lt;br /&gt;
    [17] =&amp;gt; Count of events received = 4&lt;br /&gt;
    [18] =&amp;gt; Count of events after filtering = 2&lt;br /&gt;
    [19] =&amp;gt; Events filtered due to conversion = 0, location = 1, id = 1&lt;br /&gt;
    [20] =&amp;gt; Total number of events returned = 2&lt;br /&gt;
    [21] =&amp;gt; Found 2 aftershocks, maxMag=1.2&lt;br /&gt;
    [22] =&amp;gt; Output dir: /reports/ci41113519&lt;br /&gt;
    [23] =&amp;gt; URL: https://earthquake.usgs.gov/earthquakes/eventpage/ci41113519&lt;br /&gt;
    [24] =&amp;gt; Shakemap image: https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg&lt;br /&gt;
    [25] =&amp;gt; DYFI image: https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg&lt;br /&gt;
    [26] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg to /reports/ci41113519/resources/ci41113519_shakemap.jpg&lt;br /&gt;
    [27] =&amp;gt; DONE&lt;br /&gt;
    [28] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg to /reports/ci41113519/resources/ci41113519_dyfi.jpg&lt;br /&gt;
    [29] =&amp;gt; DONE&lt;br /&gt;
    [30] =&amp;gt; Loading FM from cached file: FM3_1.xml&lt;br /&gt;
    [31] =&amp;gt; Loading FM from cached file: FM3_2.xml&lt;br /&gt;
    [32] =&amp;gt; java.lang.IllegalStateException: Min data mag is non-finite: NaN&lt;br /&gt;
    [33] =&amp;gt;     at com.google.common.base.Preconditions.checkState(Preconditions.java:588)&lt;br /&gt;
    [34] =&amp;gt;  at org.opensha.commons.data.comcat.plot.ComcatDataPlotter.plotMagTimeFunc(ComcatDataPlotter.java:873)&lt;br /&gt;
    [35] =&amp;gt;    at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.generateReport(ComcatReportPageGen.java:384)&lt;br /&gt;
    [36] =&amp;gt;   at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.main(ComcatReportPageGen.java:1418)&lt;br /&gt;
    [37] =&amp;gt; quitting with error)&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here is a snippet of &amp;lt;code&amp;gt;ComcatDataPlotter&amp;lt;/code&amp;gt;, where the error is thrown. (See Fig. 5)&lt;br /&gt;
&lt;br /&gt;
[[Image:Comcatdataplotter-fail-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 5. Failed Precondition in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The check fails because minDataMag resolves to NaN: it is initialized to positive infinity and only lowered by mainshock.getMag(), foreshocksFunc.getMinY(), and aftershocksFunc.getMinY(), so our first hypothesis was that none of these produced a value.&lt;br /&gt;
&lt;br /&gt;
For this specific event, that would mean we failed to retrieve any foreshocks, aftershocks, or even the mainshock. Evidently there was a mainshock, so this would have to be either an error in our code parsing the GeoJSON or a malformed GeoJSON response.&lt;br /&gt;
&lt;br /&gt;
The logs, however, show the mainshock was retrieved successfully. No foreshocks were found in the 3 days before the mainshock, and 2 aftershocks were observed between the date of the mainshock (2025-10-24) and the date of the attempted event page generation (2025-11-05).&lt;br /&gt;
&lt;br /&gt;
Clearly we have this data, so minDataMag should be finite.&lt;br /&gt;
&lt;br /&gt;
After debugging ComcatReportPageGen locally, I identified the root cause.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
aftershocksFunc.getMinY() returns NaN, and the minimum of NaN and positive infinity is NaN. Later, even though mainshock.getMag() evaluates to 3.08, Math.min(NaN, 3.08) also evaluates to NaN. We shouldn’t see NaN summary statistics from aftershocksFunc in the first place, but adding checks for non-finite values allows pages to build.&lt;br /&gt;
&lt;br /&gt;
See the modified code that we now use on the Central SCEC Server for page generation: https://github.com/abhatthal/opensha-fork/tree/bugfix/comcat-report-page-gen/nonfinite-min-data-mag&lt;br /&gt;
This code will be merged into the OpenSHA codebase via a PR after review. Once merged, a new JAR will need to be built and deployed to the Central SCEC Server.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucapia-comcatdataplotter-fix.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 6. Add NaN checks in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
After deploying the new JAR, we updated the Docker image invoked in &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt; from &amp;lt;code&amp;gt;openjdk:8&amp;lt;/code&amp;gt; to &amp;lt;code&amp;gt;eclipse-temurin:11-jdk&amp;lt;/code&amp;gt;.&lt;br /&gt;
This executes our updated &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; successfully.&lt;br /&gt;
&lt;br /&gt;
The generated event page for Yucaipa is available at https://central.scec.org/earthquakes/eventpage/ci41113519.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
* [[Publishing_UCERF3-ETAS_Event_Reports]]&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30738</id>
		<title>SCEC Event Page Troubleshooting</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30738"/>
		<updated>2026-04-02T20:54:45Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Create entry for bug report on apr 2 2026&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page outlines all bugs encountered in the process of generating SCEC Event Pages. Each bug should have a header with a succinct summary and the date it was first encountered. Bugs that have been resolved will have &amp;quot;'''(Resolved)'''&amp;quot; appended to their headers.&lt;br /&gt;
&lt;br /&gt;
== Apr 2 2026 - Failure to generate &amp;quot;M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)&amp;quot; '''(Resolved)'''==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M4.6 - 1 km SE of Boulder Creek, CA 04/02/2026 (nc75337442)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
&lt;br /&gt;
[[Image:error-generating-m4.6-boulder-creek-20260402.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Boulder Creek event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The bug was identified in the ComcatReportPageGen tool. Merging pager images fails due to mismatched image widths. Updating the bounds resolves the bug. (See [https://github.com/opensha/opensha/pull/223 #223] for details)&lt;br /&gt;
&lt;br /&gt;
TODO: After testing locally with the Docker image eclipse-temurin:17-jdk and the OpenSHA fat JAR built from master at commit c2e5c3da8fa74d9cc46b73d9f0d6d9671b34285c, we are able to generate the page successfully. The fat JAR in the opensha-event-reports repository is updated accordingly, and the changes are pulled on the server for production.&lt;br /&gt;
&lt;br /&gt;
TODO: Test if the prod deployment generates the event page successfully.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Nov 4 2025 - Failure to generate &amp;quot;M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
The error is ambiguous: “error generating page” and simply “error” when using the “Enter USGS ID to Generate Page” and “Recent Earthquakes” Generate buttons. (See Fig 1.)&lt;br /&gt;
The bug is only encountered with this specific event: we are able to generate an event page for the 10/23/2025 Woodside event, but not for the 10/23/2025 Yucaipa event.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-error-msg.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Yucaipa event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Chrome Developer Tools allow for analysis of the JSON error message. (See Fig. 2)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-err-json-response.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 2. Resolve ambiguous error message with Chrome Developer Tools'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
A match for the error detail (See Fig 3.) is found in the &amp;lt;code&amp;gt;earthquake_event_page.php&amp;lt;/code&amp;gt; script at function &amp;lt;code&amp;gt;scec_earthquake_event_page_generate($event_id)&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-eq-event-page-php-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 3. Error detail match found in code'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This error is returned due to a non-zero exit code from the execution of &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt;. Upon analysis of the Docker logs, we see the failure occurs in the Docker execution of &amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt;. (See Fig. 4)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-update-eq-event-report-sh-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 4. update_eq_event_report.sh invokes Docker Java 8 for generate_report.sh'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; invokes the Java OpenSHA &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; CLT with appropriate parameters.&lt;br /&gt;
&lt;br /&gt;
See logs below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
Array(&lt;br /&gt;
    [0] =&amp;gt; cd /data/scectmp&lt;br /&gt;
    [1] =&amp;gt; run docker&lt;br /&gt;
    [2] =&amp;gt; https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ci41113519.geojson&lt;br /&gt;
    [3] =&amp;gt; Count of events received = 1&lt;br /&gt;
    [4] =&amp;gt; WC 1994 Radius: 0.08038963&lt;br /&gt;
    [5] =&amp;gt; Reverting to min radius of 10.0&lt;br /&gt;
    [6] =&amp;gt; Mainshock is a M3.08&lt;br /&gt;
    [7] =&amp;gt;         Hypocenter: 34.11083, -116.98350, 13.42000&lt;br /&gt;
    [8] =&amp;gt; Place name: 10 km NNE of Yucaipa, CA&lt;br /&gt;
    [9] =&amp;gt; Fetching 3.0 days of foreshocks&lt;br /&gt;
    [10] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-10-24T03:12:37.610Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-21T03:12:37.610Z&lt;br /&gt;
    [11] =&amp;gt; Count of events received = 0&lt;br /&gt;
    [12] =&amp;gt; Count of events after filtering = 0&lt;br /&gt;
    [13] =&amp;gt; Total number of events returned = 0&lt;br /&gt;
    [14] =&amp;gt; Found 0 foreshocks, maxMag=-Infinity&lt;br /&gt;
    [15] =&amp;gt; Fetching aftershocks&lt;br /&gt;
    [16] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-11-05T17:03:42.497Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-24T03:12:37.610Z&lt;br /&gt;
    [17] =&amp;gt; Count of events received = 4&lt;br /&gt;
    [18] =&amp;gt; Count of events after filtering = 2&lt;br /&gt;
    [19] =&amp;gt; Events filtered due to conversion = 0, location = 1, id = 1&lt;br /&gt;
    [20] =&amp;gt; Total number of events returned = 2&lt;br /&gt;
    [21] =&amp;gt; Found 2 aftershocks, maxMag=1.2&lt;br /&gt;
    [22] =&amp;gt; Output dir: /reports/ci41113519&lt;br /&gt;
    [23] =&amp;gt; URL: https://earthquake.usgs.gov/earthquakes/eventpage/ci41113519&lt;br /&gt;
    [24] =&amp;gt; Shakemap image: https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg&lt;br /&gt;
    [25] =&amp;gt; DYFI image: https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg&lt;br /&gt;
    [26] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg to /reports/ci41113519/resources/ci41113519_shakemap.jpg&lt;br /&gt;
    [27] =&amp;gt; DONE&lt;br /&gt;
    [28] =&amp;gt; Downloading https:// earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg to /reports/ci41113519/resources/ci41113519_dyfi.jpg&lt;br /&gt;
    [29] =&amp;gt; DONE&lt;br /&gt;
    [30] =&amp;gt; Loading FM from cached file: FM3_1.xml&lt;br /&gt;
    [31] =&amp;gt; Loading FM from cached file: FM3_2.xml&lt;br /&gt;
    [32] =&amp;gt; java.lang.IllegalStateException: Min data mag is non-finite: NaN&lt;br /&gt;
    [33] =&amp;gt;     at com.google.common.base.Preconditions.checkState(Preconditions.java:588)&lt;br /&gt;
    [34] =&amp;gt;  at org.opensha.commons.data.comcat.plot.ComcatDataPlotter.plotMagTimeFunc(ComcatDataPlotter.java:873)&lt;br /&gt;
    [35] =&amp;gt;    at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.generateReport(ComcatReportPageGen.java:384)&lt;br /&gt;
    [36] =&amp;gt;   at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.main(ComcatReportPageGen.java:1418)&lt;br /&gt;
    [37] =&amp;gt; quitting with error)&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here is a snippet of &amp;lt;code&amp;gt;ComcatDataPlotter&amp;lt;/code&amp;gt;, where the error is thrown. (See Fig. 5)&lt;br /&gt;
&lt;br /&gt;
[[Image:Comcatdataplotter-fail-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 5. Failed Precondition in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
The check fails because minDataMag resolves to NaN: it is initialized to positive infinity and only lowered by mainshock.getMag(), foreshocksFunc.getMinY(), and aftershocksFunc.getMinY(), so our first hypothesis was that none of these produced a value.&lt;br /&gt;
&lt;br /&gt;
For this specific event, that would mean we failed to retrieve any foreshocks, aftershocks, or even the mainshock. Evidently there was a mainshock, so this would have to be either an error in our code parsing the GeoJSON or a malformed GeoJSON response.&lt;br /&gt;
&lt;br /&gt;
The logs, however, show the mainshock was retrieved successfully. No foreshocks were found in the 3 days before the mainshock, and 2 aftershocks were observed between the date of the mainshock (2025-10-24) and the date of the attempted event page generation (2025-11-05).&lt;br /&gt;
&lt;br /&gt;
Clearly we have this data, so minDataMag should be finite.&lt;br /&gt;
&lt;br /&gt;
After debugging ComcatReportPageGen locally, I identified the root cause.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
aftershocksFunc.getMinY() returns NaN, and the minimum of NaN and positive infinity is NaN. Later, even though mainshock.getMag() evaluates to 3.08, Math.min(NaN, 3.08) also evaluates to NaN. We shouldn’t see NaN summary statistics from aftershocksFunc in the first place, but adding checks for non-finite values allows pages to build.&lt;br /&gt;
&lt;br /&gt;
See the modified code that we now use on the Central SCEC Server for page generation: https://github.com/abhatthal/opensha-fork/tree/bugfix/comcat-report-page-gen/nonfinite-min-data-mag&lt;br /&gt;
This code will be merged into the OpenSHA codebase via a PR after review. Once merged, a new JAR will need to be built and deployed to the Central SCEC Server.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucapia-comcatdataplotter-fix.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 6. Add NaN checks in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
After deploying the new JAR, we updated the Docker image invoked in &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt; from &amp;lt;code&amp;gt;openjdk:8&amp;lt;/code&amp;gt; to &amp;lt;code&amp;gt;eclipse-temurin:11-jdk&amp;lt;/code&amp;gt;.&lt;br /&gt;
This executes our updated &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; successfully.&lt;br /&gt;
&lt;br /&gt;
The generated event page for Yucaipa is available at https://central.scec.org/earthquakes/eventpage/ci41113519.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
* [[Publishing_UCERF3-ETAS_Event_Reports]]&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Error-generating-m4.6-boulder-creek-20260402.png&amp;diff=30737</id>
		<title>File:Error-generating-m4.6-boulder-creek-20260402.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Error-generating-m4.6-boulder-creek-20260402.png&amp;diff=30737"/>
		<updated>2026-04-02T20:53:37Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Bhatthal_Projects_and_Presentations&amp;diff=30645</id>
		<title>Bhatthal Projects and Presentations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Bhatthal_Projects_and_Presentations&amp;diff=30645"/>
		<updated>2026-03-04T10:18:22Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Update with recent accomplishments&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;Below are links to all projects, presentations, and resources given by Akash Bhatthal.&lt;br /&gt;
&lt;br /&gt;
Projects typically each have their own SCECpedia entry with an outline, problem summary, and links to relevant resources.&lt;br /&gt;
&lt;br /&gt;
== 2026 ==&lt;br /&gt;
=== Mar 2026 ===&lt;br /&gt;
* Released [https://github.com/opensha/opensha/releases/tag/v26.1.0 OpenSHA v26.1], introducing the IM Event Set Calculator&lt;br /&gt;
&lt;br /&gt;
=== Jan 2026 ===&lt;br /&gt;
* [https://docs.google.com/presentation/d/1CPc14b8SNJgJYZFgmVhAK-s2V2h9nSCG/edit?usp=sharing&amp;amp;ouid=112829585059646673816&amp;amp;rtpof=true&amp;amp;sd=true Q1 Roadmap PPTX]&lt;br /&gt;
* Outlined development pain points, user feedback, and aspirations for OpenSHA and UCERF3-ETAS&lt;br /&gt;
** [https://docs.google.com/presentation/d/1iD5LKYrEO6MiqYKj2WfhiGCQBqX2WjcJX-3mjjgI6-8/edit?usp=sharing SCEC Seismic Hazard Portfolio -- OpenSHA/UCERF3-ETAS Presentation Notes]&lt;br /&gt;
** [https://docs.google.com/presentation/d/1RRmNbAouzOdswGtMKkLk3uQSdHoS-Moh19GgWu4qEcQ/edit?usp=sharing SCEC Seismic Hazard Portfolio]&lt;br /&gt;
* Migrated the OpenSHA Legacy Server to AWS&lt;br /&gt;
** Systems design and cost analysis: [https://docs.google.com/document/d/1AV_nIxIrX-bpm1kHVs3u6ejfpSsKfhwZ5Zyxe14RHZw/edit?usp=sharing Legacy OpenSHA Server Migration]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== 2025 ==&lt;br /&gt;
=== Dec 2025 ===&lt;br /&gt;
* Began OpenSHA v26.1 [[Beta Testing]]&lt;br /&gt;
&lt;br /&gt;
=== Nov 2025 ===&lt;br /&gt;
* Resolved a production failure for the SCEC Event Page Generator: [https://strike.scec.org/scecpedia/SCEC_Event_Page_Troubleshooting#Nov_4_2025_-_Failure_to_generate_%22M3.1_-_10km_NNE_of_Yucaipa,_CA_10/23/2025_(ci41113519)%22_(Resolved) Nov 4 2025 - Failure to generate Yucaipa M3.1]&lt;br /&gt;
* Documented all [https://docs.google.com/document/d/1B0x5CQ0Mi4m4DaalTYQ3N8897VlKblpxKIByWMlnT10/edit?usp=sharing OpenSHA Repositories]&lt;br /&gt;
* Created a new UCERF3-ETAS Mail List at SCEC-UCERF3-ETAS-L@MAILLIST.USC.EDU&lt;br /&gt;
&lt;br /&gt;
=== Oct 2025 ===&lt;br /&gt;
* Investigated impact of [[Preferred Rupture Directivity in Hazard Curve Computations]]&lt;br /&gt;
* Created the ComparisonCurvePlotter CLT, an OpenSHA-CyberShake tool used to compare multiple hazard curves for the Preferred Rupture Directivity study&lt;br /&gt;
&lt;br /&gt;
=== Sep 2025 ===&lt;br /&gt;
* Designed a new system for SCEC Event Pages generated on Quakeworx detailed in the [https://docs.google.com/document/d/1u-SceQPbC8DFcwLa_MMGO3Y8RLLyLNLyq7PvJ_N8-CI/edit?usp=sharing SEP Development Plan]&lt;br /&gt;
&lt;br /&gt;
=== Earlier 2025 ===&lt;br /&gt;
* SCEC 2025 Annual Meeting: [https://central.scec.org/meetings/2025/am/poster/302 New Seismic Hazard Research Capabilities and Software Improvements in OpenSHA v25.4] ([[File:2025_SCEC_OpenSHA_Poster.pdf]])&lt;br /&gt;
* Enabled [[Preferred Rupture Directivity in Hazard Curve Computations]] in HazardCurvePlotter CLT&lt;br /&gt;
* Released [https://opensha.org/ OpenSHA v25.4], the first major software release in over 4 years&lt;br /&gt;
* Documented the workflow for [[Publishing UCERF3-ETAS Event Reports]] on SCEC Event Pages&lt;br /&gt;
* Presented the [https://drive.google.com/drive/folders/1SvFy9rGyJSIZz0vj_jwRkmV0GqpFYoP2 UCERF3-ETAS] tutorial application exercise at the 2025 Quakeworx Workshop and the March 2025 Staff Meeting&lt;br /&gt;
* Wrote and deployed [[PdfGen]] wrapper and installation scripts for PDF plot results on Quakeworx&lt;br /&gt;
* Created [[Software Development Practices]] with the first entry [https://docs.google.com/document/d/1DulfQKxiJpZuzjzLViQ7u0tuhrQkBzcQgV3_-TIoE6U/edit?usp=sharing How to Contribute Code: A practical overview of development workflows using Git]&lt;br /&gt;
&lt;br /&gt;
== 2024 ==&lt;br /&gt;
* Developed [[OpenSHA-Jupyter]] for portable and interactive demonstrations of OpenSHA code&lt;br /&gt;
* Created the [[GetFile]] framework for versioned and validated OpenSHA file downloads&lt;br /&gt;
* Supported the latest [[SCEC VDO]] release across all platforms&lt;br /&gt;
* Collected [[UCERF3-ETAS Measurements]] on HPC systems and documented the methodology&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30644</id>
		<title>OpenSHA Beta Testing</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30644"/>
		<updated>2026-03-04T09:12:59Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page provides experimental versions of OpenSHA applications for beta testing.&lt;br /&gt;
Stable production versions of OpenSHA are released on the [https://github.com/opensha/opensha/releases GitHub releases page].&lt;br /&gt;
&lt;br /&gt;
== OpenSHA January 2026 Release ==&lt;br /&gt;
Unlike previous releases, there are two variants for this OpenSHA beta release.&lt;br /&gt;
This release coincides with the OpenSHA server migration. Detailed instructions for beta testing, release notes, and tutorials for new applications are available in the README found in beta archives. See the Downloads table below.&lt;br /&gt;
&lt;br /&gt;
== OpenSHA April 2025 Release ==&lt;br /&gt;
&lt;br /&gt;
=== Resources ===&lt;br /&gt;
Please refer to the following resources for the release schedule and instructions on how to engage with OpenSHA software and provide feedback.&lt;br /&gt;
&lt;br /&gt;
* [https://docs.google.com/document/d/1BzhcXrnzUxfNVMZQLn4OPWod2a4NgeYKSpZVu_OWuIU/edit?usp=sharing Release Overview]&lt;br /&gt;
* [https://docs.google.com/document/d/1pZtNHaXr89pV0oGl32ewUA4-bQjqT8MR27ALZPydrmA/edit?usp=sharing Beta Testing Instructions]&lt;br /&gt;
* [https://forms.gle/4qNv46kSfeV6nDuy7 Feedback Form]&lt;br /&gt;
* [https://docs.google.com/presentation/d/1niS67--kQXX53BL3eJsOaJzaVglOPHsl/edit?usp=sharing Introduction to OpenSHA (2018)]&lt;br /&gt;
&lt;br /&gt;
== Downloads ==&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:auto&amp;quot;&lt;br /&gt;
|+&lt;br /&gt;
|-&lt;br /&gt;
! URL !! Size !! MD5 !! Release Date&lt;br /&gt;
|-&lt;br /&gt;
| '''(latest)''' [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-26.1.1-beta.zip opensha/release/opensha-26.1.1-beta.zip] || 1633 MB || MD5: 55c08b0180281097f0c3ee2525399001 || Released January 7, 2026&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-26.1.0-beta.zip opensha/release/opensha-26.1.0-beta.zip] || 1619 MB || MD5: 6d71504ceed68a066346d3fbc7c469ef || Released December 17, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.1-beta.zip opensha/release/opensha-25.4.1-beta.zip] || 459 MB || MD5: 4b4a66a24bd04553bd438c3bf7fc3253 || Released March 25, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.0-beta.zip opensha/release/opensha-25.4.0-beta.zip] || 536 MB || MD5: 941ac69c08fed8b5966a788c955d1f78 || Released March 17, 2025&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30611</id>
		<title>OpenSHA Beta Testing</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30611"/>
		<updated>2026-01-07T17:57:09Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Downloads */ Add beta release v26.1.1&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page provides experimental versions of OpenSHA applications for beta testing.&lt;br /&gt;
Stable production versions of OpenSHA are released on the [https://github.com/opensha/opensha/releases GitHub releases page].&lt;br /&gt;
&lt;br /&gt;
== OpenSHA January 2026 Release ==&lt;br /&gt;
Unlike previous releases, there are two variants for this OpenSHA beta release.&lt;br /&gt;
This release coincides with the OpenSHA server migration. Detailed instructions for beta testing, release notes, and tutorials for new applications are available in the README found in beta archives. See the Downloads table below.&lt;br /&gt;
&lt;br /&gt;
== OpenSHA April 2025 Release ==&lt;br /&gt;
&lt;br /&gt;
=== Resources ===&lt;br /&gt;
Please refer to the following resources for the release schedule and instructions on how to engage with OpenSHA software and provide feedback.&lt;br /&gt;
&lt;br /&gt;
* [https://docs.google.com/document/d/1BzhcXrnzUxfNVMZQLn4OPWod2a4NgeYKSpZVu_OWuIU/edit?usp=sharing Release Overview]&lt;br /&gt;
* [https://docs.google.com/document/d/1pZtNHaXr89pV0oGl32ewUA4-bQjqT8MR27ALZPydrmA/edit?usp=sharing Beta Testing Instructions]&lt;br /&gt;
* [https://forms.gle/4qNv46kSfeV6nDuy7 Feedback Form]&lt;br /&gt;
* [https://docs.google.com/presentation/d/1niS67--kQXX53BL3eJsOaJzaVglOPHsl/edit?usp=sharing Introduction to OpenSHA (2018)]&lt;br /&gt;
&lt;br /&gt;
=== Downloads ===&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:auto&amp;quot;&lt;br /&gt;
|+&lt;br /&gt;
|-&lt;br /&gt;
! URL !! Size !! MD5 !! Release Date&lt;br /&gt;
|-&lt;br /&gt;
| '''(latest)''' [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-26.1.1-beta.zip opensha/release/opensha-26.1.1-beta.zip] || 1633 MB || MD5: 55c08b0180281097f0c3ee2525399001 || Released January 7, 2026&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-26.1.0-beta.zip opensha/release/opensha-26.1.0-beta.zip] || 1619 MB || MD5: 6d71504ceed68a066346d3fbc7c469ef || Released December 17, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.1-beta.zip opensha/release/opensha-25.4.1-beta.zip] || 459 MB || MD5: 4b4a66a24bd04553bd438c3bf7fc3253 || Released March 25, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.0-beta.zip opensha/release/opensha-25.4.0-beta.zip] || 536 MB || MD5: 941ac69c08fed8b5966a788c955d1f78 || Released March 17, 2025&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=UCERF3-ETAS_Measurements&amp;diff=30610</id>
		<title>UCERF3-ETAS Measurements</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=UCERF3-ETAS_Measurements&amp;diff=30610"/>
		<updated>2026-01-05T16:45:16Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Docker */ Fix bug in Docker instructions&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page summarizes the performance study of UCERF3-ETAS run locally in Docker and on [https://www.carc.usc.edu/user-guides/hpc-systems/discovery CARC Discovery], [https://www.sdsc.edu/services/hpc/expanse/ SDSC Expanse], [https://tacc.utexas.edu/systems/frontera/ TACC Frontera], and [https://tacc.utexas.edu/systems/stampede3/ TACC Stampede3]. This study allows for evaluation of resource requirements in single-node and multiple-node simulations of the [https://earthquake.usgs.gov/earthquakes/eventpage/ci38457511/executive Ridgecrest M7.1 ETAS forecast (ci38457511)].&lt;br /&gt;
&lt;br /&gt;
== Performance Results ==&lt;br /&gt;
In the tables below, &amp;quot;Core Hours&amp;quot; reflect the ACCESS SUs used and are computed by dividing the runtime in minutes by 60, then multiplying by the number of nodes used and by the number of CPU cores available on each node. Because we are charged for the full node regardless of CPU utilization, this metric doesn't reflect how many cores within a node are actually used. The &amp;quot;Cores / Node&amp;quot; column is the average number of CPU cores available per node, derived by running &amp;lt;code&amp;gt;scontrol show node -a &amp;lt;node_list&amp;gt;&amp;lt;/code&amp;gt; over the list of nodes allocated to each job. &amp;quot;RAM / Node&amp;quot; is the RAM made available to ETAS via either ETAS_MEM_GB or MEM_GIGS, not the total RAM on a given node.&lt;br /&gt;
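As a concrete check of the &amp;quot;Core Hours&amp;quot; formula, this minimal sketch recomputes the Docker 10,000-catalog row from the table below; the input values are taken directly from that row.&lt;br /&gt;

```python
# Recompute "Core Hours" for the Docker 10,000-catalog row:
# (runtime in minutes / 60) x nodes x cores per node.
runtime_min = 286.3
nodes = 1
cores_per_node = 14

core_hours = runtime_min / 60 * nodes * cores_per_node
print(round(core_hours))  # 67, matching the table
```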
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:auto&amp;quot;&lt;br /&gt;
|+ Docker Measurements&lt;br /&gt;
|-&lt;br /&gt;
! Number of Nodes !! Cores / Node !! RAM / Node (GB) !! Number of Catalogs !! Runtime (min) !! Core Hours&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 14 || 75 || 10 || 1.3 || 0.30&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 14 || 75 || 100 || 4.4 || 1.0&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 14 || 75 || 1000 || 25.8 || 6.0&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 14 || 75 || 10,000 || 286.3 || 67&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Dockerized local runs with a resource allocation of 14 CPUs (1 thread per CPU), 96GB RAM, 1GB Swap, 64GB Disk.&lt;br /&gt;
u3etas_launcher uses 80% of the available RAM (ETAS_MEM_GB=75).&lt;br /&gt;
&lt;br /&gt;
Single-node measurements below were collected using ETAS_MEM_GB=32. Multi-node measurements automatically override the default value.&lt;br /&gt;
&lt;br /&gt;
Discovery is a heterogeneous system: not all nodes within the same partition have the same number of cores available. The &amp;quot;Cores / Node&amp;quot; column is therefore calculated as the average over the nodes assigned to each job.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:auto&amp;quot;&lt;br /&gt;
|+ Discovery Measurements&lt;br /&gt;
|-&lt;br /&gt;
! Number of Nodes !! Cores / Node !! RAM / Node (GB) !! Threads / Node !! Total Threads !! RAM / Thread !! Scratch Enabled !! Number of Catalogs !! Runtime (min) !! Core Hours&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 24 || 32 || 30 || 30 || 1.1 || Y || 10 || 1.7 || 0.68&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 24 || 32 || 30 || 30 || 1.1 || Y || 100 || 157.2 || 63&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 20 || 32 || 30 || 30 || 1.1 || Y || 1000 || 201.4 || 67&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 24 || 32 || 30 || 30 || 1.1 || Y || 10,000 || 424.8 || 170&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 20 || 50 || 10 || 140 || 5 || Y || 10 || 2.9 || 14&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 46.86 (8x 64, 6x 24) || 50 || 10 || 140 || 5 || Y || 100 || 2.8 || 31&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 55.43 (11x 64, 3x 24) || 50 || 10 || 140 || 5 || Y || 1000 || 3.6 || 47&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 52.57 (10x 64, 4x 24) || 50 || 10 || 140 || 5 || Y || 10,000 || 17.2 || 210&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 20 || 50 || 10 || 140 || 5 || Y || 100,000 || 228.1 || 1064&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 64 || 50 || 10 || 320 || 5 || Y || 10 || 0.78 || 27&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 64 || 50 || 10 || 320 || 5 || Y || 100 || 1.2 || 41&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 60.25 (29x 64, 3x 24) || 50 || 10 || 320 || 5 || Y || 1000 || 3.9 || 125&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 50.75 (22x 64, 4x 24, 6x 20) || 50 || 10 || 320 || 5 || Y || 10,000 || 10.5 || 284&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 27.50 (4x 64, 16x 24, 12x 20) || 50 || 10 || 320 || 5 || Y || 100,000 || 99.2 || 1455&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:auto&amp;quot;&lt;br /&gt;
|+ Expanse Measurements&lt;br /&gt;
|-&lt;br /&gt;
! Number of Nodes !! Cores / Node !! RAM / Node (GB) !! Threads / Node !! Total Threads !! RAM / Thread !! Scratch Enabled !! Number of Catalogs !! Runtime (min) !! Core Hours&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 32 || 10 || 10 || 3.2 ||  || 10 || 2.9 || 6.2&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 32 || 10 || 10 || 3.2 ||  || 100 || 10.4 || 22&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 32 || 10 || 10 || 3.2 ||  || 1000 || 22.6 || 48&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 32 || 10 || 10 || 3.2 ||  || 10,000 || 207.7 || 443&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 220 || 44 || 44 || 5 || Y || 10 || 1.0 || 2.1&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 220 || 44 || 44 || 5 || Y || 100 || 2.4 || 5.1&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 220 || 44 || 44 || 5 || Y || 1000 || 14.1 || 30&lt;br /&gt;
|-&lt;br /&gt;
| 1 || 128 || 200 || 40 || 40 || 5 || Y || 10,000 || 67.1 || 143&lt;br /&gt;
|-&lt;br /&gt;
| 9 || 128 || 200 || 40 || 360 || 5 || Y || 100,000 || 133.9 || 2571&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 50 || 10 || 140 || 5 ||  || 10 || 1.8 || 54&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 50 || 10 || 140 || 5 ||  || 100 || 2.1 || 63&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 50 || 10 || 140 || 5 ||  || 1000 || 5.4 || 161&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 50 || 10 || 140 || 5 ||  || 10,000 || 18.9 || 564&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 50 || 10 || 140 || 5 ||  || 100,000 || 162.4 || 4850&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 200 || 40 || 560 || 5 || Y || 10 || 1.7 || 51&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 200 || 40 || 560 || 5 || Y || 100 || 2.2 || 66&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 200 || 40 || 560 || 5 || Y || 1000 || 4.1 || 122&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 200 || 40 || 560 || 5 || Y || 10,000 || 15.3 || 457&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 200 || 40 || 560 || 5 || Y || 100,000 || 90.5 || 2703&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 200 || 25 || 350 || 8 || Y || 100,000 || 86.1 || 2572&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 128 || 224 || 14 || 196 || 16 || Y || 10,000 || 15.6 || 467&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 50 || 10 || 320 || 5 ||  || 10 || 2.1 || 143&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 50 || 10 || 320 || 5 ||  || 100 || 2.3 || 157&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 50 || 10 || 320 || 5 ||  || 1000 || 3.4 || 232&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 50 || 10 || 320 || 5 ||  || 10,000 || 11.3 || 771&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 50 || 10 || 320 || 5 ||  || 100,000 || 74.8 || 5106&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 200 || 40 || 1280 || 5 || Y || 10 || 2.0 || 137&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 200 || 40 || 1280 || 5 || Y || 100 || 2.7 || 184&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 200 || 40 || 1280 || 5 || Y || 1000 || 2.7 || 184&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 200 || 40 || 1280 || 5 || Y || 10,000 || 8.8 || 601&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 200 || 40 || 1280 || 5 || Y || 100,000 || 41.8 || 2854&lt;br /&gt;
|-&lt;br /&gt;
| 32 || 128 || 224 || 14 || 196 || 16 || Y || 100,000 || 122.2 || 8342&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:auto&amp;quot;&lt;br /&gt;
|+ Frontera Measurements&lt;br /&gt;
|-&lt;br /&gt;
! Number of Nodes !! Cores / Node !! RAM / Node (GB) !! Threads / Node !! Total Threads !! RAM / Thread !! Scratch Enabled !! Number of Catalogs !! Runtime (min) !! Core Hours&lt;br /&gt;
|-&lt;br /&gt;
| 2 || 56 || 160 || 20 || 40 || 8 || Y || 10 || 2.1 || 3.9&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 56 || 160 || 20 || 280 || 8 || Y || 10,000 || 13.2 || 172&lt;br /&gt;
|-&lt;br /&gt;
| 14 || 56 || 160 || 20 || 280 || 8 || Y || 100,000 || 103.8 || 1356&lt;br /&gt;
|-&lt;br /&gt;
| 18 || 56 || 160 || 20 || 360 || 8 || Y || 100,000 || 81.2 || 1364&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:auto&amp;quot;&lt;br /&gt;
|+ Stampede3 Measurements&lt;br /&gt;
|-&lt;br /&gt;
! Number of Nodes !! Queue !! Cores / Node !! RAM / Node (GB) !! Threads / Node !! Total Threads !! RAM / Thread !! Scratch Enabled !! Number of Catalogs !! Runtime (min) !! Core Hours !! Node Hours !! Charge Rate (SU/Node Hour) !! Charge (SU)&lt;br /&gt;
|-&lt;br /&gt;
| 14 || ICX || 80 || 200 || 25 || 350 || 8 || Y || 10,000 || 10.6 || 198 || 2.48 || 1.5 || 3.7&lt;br /&gt;
|-&lt;br /&gt;
| 14 || ICX || 80 || 200 || 25 || 350 || 8 || Y || 100,000 || 83.5 || 1559 || 19.5 || 1.5 || 29.3&lt;br /&gt;
|-&lt;br /&gt;
| 14 || SKX || 48 || 144 || 18 || 252 || 8 || Y || 10,000 || 13.9 || 156 || 3.25 || 1 || 3.3&lt;br /&gt;
|-&lt;br /&gt;
| 14 || SKX || 48 || 144 || 18 || 252 || 8 || Y || 100,000 || 111.1 || 1244 || 25.9 || 1 || 25.9&lt;br /&gt;
|-&lt;br /&gt;
| 14 || SPR || 112 || 104 || 13 || 182 || 8 || Y || 10,000 || 18.9 || 494 || 4.41 || 2 || 8.8&lt;br /&gt;
|-&lt;br /&gt;
| 14 || SPR || 112 || 104 || 13 || 182 || 8 || Y || 100,000 || 164.5 || 4299 || 38.4 || 2 || 76.8&lt;br /&gt;
|-&lt;br /&gt;
| 20 || SKX || 48 || 144 || 18 || 360 || 8 || Y || 100,000 || 78.7 || 1259 || 26.2 || 1 || 26.2&lt;br /&gt;
|-&lt;br /&gt;
| 27 || SPR || 112 || 104 || 13 || 351 || 8 || Y || 100,000 || 87 || 4385 || 39.2 || 2 || 78.4&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
Stampede3 SUs billed = (# nodes) x (job duration in wall clock hours) x (charge rate per node-hour)&lt;br /&gt;
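The SU formula above can be checked against a table row; this sketch uses the Stampede3 ICX 100,000-catalog measurement (14 nodes, 80 cores/node, 83.5 min, 1.5 SU per node-hour).&lt;br /&gt;

```python
# Worked example of the Stampede3 SU formula:
# SUs = nodes x wall-clock hours x charge rate per node-hour.
nodes = 14
cores_per_node = 80
runtime_min = 83.5
charge_rate = 1.5  # SU per node-hour (ICX queue)

node_hours = nodes * runtime_min / 60
core_hours = node_hours * cores_per_node
charge = node_hours * charge_rate

print(round(node_hours, 1))  # 19.5, matching the table
print(round(core_hours))     # 1559, matching the table
# The table charges 29.3 SU, i.e. the rounded 19.5 node-hours x 1.5.
print(charge)
```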
&lt;br /&gt;
== Installation and Configuration ==&lt;br /&gt;
Running ETAS simulations on OpenSHA is simplified through a collection of launcher binaries and scripts called [https://github.com/opensha/ucerf3-etas-launcher ucerf3-etas-launcher]. The process of installation and configuration varies across systems; however, the foundations remain the same. Running simulations always proceeds in three phases.&lt;br /&gt;
# Building configuration files for a specified event, where we configure MPI nodes and number of simulations&lt;br /&gt;
# Launching the simulation with the configuration files&lt;br /&gt;
# Consolidating and plotting simulation data&lt;br /&gt;
&lt;br /&gt;
=== Docker ===&lt;br /&gt;
When running UCERF3-ETAS simulations locally, Docker provides a consistent environment, removes the need to manage dependencies, and makes it easy to provision resources. Download the Docker image for the M7.1 Ridgecrest mainshock by running &amp;lt;code&amp;gt;docker pull sceccode/ucerf3_jup&amp;lt;/code&amp;gt; or by searching for &amp;quot;ucerf3_jup&amp;quot; in Docker Desktop. I prefer to use Docker Desktop, but the command line is sufficient.&lt;br /&gt;
&lt;br /&gt;
Under Docker Desktop settings, I allocated 14 CPUs, 96GB of RAM, 1GB of Swap, and 64GB of disk storage to the Docker environment.&lt;br /&gt;
Open a terminal on your system with the Docker CLI installed and run &amp;lt;code&amp;gt;docker run -d -p 8888:8888 --name ucerf3-etas sceccode/ucerf3_jup&amp;lt;/code&amp;gt;. Note that options such as &amp;lt;code&amp;gt;--name&amp;lt;/code&amp;gt; must come before the image name; arguments after the image are passed to the container as its command. This starts a detached container that forwards port 8888 for the Jupyter Notebook server.&lt;br /&gt;
From here, you can navigate to the Jupyter Notebook web application at http://localhost:8888 to access an interactive terminal for the container. Alternatively, you can run the container directly in Docker Desktop and navigate to the &amp;quot;Exec&amp;quot; tab to access the terminal without needing Jupyter Notebook or port-forwarding.&lt;br /&gt;
&lt;br /&gt;
Once inside your container, use the following workflow to run local simulations and plot data, where $NUM_SIM is the number of simulations desired.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
u3etas_comcat_event_config_builder.sh --event-id ci38457511 --mag-complete 3.5 --radius 25 --num-simulations $NUM_SIM --days-before 7 --max-point-src-mag 6 --finite-surf-shakemap --finite-surf-shakemap-min-mag 4.5 --nodes 1 --hours 24 --output-dir target/docker-comcat-ridgecrest-m7.1-n1-s$NUM_SIM&lt;br /&gt;
&lt;br /&gt;
u3etas_launcher.sh $HOME/target/docker-comcat-ridgecrest-m7.1-n1-s${NUM_SIM}/config.json | tee target/docker-comcat-ridgecrest-m7.1-n1-s${NUM_SIM}/u3etas_launcher.log&lt;br /&gt;
&lt;br /&gt;
u3etas_plot_generator.sh $HOME/target/docker-comcat-ridgecrest-m7.1-n1-s${NUM_SIM}/config.json&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You'll notice that we didn't specify an &amp;lt;code&amp;gt;hpc-site&amp;lt;/code&amp;gt; parameter. Because these simulations run locally rather than on a High Performance Computing system, there is no need to define a site or generate a corresponding Slurm file. Instead of passing a Slurm file to sbatch, we execute the launcher directly with the generated &amp;lt;code&amp;gt;config.json&amp;lt;/code&amp;gt; file. I also pipe output through the tee command to capture it for logging purposes.&lt;br /&gt;
&lt;br /&gt;
In Docker Desktop, you can navigate to the &amp;quot;Volumes&amp;quot; tab to find the stored data for the containers and download them onto your host system.&lt;br /&gt;
&lt;br /&gt;
=== Discovery ===&lt;br /&gt;
Establish a Discovery SSH or [https://ondemand.carc.usc.edu/ CARC OnDemand] connection and clone the ucerf3-etas-launcher GitHub repository at the path &amp;lt;code&amp;gt;/project/scec_608/$USER/ucerf3/ucerf3-etas-launcher&amp;lt;/code&amp;gt;, where $USER is your username.&lt;br /&gt;
&lt;br /&gt;
Edit the bashrc file at &amp;lt;code&amp;gt;$HOME/.bashrc&amp;lt;/code&amp;gt; to update the PATH to include the downloaded ETAS scripts and load HPC modules necessary to run ETAS in a multiple-node environment.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# .bashrc&lt;br /&gt;
&lt;br /&gt;
# Source global definitions&lt;br /&gt;
if [ -f /etc/bashrc ]; then&lt;br /&gt;
        . /etc/bashrc&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
# Switch groups, but only if necessary&lt;br /&gt;
if [[ `id -gn` != &amp;quot;scec_608&amp;quot; &amp;amp;&amp;amp; $- =~ i ]]&lt;br /&gt;
then&lt;br /&gt;
#    echo &amp;quot;switching group&amp;quot;&lt;br /&gt;
    newgrp scec_608&lt;br /&gt;
    exit&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
PATH=$PATH:$HOME/.local/bin:$HOME/bin&lt;br /&gt;
export TERM=linux&lt;br /&gt;
&lt;br /&gt;
## MODULES&lt;br /&gt;
module load usc # this is loaded by default on login nodes, but not on compute nodes, so we need to add 'usc' here so that the subsequent modules will work&lt;br /&gt;
module load gcc/11.3&lt;br /&gt;
module load openjdk&lt;br /&gt;
module load git&lt;br /&gt;
module load vim&lt;br /&gt;
# every once in a while CARC breaks java, and we need this to avoid unsatisfied link errors&lt;br /&gt;
# if you get errors related to libawt_xawt.so: libXext.so.6 or similar, uncomment the following&lt;br /&gt;
# previously encountered and then went away, but came back after the May 2024 maintenance window&lt;br /&gt;
module load libxtst # no clue why we suddenly needed this to avoid a weird JVM unsatisfied link exception&lt;br /&gt;
&lt;br /&gt;
# compute nodes don't have unzip...&lt;br /&gt;
which unzip &amp;gt; /dev/null 2&amp;gt; /dev/null&lt;br /&gt;
if [[ $? -ne 0 ]];then&lt;br /&gt;
        module load unzip&lt;br /&gt;
        module load bzip2&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
## https://github.com/opensha/ucerf3-etas-launcher/tree/master/parallel/README.md&lt;br /&gt;
export PROJFS=/project/scec_608/$USER&lt;br /&gt;
export ETAS_LAUNCHER=$PROJFS/ucerf3/ucerf3-etas-launcher&lt;br /&gt;
export ETAS_SIM_DIR=$PROJFS/ucerf3/etas_sim&lt;br /&gt;
export ETAS_MEM_GB=5 # this will be overridden in batch scripts for parallel jobs, set low enough so that the regular U3ETAS scripts can run on the login node to configure jobs&lt;br /&gt;
export MPJ_HOME=/project/scec_608/kmilner/mpj/mpj-current&lt;br /&gt;
export PATH=$ETAS_LAUNCHER/parallel/slurm_sbin/:$ETAS_LAUNCHER/sbin/:$MPJ_HOME/bin:$PATH&lt;br /&gt;
&lt;br /&gt;
if [[ `hostname` == e19-* ]];then&lt;br /&gt;
        # on a compute node in the SCEC queue&lt;br /&gt;
        export OPENSHA_MEM_GB=50&lt;br /&gt;
        export FST_HAZARD_SPACING=0.2&lt;br /&gt;
        export OPENSHA_JAR_DISABLE_UPDATE=1&lt;br /&gt;
elif [[ -n &amp;quot;$SLURM_JOB_ID&amp;quot; ]];then&lt;br /&gt;
        # on a compute node otherwise&lt;br /&gt;
        export OPENSHA_JAR_DISABLE_UPDATE=1&lt;br /&gt;
        unset OPENSHA_MEM_GB&lt;br /&gt;
else&lt;br /&gt;
        export OPENSHA_MEM_GB=10&lt;br /&gt;
fi&lt;br /&gt;
export OPENSHA_FST=/project/scec_608/kmilner/git/opensha-fault-sys-tools&lt;br /&gt;
export OPENSHA_FS_GIT_BRANCH=master&lt;br /&gt;
export PATH=$PATH:$OPENSHA_FST/sbin&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You'll notice that the bashrc contains references to user &amp;quot;kmilner&amp;quot;; do not change these. The files at those paths are readable by other users and are necessary for running ETAS. There are future plans to migrate much of this configuration out of the user bashrc and into an MPJ Express wrapper script, to improve portability and simplify setup.&lt;br /&gt;
&lt;br /&gt;
After editing the bashrc file, either login and logout or run &amp;lt;code&amp;gt;source ~/.bashrc&amp;lt;/code&amp;gt; to load the new changes.&lt;br /&gt;
&lt;br /&gt;
The launcher scripts let you build configuration files on an interactive compute node directly on Discovery, rather than building them locally and transferring them over SCP/SFTP. Configuration files are built this way because non-trivial jobs cannot be executed on the head node. Request an interactive node now by running &amp;lt;code&amp;gt;slurm_interactive.sh&amp;lt;/code&amp;gt;. Once resources are provisioned, build the configuration inside the interactive compute node with &lt;br /&gt;
&amp;lt;pre&amp;gt;cd $ETAS_SIM_DIR &amp;amp;&amp;amp; u3etas_comcat_event_config_builder.sh --event-id ci38457511 --mag-complete 3.5 --radius 25 --num-simulations $NUM_SIM --days-before 7 --max-point-src-mag 6 --finite-surf-shakemap --finite-surf-shakemap-min-mag 4.5 --hpc-site USC_CARC --nodes $NUM_NODE --hours 24 --output-dir $ETAS_SIM_DIR/discovery-comcat-ridgecrest-m7.1-n${NUM_NODE}-s$NUM_SIM&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
where $NUM_SIM is the number of simulations to run and $NUM_NODE is the number of nodes to utilize.&lt;br /&gt;
&lt;br /&gt;
The generated configuration requires a bit of manual tweaking prior to execution. Navigate to the generated simulation directory. You'll notice that, unlike the localized Docker runs earlier, we now have a Slurm file.&lt;br /&gt;
&lt;br /&gt;
[[Slurm]] files invoke the launcher script with the config JSON file through an MPJ wrapper. MPJ, or [http://mpjexpress.org/ Message Passing in Java], is utilized to enable the parallel distribution of work across the HPC compute nodes. The Slurm file also specifies the number of nodes and other parameters relevant to work distribution.&lt;br /&gt;
In your simulation folder you should see the files &amp;quot;etas_sim_mpj.slurm&amp;quot;, &amp;quot;plot_results.slurm&amp;quot;, and &amp;quot;opensha-all.jar&amp;quot;. If the jar file failed to copy, copy it manually from &amp;lt;code&amp;gt;${ETAS_LAUNCHER}/opensha/opensha-all.jar&amp;lt;/code&amp;gt;.&lt;br /&gt;
Make the following changes to etas_sim_mpj.slurm:&lt;br /&gt;
* Rename partition from scec -&amp;gt; main: &amp;lt;code&amp;gt;#SBATCH -p main&amp;lt;/code&amp;gt;&lt;br /&gt;
* Ensure the ETAS JSON config path is prefixed by simulation directory: &amp;lt;code&amp;gt;ETAS_CONF_JSON=&amp;quot;${ETAS_SIM_DIR}/...&amp;lt;/code&amp;gt;&lt;br /&gt;
* Update scratch directory from scratch2 -&amp;gt; scratch1: &amp;lt;code&amp;gt; SCRATCH_OPTION=&amp;quot;--scratch-dir /scratch1/$USER/etas_scratch&amp;quot;&amp;lt;/code&amp;gt;&lt;br /&gt;
* If this simulation is on a single-node, don't invoke the MPJ wrapper:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
date&lt;br /&gt;
echo &amp;quot;RUNNING ETAS-LAUNCHER&amp;quot;&lt;br /&gt;
u3etas_launcher.sh --threads $THREADS $ETAS_CONF_JSON&lt;br /&gt;
ret=$?&lt;br /&gt;
date&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
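The partition and scratch-directory edits above are simple substitutions and can be scripted; this is a hedged sketch using plain string replacement, with a two-line stand-in for the real generated etas_sim_mpj.slurm (whose exact lines may differ, so verify the result before submitting).&lt;br /&gt;

```python
# Demo stand-in for the generated etas_sim_mpj.slurm; on Discovery you
# would read and write the actual file instead.
slurm = "\n".join([
    "#SBATCH -p scec",
    'SCRATCH_OPTION="--scratch-dir /scratch2/$USER/etas_scratch"',
])

# Rename the partition from scec to main, and scratch2 to scratch1.
slurm = slurm.replace("#SBATCH -p scec", "#SBATCH -p main")
slurm = slurm.replace("/scratch2/", "/scratch1/")
print(slurm)
```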
&lt;br /&gt;
Additionally, inside the config.json file, prefix the &amp;quot;outputDir&amp;quot; value with &amp;quot;${ETAS_SIM_DIR}/&amp;quot; to prevent the creation of a duplicate output folder.&lt;br /&gt;
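That edit can also be scripted. The sketch below assumes the launcher expects an absolute path rather than a literal ${ETAS_SIM_DIR} variable, so it expands the environment variable; check the generated file to confirm which form is wanted. The directory name is illustrative.&lt;br /&gt;

```python
import json
import os

# Demo config standing in for the generated config.json.
config = {"outputDir": "discovery-comcat-ridgecrest-m7.1-n2-s1000"}

# Prefix outputDir with the simulation directory (fallback path is an example).
sim_dir = os.environ.get("ETAS_SIM_DIR", "/project/scec_608/me/ucerf3/etas_sim")
if not config["outputDir"].startswith(sim_dir):
    config["outputDir"] = os.path.join(sim_dir, config["outputDir"])

print(json.dumps(config, indent=2))
```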
&lt;br /&gt;
After making the necessary changes, place the ETAS simulation on the job queue by running &amp;lt;code&amp;gt;slurm_submit.sh etas_sim_mpj.slurm&amp;lt;/code&amp;gt;. You can rename the Slurm file prior to submission to set the job name, which makes jobs easier to manage. Stdout and stderr are written to the files {JOB}.o{ID} and {JOB}.e{ID}, respectively. Runtime is derived from the timestamps in the output file. Results are written to either a results/ directory or the binary &amp;quot;results_complete.bin&amp;quot;.&lt;br /&gt;
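Since the Slurm script prints &amp;lt;code&amp;gt;date&amp;lt;/code&amp;gt; before and after the launcher runs, the runtime can be computed from the first and last timestamps in the {JOB}.o{ID} file; a minimal sketch, assuming the default &amp;lt;code&amp;gt;date&amp;lt;/code&amp;gt; output format in UTC (the timestamps below are demo values, not real measurements).&lt;br /&gt;

```python
from datetime import datetime

# First and last `date` lines taken from a job output file (demo values).
start = "Mon Jan  5 16:45:16 UTC 2026"
end = "Mon Jan  5 18:02:46 UTC 2026"

# Parse the default `date` format and report elapsed minutes.
fmt = "%a %b %d %H:%M:%S %Z %Y"
runtime_min = (datetime.strptime(end, fmt)
               - datetime.strptime(start, fmt)).total_seconds() / 60
print(runtime_min)  # 77.5
```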
&lt;br /&gt;
Generate plots with &amp;quot;plot_results.slurm&amp;quot;. As before, update the partition name from &amp;quot;scec&amp;quot; to &amp;quot;main&amp;quot; and submit the job with slurm_submit.sh. View the final plots in the generated &amp;quot;index.html&amp;quot;. If you do not have a graphical session, you may need to download the simulation folder to view plots locally.&lt;br /&gt;
&lt;br /&gt;
=== Expanse ===&lt;br /&gt;
The Expanse configuration takes into consideration the [https://www.sdsc.edu/support/user_guides/expanse.html Expanse User Guide] and the existing Quakeworx Dev Configuration.&lt;br /&gt;
&lt;br /&gt;
Before establishing an SSH connection to Expanse, verify your project allocation: in the Expanse Portal, go to OnDemand -&amp;gt; Allocation and Usage Information and look for the resource “Expanse”.&lt;br /&gt;
If it is not present, file a troubleshooting ticket at support.access-ci.org.&lt;br /&gt;
&lt;br /&gt;
Unlike on Discovery, we are going to set up our own MPJ Express installation and configure an MPJ Express Wrapper. A similar process will be rolled out to Discovery in the future.&lt;br /&gt;
# Clone MPJ Express to &amp;lt;code&amp;gt;/expanse/lustre/projects/usc143/$USER/mpj-express&amp;lt;/code&amp;gt;: &amp;lt;code&amp;gt;$ git clone https://github.com/kevinmilner/mpj-express.git&amp;lt;/code&amp;gt;&lt;br /&gt;
# Set Wrapper path in &amp;lt;code&amp;gt;mpj-express/conf/mpjexpress.conf&amp;lt;/code&amp;gt;: &amp;lt;code&amp;gt;mpjexpress.ssh.wrapper=/expanse/lustre/projects/usc143/$USER/ucerf3/ucerf3-etas-env-wrapper.sh&amp;lt;/code&amp;gt;. You may want to explicitly write your username in the path here instead of using $USER.&lt;br /&gt;
# Create the MPJ Wrapper file at the specified path as follows:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
module load cpu/0.15.4&lt;br /&gt;
module load openjdk/11.0.2&lt;br /&gt;
&lt;br /&gt;
export PROJFS=/expanse/lustre/projects/usc143/$USER&lt;br /&gt;
export ETAS_LAUNCHER=$PROJFS/ucerf3/ucerf3-etas-launcher&lt;br /&gt;
export ETAS_SIM_DIR=$PROJFS/ucerf3/u3etas_sim&lt;br /&gt;
export MPJ_HOME=$PROJFS/mpj-express&lt;br /&gt;
export PATH=$ETAS_LAUNCHER/parallel/slurm_sbin:$ETAS_LAUNCHER/sbin/:$MPJ_HOME/bin:$PATH&lt;br /&gt;
&lt;br /&gt;
&amp;quot;$@&amp;quot;&lt;br /&gt;
exit $?&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Add the following to the bashrc:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
module load sdsc&lt;br /&gt;
module load cpu/0.15.4&lt;br /&gt;
module load openjdk/11.0.2&lt;br /&gt;
&lt;br /&gt;
# compute nodes don't have unzip...&lt;br /&gt;
which unzip &amp;gt; /dev/null 2&amp;gt; /dev/null&lt;br /&gt;
if [[ $? -ne 0 ]];then&lt;br /&gt;
        module load unzip&lt;br /&gt;
        module load bzip2&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
# https://github.com/opensha/ucerf3-etas-launcher/tree/master/parallel/README.md&lt;br /&gt;
export PROJFS=/expanse/lustre/projects/usc143/$USER&lt;br /&gt;
export ETAS_LAUNCHER=$PROJFS/ucerf3/ucerf3-etas-launcher&lt;br /&gt;
export ETAS_SIM_DIR=$PROJFS/ucerf3/u3etas_sim&lt;br /&gt;
export ETAS_MEM_GB=5 # this will be overridden in batch scripts for parallel jobs, set low enough so that the regular U3ETAS scripts can run on the login node to configure jobs&lt;br /&gt;
export MPJ_HOME=$PROJFS/mpj-express&lt;br /&gt;
export PATH=$ETAS_LAUNCHER/parallel/slurm_sbin/:$ETAS_LAUNCHER/sbin/:$MPJ_HOME/bin:$PATH&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
These bashrc changes are necessary because single-node simulations won't invoke the MPJ Express Wrapper.&lt;br /&gt;
&lt;br /&gt;
Connect to an interactive compute node:&lt;br /&gt;
&amp;lt;pre&amp;gt;srun --partition=debug  --pty --account=usc143 --nodes=1 --ntasks-per-node=4 --mem=16G -t 00:30:00 --wait=0 --export=ALL /bin/bash&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
and build the simulation configuration with &amp;lt;code&amp;gt;NUM_SIM&amp;lt;/code&amp;gt; catalogs and &amp;lt;code&amp;gt;NUM_NODE&amp;lt;/code&amp;gt; nodes.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
u3etas_comcat_event_config_builder.sh --event-id ci38457511 --mag-complete 3.5 --radius 25 --num-simulations $NUM_SIM --days-before 7 --max-point-src-mag 6 --finite-surf-shakemap --finite-surf-shakemap-min-mag 4.5 --hpc-site USC_CARC --nodes $NUM_NODE --hours 24 --output-dir $ETAS_SIM_DIR/expanse-comcat-ridgecrest-m7.1-n${NUM_NODE}-s$NUM_SIM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Inside the slurm config file, set the partition to &amp;quot;compute&amp;quot; instead of &amp;quot;scec&amp;quot;.&lt;br /&gt;
&amp;lt;pre&amp;gt;#SBATCH --partition compute&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
I can't confirm whether it's necessary to set MPJ_HOME in the Slurm configuration, since our wrapper already sets it, but I have set it here as well.&lt;br /&gt;
&lt;br /&gt;
For single-node runs, take care to explicitly set ETAS_MEM_GB to the desired memory, keeping it below MEM_GIGS.&lt;br /&gt;
Also consider the total RAM available per node, which is ~256 GB according to the Expanse User Guide.&lt;br /&gt;
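Rather than hard-coding a value, the memory setting can be sanity-checked against the node's actual RAM. This is an illustrative sketch, not part of the launcher; the 8 GB headroom figure is an assumption.

```shell
#!/bin/bash
# Sketch: pick ETAS_MEM_GB below the node's physical RAM, leaving
# headroom for the OS. The 8 GB headroom figure is an assumption.
total_gb=$(awk '/MemTotal/ {print int($2/1024/1024)}' /proc/meminfo)
headroom=8
export ETAS_MEM_GB=$(( total_gb > headroom ? total_gb - headroom : 1 ))
echo "node RAM: ${total_gb} GB -> ETAS_MEM_GB=${ETAS_MEM_GB}"
```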
&lt;br /&gt;
For single-node simulations, invoke u3etas_launcher directly, as previously done on Discovery.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Quakeworx Dev doesn't use a scratch file. Scratch files aren't necessary, but they may speed up I/O operations. Comment out the SCRATCH parameter in the slurm configuration.&lt;br /&gt;
&lt;br /&gt;
Set the account for your research project; in my case it's &amp;quot;usc143&amp;quot;.&lt;br /&gt;
&amp;lt;pre&amp;gt;#SBATCH --account=usc143&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Depending on your account quota, you may struggle to run 32-node simulations. In my case I used another project, &amp;quot;ddp408&amp;quot;, for these simulations.&lt;br /&gt;
&lt;br /&gt;
Check the projects available for the Expanse resource with &amp;lt;code&amp;gt;expanse-client user -r expanse&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Unlike on Discovery, we must set ntasks-per-node or ntasks.&lt;br /&gt;
&amp;lt;pre&amp;gt;#SBATCH --ntasks 20&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
I ran successfully with 20, although you can try a higher value. Too many tasks may result in a job quota failure.&lt;br /&gt;
&lt;br /&gt;
As Expanse has 128 cores available per node, and we are charged for the full node regardless of utilization, take care to set cores-per-node=128. I didn't do this for my runs, but we still record 128 cores in the Measurements table to accurately reflect the cost.&lt;br /&gt;
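Because the full 128-core node is charged regardless of utilization, a back-of-the-envelope cost estimate only needs the node count and wall-clock hours. A small illustrative sketch:

```shell
#!/bin/bash
# Sketch: estimate the core-hour (SU) charge for a full-node Expanse job.
# Expanse CPU nodes expose 128 cores and are billed in full regardless of use.
estimate_sus() {
  local nodes="$1" hours="$2" cores_per_node=128
  echo $(( nodes * cores_per_node * hours ))
}

estimate_sus 32 24   # a 32-node, 24-hour run
```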
&lt;br /&gt;
Job execution and data plotting instructions are identical to Discovery.&lt;br /&gt;
&lt;br /&gt;
=== Frontera ===&lt;br /&gt;
Before attempting to configure Frontera or Stampede3, note that, as TACC systems, they share the Stockyard filesystem.&lt;br /&gt;
The configurations for Frontera and Stampede3 are nearly identical, which means many files could be shared between $STOCKYARD/frontera and $STOCKYARD/stampede3. In fact, mpj-express, ucerf3-etas-launcher, and jdk-22 can be stored directly in $STOCKYARD and shared across systems. This wasn't done in our example, as I didn't realize it until later. u3etas_sim should still live under $WORK on each system for easier organization. On both Frontera and Stampede3, $WORK and $SCRATCH paths inside Stockyard are already set by default.&lt;br /&gt;
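One hedged way to realize this sharing is to keep a single copy of each tool under $STOCKYARD and symlink it from each system's $WORK; the helper below is an illustrative sketch, not a documented procedure.

```shell
#!/bin/bash
# Sketch: keep one shared install under $STOCKYARD and symlink it from
# each TACC system's $WORK. Paths here are illustrative placeholders.
share_tool() {
  local name="$1" stockyard="$2" work="$3"
  mkdir -p "$stockyard/$name" "$work"
  ln -sfn "$stockyard/$name" "$work/$name"
}

# e.g., run once on each of Frontera and Stampede3:
# share_tool mpj-express "$STOCKYARD" "$WORK"
# share_tool ucerf3-etas-launcher "$STOCKYARD" "$WORK"
```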
&lt;br /&gt;
First, configure the user bashrc with compute modules and paths for MPJ and ETAS.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
############&lt;br /&gt;
# SECTION 1&lt;br /&gt;
#&lt;br /&gt;
# There are three independent and safe ways to modify the standard&lt;br /&gt;
# module setup. Below are three ways from the simplest to hardest.&lt;br /&gt;
#   a) Use &amp;quot;module save&amp;quot;  (see &amp;quot;module help&amp;quot; for details).&lt;br /&gt;
#   b) Place module commands in ~/.modules&lt;br /&gt;
#   c) Place module commands in this file inside the if block below.&lt;br /&gt;
#&lt;br /&gt;
# Note that you should only do one of the above.  You do not want&lt;br /&gt;
# to override the inherited module environment by having module&lt;br /&gt;
# commands outside of the if block[3].&lt;br /&gt;
&lt;br /&gt;
if [ -z &amp;quot;$__BASHRC_SOURCED__&amp;quot; -a &amp;quot;$ENVIRONMENT&amp;quot; != BATCH ]; then&lt;br /&gt;
  export __BASHRC_SOURCED__=1&lt;br /&gt;
&lt;br /&gt;
  ##################################################################&lt;br /&gt;
  # **** PLACE MODULE COMMANDS HERE and ONLY HERE.              ****&lt;br /&gt;
  ##################################################################&lt;br /&gt;
&lt;br /&gt;
  module load gcc&lt;br /&gt;
  module load git&lt;br /&gt;
&lt;br /&gt;
  # compute nodes don't have unzip...&lt;br /&gt;
  which unzip &amp;gt; /dev/null 2&amp;gt; /dev/null&lt;br /&gt;
  if [[ $? -ne 0 ]]; then&lt;br /&gt;
    module load unzip&lt;br /&gt;
    module load bzip2&lt;br /&gt;
  fi&lt;br /&gt;
&lt;br /&gt;
fi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
############&lt;br /&gt;
# SECTION 2&lt;br /&gt;
#&lt;br /&gt;
# Please set or modify any environment variables inside the if block&lt;br /&gt;
# below.  For example, modifying PATH or other path like variables&lt;br /&gt;
# (e.g LD_LIBRARY_PATH), the guard variable (__PERSONAL_PATH___)&lt;br /&gt;
# prevents your PATH from having duplicate directories on sub-shells.&lt;br /&gt;
&lt;br /&gt;
if [ -z &amp;quot;$__PERSONAL_PATH__&amp;quot; ]; then&lt;br /&gt;
  export __PERSONAL_PATH__=1&lt;br /&gt;
&lt;br /&gt;
  ###################################################################&lt;br /&gt;
  # **** PLACE Environment Variables including PATH here.        ****&lt;br /&gt;
  ###################################################################&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
  export JAVA_HOME=$WORK/jdk-22.0.1&lt;br /&gt;
  export PATH=$HOME/bin:$JAVA_HOME/bin:$PATH&lt;br /&gt;
&lt;br /&gt;
  # https://github.com/opensha/ucerf3-etas-launcher/tree/master/parallel/README.md&lt;br /&gt;
  export ETAS_LAUNCHER=$WORK/ucerf3/ucerf3-etas-launcher&lt;br /&gt;
  export ETAS_SIM_DIR=$WORK/ucerf3/u3etas_sim&lt;br /&gt;
  export ETAS_MEM_GB=5 # this will be overridden in batch scripts for parallel jobs, set low enough so that the regular U3ETAS scripts can run on the login node to configure jobs&lt;br /&gt;
  export MPJ_HOME=$WORK/mpj-express&lt;br /&gt;
  export PATH=$ETAS_LAUNCHER/parallel/slurm_sbin/:$ETAS_LAUNCHER/sbin/:$MPJ_HOME/bin:$PATH&lt;br /&gt;
&lt;br /&gt;
fi&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
There is no OpenJDK module available on TACC systems, nor an existing MPJ Express installation.&lt;br /&gt;
&lt;br /&gt;
Install Java from the tarball at https://www.oracle.com/java/technologies/javase/jdk22-archive-downloads.html&lt;br /&gt;
and &amp;lt;code&amp;gt;tar -xzvf&amp;lt;/code&amp;gt; it into $WORK/jdk-22.0.1.&lt;br /&gt;
&lt;br /&gt;
Clone Kevin's fork of MPJ Express into $WORK/mpj-express:&lt;br /&gt;
&amp;lt;pre&amp;gt;git clone https://github.com/kevinmilner/mpj-express&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Unlike on Expanse, an MPJ Wrapper script was not necessary; just ensure you configure the slurm script and bashrc correctly.&lt;br /&gt;
&lt;br /&gt;
Connect to an interactive node by running &amp;lt;code&amp;gt;idev&amp;lt;/code&amp;gt;, which defaults to 30 minutes on the default queue. srun is also available on Frontera, but idev is preferred per the Frontera User Guide.&lt;br /&gt;
&amp;lt;pre&amp;gt;idev -A EAR20006 -p flex -N 1 -n 4 -m 30&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
From within an interactive node, build the configuration with the desired number of catalogs, &amp;lt;code&amp;gt;NUM_SIM&amp;lt;/code&amp;gt;, and nodes, &amp;lt;code&amp;gt;NUM_NODE&amp;lt;/code&amp;gt;.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ETAS_SIM_DIR &amp;amp;&amp;amp; u3etas_comcat_event_config_builder.sh --event-id ci38457511 --mag-complete 3.5 --radius 25 --num-simulations $NUM_SIM --days-before 7 --max-point-src-mag 6 --finite-surf-shakemap --finite-surf-shakemap-min-mag 4.5 --hpc-site TACC_FRONTERA --nodes $NUM_NODE --hours 24 --queue normal --output-dir $ETAS_SIM_DIR/frontera-comcat-ridgecrest-m7.1-n${NUM_NODE}-s$NUM_SIM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Navigate to the generated config's etas_sim_mpj.slurm and make the following changes. Note that this step will no longer be necessary once we update the default slurm script for this hpc-site in the ucerf3-etas-launcher code.&lt;br /&gt;
&lt;br /&gt;
Set the SBATCH directives:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#SBATCH -t 24:00:00&lt;br /&gt;
#SBATCH --nodes 14&lt;br /&gt;
#SBATCH --ntasks 14&lt;br /&gt;
#SBATCH --cpus-per-task=56&lt;br /&gt;
#SBATCH --partition normal&lt;br /&gt;
#SBATCH --mem 0&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Don't use FastMPJ; use MPJ Express, and use your own MPJ_HOME path, not mine.&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
# FMPJ_HOME (FastMPJ) is no longer used; set MPJ_HOME to your own path&lt;br /&gt;
#FMPJ_HOME=/home1/00950/kevinm/FastMPJ&lt;br /&gt;
MPJ_HOME=/work2/10177/bhatthal/frontera/mpj-express&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Update PATH to use MPJ_HOME instead of FMPJ_HOME&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
export PBS_NODEFILE=$NEW_NODEFILE&lt;br /&gt;
export PATH=$PATH:$MPJ_HOME/bin&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Add timers and call MPJ Express instead:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
t1=$(date +%s) # epoch start time in seconds&lt;br /&gt;
&lt;br /&gt;
date&lt;br /&gt;
echo &amp;quot;RUNNING MPJ&amp;quot;&lt;br /&gt;
mpjrun_errdetect_wrapper.sh $PBS_NODEFILE -dev hybdev -Djava.library.path=$MPJ_HOME/lib -Xmx${MEM_GIGS}G -cp $JAR_FILE scratch.UCERF3.erf.ETAS.launcher.MPJ_ETAS_Launcher --min-dispatch $MIN_DISPATCH --max-dispatch $MAX_DISPATCH --threads $THREADS $TEMP_OPTION $SCRATCH_OPTION $CLEAN_OPTION --end-time `scontrol show job $SLURM_JOB_ID | egrep --only-matching 'EndTime=[^ ]+' | cut -c 9-` $ETAS_CONF_JSON&lt;br /&gt;
ret=$?&lt;br /&gt;
date&lt;br /&gt;
&lt;br /&gt;
t2=$(date +%s) # epoch end time in seconds&lt;br /&gt;
numSec=$(echo $t2 - $t1 | bc -q ) # the number of seconds the process took.&lt;br /&gt;
runTime=$(date -ud @$numSec +%T) # Convert the seconds into Hours:Mins:Sec&lt;br /&gt;
echo &amp;quot;Time to build: $runTime ($numSec seconds)&amp;quot;&lt;br /&gt;
&lt;br /&gt;
exit $ret&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
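The --end-time value above comes from parsing scontrol output, and the extraction can be checked in isolation. A sketch on a canned line (&quot;EndTime=&quot; is 8 characters, so cut -c 9- keeps everything after the equals sign):

```shell
#!/bin/bash
# Sketch: how the --end-time argument is derived. A canned line stands
# in for real `scontrol show job` output.
line='RunTime=00:01:02 TimeLimit=24:00:00 EndTime=2026-04-09T20:00:00 Deadline=N/A'
end_time=$(grep -Eo 'EndTime=[^ ]+' <<< "$line" | cut -c 9-)
echo "$end_time"   # 2026-04-09T20:00:00
```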
&lt;br /&gt;
Make sure you're not on an interactive compute node. From a login node, execute &amp;lt;code&amp;gt;slurm_submit.sh etas_sim_mpj.slurm&amp;lt;/code&amp;gt;.&lt;br /&gt;
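Since Slurm sets SLURM_JOB_ID inside allocations, a small guard can catch accidental submission from an interactive compute node. This is a hedged convenience sketch, not part of slurm_submit.sh itself.

```shell
#!/bin/bash
# Sketch: refuse to submit if we appear to be inside a Slurm allocation.
# SLURM_JOB_ID is set on compute nodes, so its presence is a usable signal.
check_login_node() {
  if [ -n "${SLURM_JOB_ID:-}" ]; then
    echo "inside job ${SLURM_JOB_ID} -- exit to a login node first" >&2
    return 1
  fi
}

# Usage idea:
# check_login_node && slurm_submit.sh etas_sim_mpj.slurm
```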
&lt;br /&gt;
=== Stampede3 ===&lt;br /&gt;
In this example we run 14 nodes on Icelake (ICX), Skylake (SKX), and Sapphire Rapids (SPR). Refer to the Stampede3 User Guide for all queues.&lt;br /&gt;
&lt;br /&gt;
Just like on Frontera, set the user bashrc with modules and paths.&lt;br /&gt;
&lt;br /&gt;
Create an interactive session with &amp;lt;code&amp;gt;idev&amp;lt;/code&amp;gt;; in my case I was issued 1 node with 48 tasks per node on skx-dev (Skylake) using project DS-Sybershake.&lt;br /&gt;
&lt;br /&gt;
I noticed that the TACC_STAMPEDE3 enum constant is missing from ucerf3-etas-launcher, even though it is present in the OpenSHA repository: https://github.com/opensha/opensha/blob/9df7200b6ed8984b9024a67f81ad630da8278a92/src/main/java/scratch/UCERF3/erf/ETAS/launcher/util/ETAS_ConfigBuilder.java#L47&lt;br /&gt;
&lt;br /&gt;
This configuration uses FastMPJ anyway, so when we eventually transition to using my configuration in the OpenSHA repository, we'll update ucerf3-etas-launcher to bundle the latest opensha-all.jar with the TACC_STAMPEDE3 enum instead of the deprecated TACC_STAMPEDE2. We may also split USC_CARC into CARC_DISCOVERY and CARC_ENDEAVOUR, as their configurations differ. We'll use the TACC_FRONTERA configuration for now.&lt;br /&gt;
&lt;br /&gt;
ICX&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ETAS_SIM_DIR &amp;amp;&amp;amp; u3etas_comcat_event_config_builder.sh --event-id ci38457511 --mag-complete 3.5 --radius 25 --num-simulations $NUM_SIM --days-before 7 --max-point-src-mag 6 --finite-surf-shakemap --finite-surf-shakemap-min-mag 4.5 --hpc-site TACC_FRONTERA --nodes $NUM_NODE --hours 24 --queue normal --output-dir $ETAS_SIM_DIR/stampede3-icx-comcat-ridgecrest-m7.1-n${NUM_NODE}-s$NUM_SIM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
SPR&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ETAS_SIM_DIR &amp;amp;&amp;amp; u3etas_comcat_event_config_builder.sh --event-id ci38457511 --mag-complete 3.5 --radius 25 --num-simulations $NUM_SIM --days-before 7 --max-point-src-mag 6 --finite-surf-shakemap --finite-surf-shakemap-min-mag 4.5 --hpc-site TACC_FRONTERA --nodes $NUM_NODE --hours 24 --queue normal --output-dir $ETAS_SIM_DIR/stampede3-spr-comcat-ridgecrest-m7.1-n${NUM_NODE}-s$NUM_SIM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
SKX&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
cd $ETAS_SIM_DIR &amp;amp;&amp;amp; u3etas_comcat_event_config_builder.sh --event-id ci38457511 --mag-complete 3.5 --radius 25 --num-simulations $NUM_SIM --days-before 7 --max-point-src-mag 6 --finite-surf-shakemap --finite-surf-shakemap-min-mag 4.5 --hpc-site TACC_FRONTERA --nodes $NUM_NODE --hours 24 --queue normal --output-dir $ETAS_SIM_DIR/stampede3-skx-comcat-ridgecrest-m7.1-n${NUM_NODE}-s$NUM_SIM&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For plotting results, use skx, as it is the cheapest at 1 SU, and set -n 48.&lt;br /&gt;
In each config's etas_sim_mpj.slurm, set -p to the queue name: icx, spr, or skx respectively.&lt;br /&gt;
Take care to also set MEM_GIGS below the amount of RAM present on each system. RAM/node and CPU/node for each queue are listed under System Architecture in the [https://docs.tacc.utexas.edu/hpc/stampede3/ Stampede3 User Guide].&lt;br /&gt;
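The slurm template's rule of thumb, roughly one ETAS thread per 5 GB of MEM_GIGS and no more than the cores available, can be sketched as a helper. This is illustrative; the actual configs in this guide sometimes pick lower THREADS values than the rule suggests.

```shell
#!/bin/bash
# Sketch of the template's rule of thumb: roughly one ETAS thread per
# 5 GB of MEM_GIGS, capped at the cores the queue's nodes provide.
pick_threads() {
  local mem_gigs="$1" cores="$2"
  local t=$(( mem_gigs / 5 ))
  if [ "$t" -lt 1 ]; then t=1; fi
  if [ "$t" -gt "$cores" ]; then t="$cores"; fi
  echo "$t"
}

pick_threads 144 48   # MEM_GIGS=144 on a 48-core skx node
```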
&lt;br /&gt;
Here's an example slurm configuration for Skylake:&lt;br /&gt;
&amp;lt;pre&amp;gt;&lt;br /&gt;
#!/bin/bash&lt;br /&gt;
&lt;br /&gt;
#SBATCH -t 24:00:00&lt;br /&gt;
#SBATCH --nodes 14&lt;br /&gt;
#SBATCH --ntasks 14&lt;br /&gt;
#SBATCH --cpus-per-task=48&lt;br /&gt;
#SBATCH -p skx&lt;br /&gt;
#SBATCH --mem 0&lt;br /&gt;
&lt;br /&gt;
######################&lt;br /&gt;
## INPUT PARAMETERS ##&lt;br /&gt;
######################&lt;br /&gt;
&lt;br /&gt;
# the above '#SBATCH' lines are required, and are supposed to start with a '#'. They must be at the beginning of the file&lt;br /&gt;
# the '-t hh:mm:ss' argument is the wall clock time of the job&lt;br /&gt;
# the '--nodes 14' argument specifies the number of nodes required, in this case 14&lt;br /&gt;
# the '--ntasks 14' and '--cpus-per-task=48' arguments specify the task and core layout required by TACC. Request no more cores than each node provides (48 on skx)&lt;br /&gt;
# the '-p skx' argument specifies the queue, in this case the Skylake queue&lt;br /&gt;
&lt;br /&gt;
## ETAS PARAMETERS ##&lt;br /&gt;
&lt;br /&gt;
# path to the JSON configuration file&lt;br /&gt;
ETAS_CONF_JSON=&amp;quot;/work2/10177/bhatthal/stampede3/ucerf3/u3etas_sim/stampede3-skx-comcat-ridgecrest-m7.1-n14-s10000/config.json&amp;quot;&lt;br /&gt;
&lt;br /&gt;
## JAVA/MPJ PARAMETERS ##&lt;br /&gt;
&lt;br /&gt;
# maximum memory in gigabytes. should be close to, but not over, the total memory available&lt;br /&gt;
MEM_GIGS=144&lt;br /&gt;
&lt;br /&gt;
# number of etas threads. should be approximately MEM_GIGS/5, and no more than the total number of threads available&lt;br /&gt;
THREADS=18&lt;br /&gt;
&lt;br /&gt;
# FMPJ_HOME (FastMPJ) is no longer used; set MPJ_HOME to your own path&lt;br /&gt;
#FMPJ_HOME=/home1/00950/kevinm/FastMPJ&lt;br /&gt;
MPJ_HOME=/work2/10177/bhatthal/stampede3/mpj-express&lt;br /&gt;
&lt;br /&gt;
# path to the opensha-ucerf3 jar file&lt;br /&gt;
JAR_FILE=${ETAS_LAUNCHER}/opensha/opensha-all.jar&lt;br /&gt;
&lt;br /&gt;
# simulations are sent out in batches to each compute node. these parameters control the size of those batches&lt;br /&gt;
# smaller max size will allow for better checking of progress with watch_logparse.sh, but more wasted time at the end of batches waiting on a single calculation to finish&lt;br /&gt;
MIN_DISPATCH=$THREADS&lt;br /&gt;
MAX_DISPATCH=500&lt;br /&gt;
&lt;br /&gt;
# this allows for catalogs to be written locally on each compute node in a temporary directory, then only copied back onto shared storage after they complete. this reduces I/O load, but makes it harder to track progress of individual simulations. comment this out to disable this option&lt;br /&gt;
TEMP_OPTION=&amp;quot;--temp-dir /tmp/etas-results-tmp&amp;quot;&lt;br /&gt;
&lt;br /&gt;
# this allows for the results directory to be hosted on a different filesystem, in this case the $SCRATCH filesystem. this will prevent many files from being written to $WORK, as well as reducing I/O load&lt;br /&gt;
SCRATCH_OPTION=&amp;quot;--scratch-dir $SCRATCH/etas-results-tmp&amp;quot;&lt;br /&gt;
&lt;br /&gt;
# this automatically deletes subdirectories of the results directory once a catalog has been successfully written to the master binary file. comment out to disable&lt;br /&gt;
CLEAN_OPTION=&amp;quot;--clean&amp;quot;&lt;br /&gt;
&lt;br /&gt;
##########################&lt;br /&gt;
## END INPUT PARAMETERS ##&lt;br /&gt;
##   DO NOT EDIT BELOW  ##&lt;br /&gt;
##########################&lt;br /&gt;
&lt;br /&gt;
NEW_JAR=&amp;quot;`dirname ${ETAS_CONF_JSON}`/`basename $JAR_FILE`&amp;quot;&lt;br /&gt;
cp $JAR_FILE $NEW_JAR&lt;br /&gt;
if [[ -e $NEW_JAR ]];then&lt;br /&gt;
        JAR_FILE=$NEW_JAR&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
PBS_NODEFILE=&amp;quot;/tmp/${USER}-hostfile-${SLURM_JOBID}&amp;quot;&lt;br /&gt;
echo &amp;quot;creating PBS_NODEFILE: $PBS_NODEFILE&amp;quot;&lt;br /&gt;
scontrol show hostnames $SLURM_NODELIST &amp;gt; $PBS_NODEFILE&lt;br /&gt;
&lt;br /&gt;
NEW_NODEFILE=&amp;quot;/tmp/${USER}-hostfile-fmpj-${SLURM_JOBID}&amp;quot;&lt;br /&gt;
echo &amp;quot;creating PBS_NODEFILE: $NEW_NODEFILE&amp;quot;&lt;br /&gt;
hname=$(hostname)&lt;br /&gt;
if [ &amp;quot;$hname&amp;quot; == &amp;quot;&amp;quot; ]&lt;br /&gt;
then&lt;br /&gt;
  echo &amp;quot;Error getting hostname. Exiting&amp;quot;&lt;br /&gt;
  exit 1&lt;br /&gt;
else&lt;br /&gt;
  cat $PBS_NODEFILE | sort | uniq | fgrep -v $hname &amp;gt; $NEW_NODEFILE&lt;br /&gt;
fi&lt;br /&gt;
&lt;br /&gt;
export PBS_NODEFILE=$NEW_NODEFILE&lt;br /&gt;
export PATH=$PATH:$MPJ_HOME/bin&lt;br /&gt;
&lt;br /&gt;
JVM_MEM_MB=26624&lt;br /&gt;
&lt;br /&gt;
t1=$(date +%s) # epoch start time in seconds&lt;br /&gt;
&lt;br /&gt;
date&lt;br /&gt;
echo &amp;quot;RUNNING MPJ&amp;quot;&lt;br /&gt;
mpjrun_errdetect_wrapper.sh $PBS_NODEFILE -dev hybdev -Djava.library.path=$MPJ_HOME/lib -Xmx${MEM_GIGS}G -cp $JAR_FILE scratch.UCERF3.erf.ETAS.launcher.MPJ_ETAS_Launcher --min-dispatch $MIN_DISPATCH --max-dispatch $MAX_DISPATCH --threads $THREADS $TEMP_OPTION $SCRATCH_OPTION $CLEAN_OPTION --end-time `scontrol show job $SLURM_JOB_ID | egrep --only-matching 'EndTime=[^ ]+' | cut -c 9-` $ETAS_CONF_JSON&lt;br /&gt;
ret=$?&lt;br /&gt;
date&lt;br /&gt;
&lt;br /&gt;
t2=$(date +%s) # epoch end time in seconds&lt;br /&gt;
numSec=$(echo $t2 - $t1 | bc -q ) # the number of seconds the process took.&lt;br /&gt;
runTime=$(date -ud @$numSec +%T) # Convert the seconds into Hours:Mins:Sec&lt;br /&gt;
echo &amp;quot;Time to build: $runTime ($numSec seconds)&amp;quot;&lt;br /&gt;
&lt;br /&gt;
exit $ret&lt;br /&gt;
&amp;lt;/pre&amp;gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_VDO&amp;diff=30608</id>
		<title>SCEC VDO</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_VDO&amp;diff=30608"/>
		<updated>2025-12-19T10:08:51Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Update Globus URLs to reference CARC project2 instead of CARC project.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;= SCEC Virtual Display of Objects (SCEC VDO) =&lt;br /&gt;
&lt;br /&gt;
Researchers and interns at the Southern California Earthquake Center (SCEC) have built a seismic data visualization software tool called the SCEC Virtual Display of Objects (SCEC-VDO). Written in Java with the Swing GUI toolkit to create interactive menus and the Visualization Toolkit (VTK) to render 3D content, SCEC-VDO allows for the visualization of 3D earthquake and fault objects on maps and the creation of images and movies for analysis, presentation, and publication.&lt;br /&gt;
&lt;br /&gt;
[https://github.com/SCECcode/scec-vdo/ View source code on GitHub]&lt;br /&gt;
&lt;br /&gt;
== Releases ==&lt;br /&gt;
=== v24.11.0 ===&lt;br /&gt;
==== Info ====&lt;br /&gt;
* Precompile bytecode with Ant for improved start-time performance&lt;br /&gt;
* Transition to DMG, EXE, and tarballs for macOS, Windows, and Linux respectively&lt;br /&gt;
* macOS and Windows installers properly install an application for easier updates&lt;br /&gt;
* Linux tarball doesn't install to system itself, but now bundles dependencies&lt;br /&gt;
&lt;br /&gt;
==== Downloads ====&lt;br /&gt;
* Linux: [https://g-3a9041.a78b8.36fe.data.globus.org/scec-vdo/v24.11.0/SCEC-VDO-24.11.0.tar.gz SCEC-VDO-24.11.0.tar.gz (435 MB)] (MD5: 2a76884bed6b008ca06c46cd035e8414)&lt;br /&gt;
* macOS Intel: [https://g-3a9041.a78b8.36fe.data.globus.org/scec-vdo/v24.11.0/SCEC-VDO-24.11.0-Intel.dmg SCEC-VDO-24.11.0-Intel.dmg (501 MB)] (MD5: 91771caaa998c3f3cf1a439521a5bd5e)&lt;br /&gt;
* macOS AppleSilicon: [https://g-3a9041.a78b8.36fe.data.globus.org/scec-vdo/v24.11.0/SCEC-VDO-24.11.0-AppleSilicon.dmg SCEC-VDO-24.11.0-AppleSilicon.dmg (552 MB)] (MD5: 3321cd9cbb7327c3dcbcdba7a45ef890)&lt;br /&gt;
* Windows: [https://g-3a9041.a78b8.36fe.data.globus.org/scec-vdo/v24.11.0/SCEC-VDO-24.11.0.exe SCEC-VDO-24.11.0.exe (283 MB)] (MD5: d43c66a4e51182e04e70a12c103d32b2)&lt;br /&gt;
&lt;br /&gt;
=== v24.10.0 ===&lt;br /&gt;
==== Info ====&lt;br /&gt;
* Added support for Apple Silicon Macs&lt;br /&gt;
* Create Windows batch launcher&lt;br /&gt;
* Upgraded all platforms to VTK9.1&lt;br /&gt;
* Bundled OpenJDK 23+37, OpenJ9 0.47.0&lt;br /&gt;
==== Downloads ====&lt;br /&gt;
* Linux: [https://g-3a9041.a78b8.36fe.data.globus.org/scec-vdo/v24.10.0/scec-vdo-linux.zip v24.10.0/scec-vdo-linux.zip (532 MB)] (MD5: 03fb57593ae33d9b475799d0d153b6b2)&lt;br /&gt;
* macOS: [https://g-3a9041.a78b8.36fe.data.globus.org/scec-vdo/v24.10.0/scec-vdo-mac-osx.zip v24.10.0/scec-vdo-mac-osx.zip (516 MB)] (MD5: 9e4c9a8592baf574a8e9bfac5b8a07fb)&lt;br /&gt;
* Windows: [https://g-3a9041.a78b8.36fe.data.globus.org/scec-vdo/v24.10.0/scec-vdo-windows.zip v24.10.0/scec-vdo-windows.zip (390 MB)] (MD5: 9aeb73faebcd154d850d58ba86638e3e)&lt;br /&gt;
&lt;br /&gt;
== Troubleshooting ==&lt;br /&gt;
=== v24.11.0 macOS ===&lt;br /&gt;
When attempting to run the SCEC-VDO application on macOS for the first time, you may encounter an error pop-up message that the application cannot be opened. This is likely due to the application not being signed by a trusted developer.&lt;br /&gt;
&lt;br /&gt;
[[File:SCEC-VDO CantBeOpened.png|300px|thumb|v24.11.0 macOS &amp;quot;cannot be opened&amp;quot; error]]&lt;br /&gt;
&lt;br /&gt;
When opening the app for the first time, right-click the application and choose &amp;quot;Open&amp;quot;.&lt;br /&gt;
This should prompt you to bypass Apple's malicious-software / unverified-developer warning.&lt;br /&gt;
&lt;br /&gt;
=== v24.10.0 macOS ===&lt;br /&gt;
As of SCEC-VDO v24.10.0, the macOS application is bundled as a zip file instead of an app file. macOS GateKeeper refuses to execute applications bundled this way by default, regardless of codesigning.&lt;br /&gt;
We'll consider bundling a proper macOS app instead of a zip file as a permanent solution. &lt;br /&gt;
&lt;br /&gt;
[[File:java_cannot_be_opened.png|400px|thumb|v24.10.0 macOS &amp;quot;java cannot be opened&amp;quot; error]]&lt;br /&gt;
&lt;br /&gt;
[[File:JRE_Signed.png|400px|thumb|v24.10.0 macOS app will not run because it's bundled as a zip file]]&lt;br /&gt;
&lt;br /&gt;
Unauthorized applications cannot be directly executed in the Terminal or opened by double-clicking without making a GateKeeper exception.&lt;br /&gt;
This is a simple, one-time process that should not take longer than a minute.&lt;br /&gt;
&lt;br /&gt;
To create a security exception, you need to enable Developer Tools and allow the Terminal permission to bypass the system security policy.&lt;br /&gt;
Enable Developer Tools by running &amp;lt;code&amp;gt;spctl developer-mode enable-terminal&amp;lt;/code&amp;gt; and quitting the Terminal.&lt;br /&gt;
Then navigate to Developer Tools inside System Settings, enable the Terminal to bypass security, and try running the SCEC-VDO application.&lt;br /&gt;
&lt;br /&gt;
I've created video demonstrations showing exactly how to do this. &lt;br /&gt;
&lt;br /&gt;
==== Intel Mac Demo (1:11) ====&lt;br /&gt;
[[File:Intel_Mac_4GB_SCEC-VDO_Demo.mp4]]&lt;br /&gt;
&lt;br /&gt;
==== M1 Mac Demo (1:37) ====&lt;br /&gt;
[[File:M1_Mac_8GB_SCEC-VDO_Demo.mp4]]&lt;br /&gt;
&lt;br /&gt;
=== v24.10.0 Linux ===&lt;br /&gt;
For our Linux users, the &amp;lt;code&amp;gt;launch_linux.sh&amp;lt;/code&amp;gt; script installs the dependency &amp;lt;code&amp;gt;freeglut3-dev&amp;lt;/code&amp;gt;&lt;br /&gt;
using the APT package manager. Users of non-Debian-based distributions must&lt;br /&gt;
install an equivalent with their package manager or compile [https://github.com/freeglut/freeglut from source].&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30607</id>
		<title>OpenSHA Beta Testing</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30607"/>
		<updated>2025-12-18T01:18:44Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Release 26.1.0-beta available&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page provides experimental versions of OpenSHA applications for beta testing.&lt;br /&gt;
Stable production versions of OpenSHA are released on the [https://github.com/opensha/opensha/releases GitHub releases page]&lt;br /&gt;
&lt;br /&gt;
== OpenSHA January 2026 Release ==&lt;br /&gt;
Unlike previous releases, there are two variants for this OpenSHA beta release.&lt;br /&gt;
This release coincides with the OpenSHA server migration. Detailed instructions for beta testing, release notes, and tutorials for new applications are available in the README found in beta archives. See the Downloads table below.&lt;br /&gt;
&lt;br /&gt;
== OpenSHA April 2025 Release ==&lt;br /&gt;
&lt;br /&gt;
=== Resources ===&lt;br /&gt;
Please refer to the following resources for the release schedule and instructions on how to engage with OpenSHA software and provide feedback.&lt;br /&gt;
&lt;br /&gt;
* [https://docs.google.com/document/d/1BzhcXrnzUxfNVMZQLn4OPWod2a4NgeYKSpZVu_OWuIU/edit?usp=sharing Release Overview]&lt;br /&gt;
* [https://docs.google.com/document/d/1pZtNHaXr89pV0oGl32ewUA4-bQjqT8MR27ALZPydrmA/edit?usp=sharing Beta Testing Instructions]&lt;br /&gt;
* [https://forms.gle/4qNv46kSfeV6nDuy7 Feedback Form]&lt;br /&gt;
* [https://docs.google.com/presentation/d/1niS67--kQXX53BL3eJsOaJzaVglOPHsl/edit?usp=sharing Introduction to OpenSHA (2018)]&lt;br /&gt;
&lt;br /&gt;
=== Downloads ===&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:left&amp;quot;&lt;br /&gt;
|+&lt;br /&gt;
|-&lt;br /&gt;
! URL !! Size !! MD5 !! Release Date&lt;br /&gt;
|-&lt;br /&gt;
| '''(latest)''' [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-26.1.0-beta.zip opensha/release/opensha-26.1.0-beta.zip] || 1619 MB || MD5: 6d71504ceed68a066346d3fbc7c469ef || Released December 17, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.1-beta.zip opensha/release/opensha-25.4.1-beta.zip] || 459 MB || MD5: 4b4a66a24bd04553bd438c3bf7fc3253 || Released March 25, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.0-beta.zip opensha/release/opensha-25.4.0-beta.zip] || 536 MB || MD5: 941ac69c08fed8b5966a788c955d1f78 || Released March 17, 2025&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30606</id>
		<title>OpenSHA Beta Testing</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=OpenSHA_Beta_Testing&amp;diff=30606"/>
		<updated>2025-12-17T18:16:26Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Move resources from project to project2. Create entry for new release beta.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page provides experimental versions of OpenSHA applications for beta testing.&lt;br /&gt;
Stable production versions of OpenSHA are released on the [https://github.com/opensha/opensha/releases GitHub releases page]&lt;br /&gt;
&lt;br /&gt;
== OpenSHA January 2026 Release ==&lt;br /&gt;
Unlike previous releases, there are two variants for this OpenSHA beta release.&lt;br /&gt;
This release coincides with the OpenSHA server migration. Detailed instructions for beta testing, release notes, and tutorials for new applications are available in the README found in beta archives. See the Downloads table below.&lt;br /&gt;
&lt;br /&gt;
== OpenSHA April 2025 Release ==&lt;br /&gt;
&lt;br /&gt;
=== Resources ===&lt;br /&gt;
Please refer to the following resources for the release schedule and instructions on how to engage with OpenSHA software and provide feedback.&lt;br /&gt;
&lt;br /&gt;
* [https://docs.google.com/document/d/1BzhcXrnzUxfNVMZQLn4OPWod2a4NgeYKSpZVu_OWuIU/edit?usp=sharing Release Overview]&lt;br /&gt;
* [https://docs.google.com/document/d/1pZtNHaXr89pV0oGl32ewUA4-bQjqT8MR27ALZPydrmA/edit?usp=sharing Beta Testing Instructions]&lt;br /&gt;
* [https://forms.gle/4qNv46kSfeV6nDuy7 Feedback Form]&lt;br /&gt;
* [https://docs.google.com/presentation/d/1niS67--kQXX53BL3eJsOaJzaVglOPHsl/edit?usp=sharing Introduction to OpenSHA (2018)]&lt;br /&gt;
&lt;br /&gt;
=== Downloads ===&lt;br /&gt;
{| class=&amp;quot;wikitable&amp;quot; style=&amp;quot;margin:left&amp;quot;&lt;br /&gt;
|+&lt;br /&gt;
|-&lt;br /&gt;
! URL !! Size !! MD5 !! Release Date&lt;br /&gt;
|-&lt;br /&gt;
| '''(latest)''' [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-26.1.0-beta.zip opensha/release/opensha-26.1.0-beta.zip] || x MB || MD5: x || Released December 17, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.1-beta.zip opensha/release/opensha-25.4.1-beta.zip] || 459 MB || MD5: 4b4a66a24bd04553bd438c3bf7fc3253 || Released March 25, 2025&lt;br /&gt;
|-&lt;br /&gt;
| [https://g-3a9041.a78b8.36fe.data.globus.org/opensha/release/opensha-25.4.0-beta.zip opensha/release/opensha-25.4.0-beta.zip] || 562 MB || MD5: 941ac69c08fed8b5966a788c955d1f78 || Released March 17, 2025&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30587</id>
		<title>SCEC Event Page Troubleshooting</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30587"/>
		<updated>2025-11-16T22:36:05Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page outlines all encountered bugs in the process of generating SCEC Event Pages. Each bug should have a header with a succinct summary and the date it was first encountered. Bugs that have been resolved will have &amp;quot;'''(Resolved)'''&amp;quot; appended to their headers.&lt;br /&gt;
&lt;br /&gt;
== Nov 4 2025 - Failure to generate &amp;quot;M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
We failed to generate a SCEC Event Page for the “M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
Only ambiguous errors are reported: “error generating page” and simply “error” when using the “Enter USGS ID to Generate Page” and “Recent Earthquakes” Generate buttons. (See Fig 1.)&lt;br /&gt;
The bug is only encountered with this specific event: we were able to generate an event page for the 10/23/2025 Woodside event, but not for the 10/23/2025 Yucaipa event.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-error-msg.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 1. Error generating Yucaipa event'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
Chrome Developer Tools allow for analysis of the JSON error message. (See Fig. 2)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-err-json-response.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 2. Resolve ambiguous error message with Chrome Developer Tools'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
A match for the error detail (See Fig 3.) is found in the &amp;lt;code&amp;gt;earthquake_event_page.php&amp;lt;/code&amp;gt; script at function &amp;lt;code&amp;gt;scec_earthquake_event_page_generate($event_id)&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-eq-event-page-php-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 3. Error detail match found in code'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
This error is returned due to a non-zero return value from the execution of &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt;. Upon analysis of Docker logs we see this failure is in the Docker execution of &amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; (See Fig. 4)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-update-eq-event-report-sh-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 4. update_eq_event_report.sh invokes Docker Java 8 for generate_report.sh'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; invokes the Java OpenSHA &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; command-line tool with the appropriate parameters.&lt;br /&gt;
&lt;br /&gt;
See logs below:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
Array(&lt;br /&gt;
    [0] =&amp;gt; cd /data/scectmp&lt;br /&gt;
    [1] =&amp;gt; run docker&lt;br /&gt;
    [2] =&amp;gt; https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ci41113519.geojson&lt;br /&gt;
    [3] =&amp;gt; Count of events received = 1&lt;br /&gt;
    [4] =&amp;gt; WC 1994 Radius: 0.08038963&lt;br /&gt;
    [5] =&amp;gt; Reverting to min radius of 10.0&lt;br /&gt;
    [6] =&amp;gt; Mainshock is a M3.08&lt;br /&gt;
    [7] =&amp;gt;         Hypocenter: 34.11083, -116.98350, 13.42000&lt;br /&gt;
    [8] =&amp;gt; Place name: 10 km NNE of Yucaipa, CA&lt;br /&gt;
    [9] =&amp;gt; Fetching 3.0 days of foreshocks&lt;br /&gt;
    [10] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-10-24T03:12:37.610Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-21T03:12:37.610Z&lt;br /&gt;
    [11] =&amp;gt; Count of events received = 0&lt;br /&gt;
    [12] =&amp;gt; Count of events after filtering = 0&lt;br /&gt;
    [13] =&amp;gt; Total number of events returned = 0&lt;br /&gt;
    [14] =&amp;gt; Found 0 foreshocks, maxMag=-Infinity&lt;br /&gt;
    [15] =&amp;gt; Fetching aftershocks&lt;br /&gt;
    [16] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-11-05T17:03:42.497Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-24T03:12:37.610Z&lt;br /&gt;
    [17] =&amp;gt; Count of events received = 4&lt;br /&gt;
    [18] =&amp;gt; Count of events after filtering = 2&lt;br /&gt;
    [19] =&amp;gt; Events filtered due to conversion = 0, location = 1, id = 1&lt;br /&gt;
    [20] =&amp;gt; Total number of events returned = 2&lt;br /&gt;
    [21] =&amp;gt; Found 2 aftershocks, maxMag=1.2&lt;br /&gt;
    [22] =&amp;gt; Output dir: /reports/ci41113519&lt;br /&gt;
    [23] =&amp;gt; URL: https://earthquake.usgs.gov/earthquakes/eventpage/ci41113519&lt;br /&gt;
    [24] =&amp;gt; Shakemap image: https://earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg&lt;br /&gt;
    [25] =&amp;gt; DYFI image: https://earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg&lt;br /&gt;
    [26] =&amp;gt; Downloading https://earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg to /reports/ci41113519/resources/ci41113519_shakemap.jpg&lt;br /&gt;
    [27] =&amp;gt; DONE&lt;br /&gt;
    [28] =&amp;gt; Downloading https://earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg to /reports/ci41113519/resources/ci41113519_dyfi.jpg&lt;br /&gt;
    [29] =&amp;gt; DONE&lt;br /&gt;
    [30] =&amp;gt; Loading FM from cached file: FM3_1.xml&lt;br /&gt;
    [31] =&amp;gt; Loading FM from cached file: FM3_2.xml&lt;br /&gt;
    [32] =&amp;gt; java.lang.IllegalStateException: Min data mag is non-finite: NaN&lt;br /&gt;
    [33] =&amp;gt;     at com.google.common.base.Preconditions.checkState(Preconditions.java:588)&lt;br /&gt;
    [34] =&amp;gt;  at org.opensha.commons.data.comcat.plot.ComcatDataPlotter.plotMagTimeFunc(ComcatDataPlotter.java:873)&lt;br /&gt;
    [35] =&amp;gt;    at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.generateReport(ComcatReportPageGen.java:384)&lt;br /&gt;
    [36] =&amp;gt;   at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.main(ComcatReportPageGen.java:1418)&lt;br /&gt;
    [37] =&amp;gt; quitting with error)&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here is a snippet of &amp;lt;code&amp;gt;ComcatDataPlotter&amp;lt;/code&amp;gt;, where the error is thrown. (See Fig. 5)&lt;br /&gt;
&lt;br /&gt;
[[Image:Comcatdataplotter-fail-snippet.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 5. Failed Precondition in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
It’s resolving to NaN because minDataMag is initialized to infinity and we appear to have failed to obtain a finite value from any of mainshock.getMag, foreshocksFunc.getMinY, or aftershocksFunc.getMinY.&lt;br /&gt;
&lt;br /&gt;
For this specific event, we’re apparently failing to retrieve any foreshocks, aftershocks, or even the mainshock. Evidently there was a mainshock, so this must be either an error in our code that parses the geodata or a malformed GeoJSON response.&lt;br /&gt;
&lt;br /&gt;
However, the logs show the mainshock was retrieved successfully. We didn’t find any foreshocks in the 3 days before the mainshock, and we observed 2 aftershocks in total between the date of the mainshock (2025-10-24) and the date of the attempted event page generation (2025-11-05).&lt;br /&gt;
&lt;br /&gt;
Clearly we have this data, so minDataMag should be finite.&lt;br /&gt;
&lt;br /&gt;
After debugging ComcatReportPageGen locally, I’ve determined the issue.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
aftershocksFunc.getMinY() is returning NaN. Java’s Math.min returns NaN whenever either argument is NaN, so the minimum of NaN and infinity evaluates to NaN, and later on, even though mainshock.getMag() evaluates to 3.08, Math.min(NaN, 3.08) still resolves to NaN. I don’t believe we should see NaN summary statistics for our aftershocksFunc in the first place, but adding checks for this allows pages to build.&lt;br /&gt;
&lt;br /&gt;
See the modified code that we now use on the Central SCEC Server for page generation: https://github.com/abhatthal/opensha-fork/tree/bugfix/comcat-report-page-gen/nonfinite-min-data-mag&lt;br /&gt;
This code will be merged into the OpenSHA codebase via a PR after review. Once merged, a new Jar will need to be built and deployed to the Central SCEC Server.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucapia-comcatdataplotter-fix.png|500px|frameless]]&lt;br /&gt;
&lt;br /&gt;
'''Fig 6. Add NaN checks in ComcatDataPlotter'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
After we deployed the new Jar, we updated the Docker image invoked in &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt; from &amp;lt;code&amp;gt;openjdk:8&amp;lt;/code&amp;gt; to &amp;lt;code&amp;gt;eclipse-temurin:11-jdk&amp;lt;/code&amp;gt;.&lt;br /&gt;
This executes our updated &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; successfully.&lt;br /&gt;
&lt;br /&gt;
The generated event page for Yucaipa is available at https://central.scec.org/earthquakes/eventpage/ci41113519.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
* [[Publishing_UCERF3-ETAS_Event_Reports]]&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30586</id>
		<title>SCEC Event Page Troubleshooting</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30586"/>
		<updated>2025-11-16T21:53:16Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page outlines all encountered bugs in the process of generating SCEC Event Pages. Each bug should have a header with a succinct summary and the date it was first encountered. Bugs that have been resolved will have &amp;quot;'''(Resolved)'''&amp;quot; appended to their headers.&lt;br /&gt;
&lt;br /&gt;
== Nov 4 2025 - Failure to generate &amp;quot;M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
We failed to generate a SCEC Event Page for the “M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
Only ambiguous errors are reported: “error generating page” and simply “error” when using the “Enter USGS ID to Generate Page” and “Recent Earthquakes” Generate buttons. (See Fig 1.)&lt;br /&gt;
The bug is only encountered with this specific event: we were able to generate an event page for the 10/23/2025 Woodside event, but not for the 10/23/2025 Yucaipa event.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-error-msg.png|450px|thumb|Fig 1. Error generating Yucaipa event]]&lt;br /&gt;
&lt;br /&gt;
Chrome Developer Tools allow for analysis of the JSON error message. (See Fig. 2)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-err-json-response.png|450px|thumb|Fig 2. Resolve ambiguous error message with Chrome Developer Tools]]&lt;br /&gt;
&lt;br /&gt;
A match for the error detail (See Fig 3.) is found in the &amp;lt;code&amp;gt;earthquake_event_page.php&amp;lt;/code&amp;gt; script at function &amp;lt;code&amp;gt;scec_earthquake_event_page_generate($event_id)&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-eq-event-page-php-snippet.png|450px|thumb|Fig 3. Error detail match found in code]]&lt;br /&gt;
&lt;br /&gt;
This error is returned due to a non-zero return value from the execution of &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt;. Upon analysis of Docker logs we see this failure is in the Docker execution of &amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; (See Fig. 4)&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-update-eq-event-report-sh-snippet.png|450px|thumb|Fig 4. update_eq_event_report.sh invokes Docker Java 8 for generate_report.sh]]&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;generate_report.sh&amp;lt;/code&amp;gt; invokes the Java OpenSHA &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; command-line tool with the appropriate parameters.&lt;br /&gt;
&lt;br /&gt;
See logs below:&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
Array&lt;br /&gt;
(&lt;br /&gt;
    [0] =&amp;gt; cd /data/scectmp&lt;br /&gt;
    [1] =&amp;gt; run docker&lt;br /&gt;
    [2] =&amp;gt; https://earthquake.usgs.gov/earthquakes/feed/v1.0/detail/ci41113519.geojson&lt;br /&gt;
    [3] =&amp;gt; Count of events received = 1&lt;br /&gt;
    [4] =&amp;gt; WC 1994 Radius: 0.08038963&lt;br /&gt;
    [5] =&amp;gt; Reverting to min radius of 10.0&lt;br /&gt;
    [6] =&amp;gt; Mainshock is a M3.08&lt;br /&gt;
    [7] =&amp;gt;         Hypocenter: 34.11083, -116.98350, 13.42000&lt;br /&gt;
    [8] =&amp;gt; Place name: 10 km NNE of Yucaipa, CA&lt;br /&gt;
    [9] =&amp;gt; Fetching 3.0 days of foreshocks&lt;br /&gt;
    [10] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-10-24T03:12:37.610Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-21T03:12:37.610Z&lt;br /&gt;
    [11] =&amp;gt; Count of events received = 0&lt;br /&gt;
    [12] =&amp;gt; Count of events after filtering = 0&lt;br /&gt;
    [13] =&amp;gt; Total number of events returned = 0&lt;br /&gt;
    [14] =&amp;gt; Found 0 foreshocks, maxMag=-Infinity&lt;br /&gt;
    [15] =&amp;gt; Fetching aftershocks&lt;br /&gt;
    [16] =&amp;gt; https://earthquake.usgs.gov/fdsnws/event/1/query?endtime=2025-11-05T17:03:42.497Z&amp;amp;format=geojson&amp;amp;limit=20000&amp;amp;maxdepth=30.000&amp;amp;maxlatitude=34.20077&amp;amp;maxlongitude=-116.87488&amp;amp;mindepth=-10.000&amp;amp;minlatitude=34.02090&amp;amp;minlongitude=-117.09212&amp;amp;minmagnitude=0.000&amp;amp;orderby=time&amp;amp;starttime=2025-10-24T03:12:37.610Z&lt;br /&gt;
    [17] =&amp;gt; Count of events received = 4&lt;br /&gt;
    [18] =&amp;gt; Count of events after filtering = 2&lt;br /&gt;
    [19] =&amp;gt; Events filtered due to conversion = 0, location = 1, id = 1&lt;br /&gt;
    [20] =&amp;gt; Total number of events returned = 2&lt;br /&gt;
    [21] =&amp;gt; Found 2 aftershocks, maxMag=1.2&lt;br /&gt;
    [22] =&amp;gt; Output dir: /reports/ci41113519&lt;br /&gt;
    [23] =&amp;gt; URL: https://earthquake.usgs.gov/earthquakes/eventpage/ci41113519&lt;br /&gt;
    [24] =&amp;gt; Shakemap image: https://earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg&lt;br /&gt;
    [25] =&amp;gt; DYFI image: https://earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_ciim.jpg&lt;br /&gt;
    [26] =&amp;gt; Downloading https://earthquake.usgs.gov/realtime/product/shakemap/41113519ci/ci/1761362118906/download/intensity.jpg to /reports/ci41113519/resources/ci41113519_shakemap.jpg&lt;br /&gt;
    [27] =&amp;gt; DONE&lt;br /&gt;
    [28] =&amp;gt; Downloading https://earthquake.usgs.gov/realtime/product/dyfi/ci41113519/us/1761801878221/ci41113519_dyfi.jpg to /reports/ci41113519/resources/ci41113519_dyfi.jpg&lt;br /&gt;
    [29] =&amp;gt; DONE&lt;br /&gt;
    [30] =&amp;gt; Loading FM from cached file: FM3_1.xml&lt;br /&gt;
    [31] =&amp;gt; Loading FM from cached file: FM3_2.xml&lt;br /&gt;
    [32] =&amp;gt; java.lang.IllegalStateException: Min data mag is non-finite: NaN&lt;br /&gt;
    [33] =&amp;gt;     at com.google.common.base.Preconditions.checkState(Preconditions.java:588)&lt;br /&gt;
    [34] =&amp;gt;  at org.opensha.commons.data.comcat.plot.ComcatDataPlotter.plotMagTimeFunc(ComcatDataPlotter.java:873)&lt;br /&gt;
    [35] =&amp;gt;    at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.generateReport(ComcatReportPageGen.java:384)&lt;br /&gt;
    [36] =&amp;gt;   at org.opensha.commons.data.comcat.plot.ComcatReportPageGen.main(ComcatReportPageGen.java:1418)&lt;br /&gt;
    [37] =&amp;gt; quitting with error&lt;br /&gt;
)&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Here is a snippet of &amp;lt;code&amp;gt;ComcatDataPlotter&amp;lt;/code&amp;gt;, where the error is thrown. (See Fig. 5)&lt;br /&gt;
&lt;br /&gt;
[[Image:Comcatdataplotter-fail-snippet.png|450px|thumb|Fig 5. Failed Precondition in ComcatDataPlotter]]&lt;br /&gt;
&lt;br /&gt;
It’s resolving to NaN because minDataMag is initialized to infinity and we appear to have failed to obtain a finite value from any of mainshock.getMag, foreshocksFunc.getMinY, or aftershocksFunc.getMinY.&lt;br /&gt;
&lt;br /&gt;
For this specific event, we’re apparently failing to retrieve any foreshocks, aftershocks, or even the mainshock. Evidently there was a mainshock, so this must be either an error in our code that parses the geodata or a malformed GeoJSON response.&lt;br /&gt;
&lt;br /&gt;
However, the logs show the mainshock was retrieved successfully. We didn’t find any foreshocks in the 3 days before the mainshock, and we observed 2 aftershocks in total between the date of the mainshock (2025-10-24) and the date of the attempted event page generation (2025-11-05).&lt;br /&gt;
&lt;br /&gt;
Clearly we have this data, so minDataMag should be finite.&lt;br /&gt;
&lt;br /&gt;
After debugging ComcatReportPageGen locally, I’ve determined the issue.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
aftershocksFunc.getMinY() is returning NaN. Java’s Math.min returns NaN whenever either argument is NaN, so the minimum of NaN and infinity evaluates to NaN, and later on, even though mainshock.getMag() evaluates to 3.08, Math.min(NaN, 3.08) still resolves to NaN. I don’t believe we should see NaN summary statistics for our aftershocksFunc in the first place, but adding checks for this allows pages to build.&lt;br /&gt;
&lt;br /&gt;
See the modified code that we now use on the Central SCEC Server for page generation: https://github.com/abhatthal/opensha-fork/tree/bugfix/comcat-report-page-gen/nonfinite-min-data-mag&lt;br /&gt;
This code will be merged into the OpenSHA codebase via a PR after review. Once merged, a new Jar will need to be built and deployed to the Central SCEC Server.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucapia-comcatdataplotter-fix.png|450px|thumb|Fig 6. Add NaN checks in ComcatDataPlotter]]&lt;br /&gt;
&lt;br /&gt;
After we deployed the new Jar, we updated the Docker image invoked in &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt; from &amp;lt;code&amp;gt;openjdk:8&amp;lt;/code&amp;gt; to &amp;lt;code&amp;gt;eclipse-temurin:11-jdk&amp;lt;/code&amp;gt;.&lt;br /&gt;
This executes our updated &amp;lt;code&amp;gt;ComcatReportPageGen&amp;lt;/code&amp;gt; successfully.&lt;br /&gt;
&lt;br /&gt;
The generated event page for Yucaipa is available at https://central.scec.org/earthquakes/eventpage/ci41113519.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
* [[Publishing_UCERF3-ETAS_Event_Reports]]&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Yucapia-comcatdataplotter-fix.png&amp;diff=30585</id>
		<title>File:Yucapia-comcatdataplotter-fix.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Yucapia-comcatdataplotter-fix.png&amp;diff=30585"/>
		<updated>2025-11-16T21:49:26Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Comcatdataplotter-fail-snippet.png&amp;diff=30584</id>
		<title>File:Comcatdataplotter-fail-snippet.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Comcatdataplotter-fail-snippet.png&amp;diff=30584"/>
		<updated>2025-11-16T21:47:23Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-update-eq-event-report-sh-snippet.png&amp;diff=30583</id>
		<title>File:Yucaipa-update-eq-event-report-sh-snippet.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-update-eq-event-report-sh-snippet.png&amp;diff=30583"/>
		<updated>2025-11-16T21:43:07Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-eq-event-page-php-snippet.png&amp;diff=30582</id>
		<title>File:Yucaipa-eq-event-page-php-snippet.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-eq-event-page-php-snippet.png&amp;diff=30582"/>
		<updated>2025-11-16T21:36:55Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Publishing_UCERF3-ETAS_Event_Reports&amp;diff=30581</id>
		<title>Publishing UCERF3-ETAS Event Reports</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Publishing_UCERF3-ETAS_Event_Reports&amp;diff=30581"/>
		<updated>2025-11-16T21:32:13Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Add link to SCEC Event Page Troubleshooting&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;After successfully running a UCERF3-ETAS simulation and generating our plots, the generated UCERF3-ETAS forecast results can be published as part of a larger SCEC Event page. This page outlines the process by which results are published.&lt;br /&gt;
&lt;br /&gt;
Refer to [[UCERF3-ETAS Measurements]] for detailed instructions on how to run simulations and generate plots across HPC systems. The following examples publish our generated results directly from the HPC system on which the computation occurred, although results can be published from any system with an internet connection.&lt;br /&gt;
&lt;br /&gt;
== Creating and Updating SCEC Event Pages ==&lt;br /&gt;
SCEC Event Pages detail earthquake events. For a given mainshock, the recorded magnitude, time, location, and aftershock sequences are recorded. Identify a given event by its USGS ID and use it to generate an event page on the SCEC.org website at https://central.scec.org/earthquakes/eventpage/generate. (See Figure 1.)&lt;br /&gt;
&lt;br /&gt;
After generating an event page, it should be populated and available to view from the SCEC Event Pages list. This page is not yet published and isn't available to the public. Navigate to the &amp;quot;View&amp;quot; link to view the event page. (See Figure 2.)&lt;br /&gt;
Under the Table of Contents, you shouldn't see a &amp;quot;UCERF3-ETAS Forecast&amp;quot; section yet; after generating your results and making them available, you will be able to update this event page by selecting the &amp;quot;Regenerate Page with Latest Data&amp;quot; button. (See Figure 3.)&lt;br /&gt;
&lt;br /&gt;
[[File:SCEC Event Page Generator.png|400px|thumb|Fig 1. SCEC Event Page Generator available at scec.org]]&lt;br /&gt;
[[File:Malibu EQ.png|400px|thumb|Fig 2. SCEC Event Page for M4 earthquake near Malibu, CA]]&lt;br /&gt;
[[File:UCERF3-ETAS Forecast in Event Page.png|400px|thumb|Fig 3. UCERF3-ETAS Forecast as seen on the Event Page]]&lt;br /&gt;
&lt;br /&gt;
For any errors generating SCEC Event Pages, refer to [[SCEC Event Page Troubleshooting]].&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Git Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
Pushing your changes upstream requires an SSH key on Frontera that has been added to your GitHub account. You must also use a GitHub account that is authorized to push changes directly to the ucerf3-etas-results master branch.&lt;br /&gt;
&lt;br /&gt;
You can request edit permissions from the repository owner, Akash Bhatthal &amp;lt;[mailto:bhatthal@usc.edu bhatthal@usc.edu]&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
Consider the following brief instructions on getting set up with SSH.&lt;br /&gt;
&lt;br /&gt;
1. Check if you already have an SSH key generated.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;ls ~/.ssh/id_rsa.pub&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If not, then generate one with &amp;lt;code&amp;gt;ssh-keygen&amp;lt;/code&amp;gt;. You may have a non-RSA public key; check for any file ending in &amp;lt;code&amp;gt;.pub&amp;lt;/code&amp;gt;.&lt;br /&gt;
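The check-then-generate flow above can be sketched end to end as follows. This is a minimal sketch, not a prescribed workflow: KEYDIR is an illustrative variable, and -N "" creates the key without a passphrase purely so the example runs non-interactively (use a passphrase if your site requires one).&lt;br /&gt;

```shell
# Sketch: look for any existing *.pub file under ~/.ssh and generate a
# new RSA keypair only if none is found.
KEYDIR="${KEYDIR:-$HOME/.ssh}"
mkdir -p "$KEYDIR"
chmod 700 "$KEYDIR"
existing=$(ls "$KEYDIR"/*.pub 2>/dev/null)
if [ -n "$existing" ]; then
    echo "Existing public key(s): $existing"
else
    # -N "" skips the passphrase prompt so this sketch runs unattended
    ssh-keygen -t rsa -b 4096 -N "" -f "$KEYDIR/id_rsa"
fi
# The public key (the .pub file) is what gets pasted into GitHub:
cat "$KEYDIR"/*.pub
```

The printed &amp;lt;code&amp;gt;.pub&amp;lt;/code&amp;gt; contents are what you paste into GitHub in step 3 below.&lt;br /&gt;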
&lt;br /&gt;
&lt;br /&gt;
2. Copy the contents of this public SSH key to your clipboard.&amp;lt;br/&amp;gt;&lt;br /&gt;
You can either select it directly from your Terminal, or copy the file over SSH.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. Add as an authenticated key on your GitHub account.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Go to GitHub Settings -&amp;gt; SSH and GPG Keys -&amp;gt; New SSH key&lt;br /&gt;
and paste the public key you copied. It should start with &amp;lt;code&amp;gt;ssh-rsa&amp;lt;/code&amp;gt;.&lt;br /&gt;
Be careful to not use the private key, the public key is from the file&lt;br /&gt;
ending in &amp;lt;code&amp;gt;.pub&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
4. If you cloned the repository with HTTPS instead of SSH, then you need to update your remote accordingly.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;git remote set-url origin git@github.com:(your_user_name)/ucerf3-etas-results.git&amp;lt;/code&amp;gt;&lt;br /&gt;
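To double-check which protocol your clone uses before and after the rewrite, you can inspect origin with git remote. A runnable sketch against a throwaway repository (the mktemp scratch directory is purely for demonstration; in your real clone only the set-url line is needed):&lt;br /&gt;

```shell
# Demonstrate rewriting an HTTPS origin to SSH in a scratch repository.
set -e
cd "$(mktemp -d)"
git init -q demo
cd demo
git remote add origin https://github.com/SCECcode/ucerf3-etas-results.git
git remote -v                     # shows the HTTPS fetch/push URLs
git remote set-url origin git@github.com:SCECcode/ucerf3-etas-results.git
git remote get-url origin         # now reports the SSH form
```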
&lt;br /&gt;
&lt;br /&gt;
5. If your account has sufficient permissions, you should be able to push directly upstream to ucerf3-etas-results/master.&lt;br /&gt;
&lt;br /&gt;
== Publishing the UCERF3-ETAS Forecast ==&lt;br /&gt;
&lt;br /&gt;
Generate your UCERF3-ETAS forecast for a given earthquake event following instructions available at [[UCERF3-ETAS Measurements]].&lt;br /&gt;
Our generated results are pushed to a private GitHub repository (SCECcode/ucerf3-etas-results) that is directly read by the SCEC Event Page. You must be granted permission to view and contribute to this repository to continue publishing UCERF3-ETAS results for use in SCEC Event Pages.&lt;br /&gt;
&lt;br /&gt;
1) Clone this repository to your home directory with the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;cd $HOME &amp;amp;&amp;amp; git clone git@github.com:SCECcode/ucerf3-etas-results.git&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
2) Get the launcher scripts&lt;br /&gt;
&lt;br /&gt;
We can't copy results directly into this repository. The data is prepared using the &amp;lt;code&amp;gt;u3etas_jar_wrapper.sh&amp;lt;/code&amp;gt; shell script. You should already have these scripts downloaded and available on your PATH. If not, clone the repository:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;git clone https://github.com/opensha/ucerf3-etas-launcher.git&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
3) Ensure that you have selected the correct Event ID and directories. See the ComcatReportPageGen Usage section for an explanation of the parameters used.&lt;br /&gt;
&lt;br /&gt;
Set the following environment variables in your bash config (then connect to an interactive node as described in the next step):&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
module load cpu/0.15.4&lt;br /&gt;
&lt;br /&gt;
module load openjdk/11.0.2&lt;br /&gt;
&lt;br /&gt;
export ETAS_JAR_DISABLE_UPDATE=1&lt;br /&gt;
&lt;br /&gt;
ETAS_LAUNCHER=/expanse/lustre/projects/usc143/qwxdev/apps/expanse/rocky8.8/ucerf3-etas/069e27e/ucerf3-etas-launcher&lt;br /&gt;
&lt;br /&gt;
export PATH=$ETAS_LAUNCHER/sbin:$PATH&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
4) Connect to an interactive node&lt;br /&gt;
&amp;lt;code&amp;gt;srun --partition=debug --pty --account=usc143 --nodes=1 --ntasks-per-node=4 --mem=16G -t 00:30:00 --wait=0 --export=ALL /bin/bash&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
5) Ensure that sufficient memory is available for execution on the interactive node you reserved by setting &amp;lt;code&amp;gt;ETAS_MEM_GB&amp;lt;/code&amp;gt;:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;export ETAS_MEM_GB=32&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
This doesn't have to be in your bash config; just execute it directly. Increase the value if you encounter an OutOfMemoryError during execution.&lt;br /&gt;
&lt;br /&gt;
6) Run ComcatReportPageGen&lt;br /&gt;
The following example has the ucerf3-etas-results repository cloned in our home directory on Expanse. Execution updates the local repository, after which you can &amp;lt;code&amp;gt;git push&amp;lt;/code&amp;gt; the changes upstream, given sufficient permissions. Recall that such changes don't automatically update an already generated SCEC Event Page; the page will have to be regenerated.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
u3etas_jar_wrapper.sh org.opensha.commons.data.comcat.plot.ComcatReportPageGen --event-id ci41075584 \&lt;br /&gt;
&lt;br /&gt;
--min-mag 0d --radius 50 --output-parent-dir /home1/10177/bhatthal/ucerf3-etas-results \&lt;br /&gt;
&lt;br /&gt;
--etas-dir $ETAS_SIM_DIR/frontera-comcat-malibu-m3.9-n14-s100000 \&lt;br /&gt;
&lt;br /&gt;
--etas-output-dir /home1/10177/bhatthal/ucerf3-etas-results/ucerf3-etas&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You will need to update the above command with the relevant absolute paths on your system. If you encounter issues with the specified output directory, remove the conflicting &amp;quot;outputDir&amp;quot; value in your simulation's &amp;lt;code&amp;gt;config.json&amp;lt;/code&amp;gt; and try again.&lt;br /&gt;
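One way to strip the conflicting key without hand-editing is a short Python one-liner. A sketch against a made-up config.json (the simulationName and numSimulations keys are illustrative, not the real UCERF3-ETAS schema; point the python3 line at your simulation's actual config.json):&lt;br /&gt;

```shell
set -e
# Work in a scratch directory with a throwaway config.json for illustration.
cd "$(mktemp -d)"
printf '%s\n' '{"simulationName": "demo", "outputDir": "/tmp/old-output", "numSimulations": 100}' > config.json
# Remove the conflicting "outputDir" entry, leaving everything else intact.
python3 -c 'import json; cfg = json.load(open("config.json")); cfg.pop("outputDir", None); json.dump(cfg, open("config.json", "w"), indent=2)'
cat config.json
```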
&lt;br /&gt;
If the event-id is not being recognized, try passing it with the shortened parameter and ensuring there is no space, i.e. &amp;lt;code&amp;gt;-eci41075584&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
7) Commit the changes in &amp;lt;code&amp;gt;ucerf3-etas-results&amp;lt;/code&amp;gt; and push upstream to origin/master.&lt;br /&gt;
See the Git Troubleshooting section if you're unable to do this.&lt;br /&gt;
&lt;br /&gt;
== ComcatReportPageGen Usage ==&lt;br /&gt;
The [https://github.com/opensha/ucerf3-etas-launcher/blob/master/sbin/u3etas_jar_wrapper.sh u3etas_jar_wrapper.sh] shell script is used to execute any Java application in the provided OpenSHA Jar. In this case, we're executing the [https://github.com/opensha/opensha/blob/master/src/main/java/org/opensha/commons/data/comcat/plot/ComcatReportPageGen.java ComcatReportPageGen] application.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
usage: ComcatReportPageGen [-?] [-d &amp;lt;arg&amp;gt;] -e &amp;lt;arg&amp;gt; [-eod &amp;lt;arg&amp;gt;] [-etas &amp;lt;arg&amp;gt;] [-m &amp;lt;arg&amp;gt;] [-o &amp;lt;arg&amp;gt;] [-opd &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                        Display this message&lt;br /&gt;
 -d,--days-before &amp;lt;arg&amp;gt;           Number of days of events before the mainshock to fetch (default: 3)&lt;br /&gt;
 -e,--event-id &amp;lt;arg&amp;gt;              ComCat event id, e.g. 'ci39126079'&lt;br /&gt;
 -eod,--etas-output-dir &amp;lt;arg&amp;gt;     If supplied, ETAS only results will also be written to &amp;lt;path&amp;gt;/&amp;lt;event-id&amp;gt;&lt;br /&gt;
 -etas,--etas-dir &amp;lt;arg&amp;gt;           Path to a UCERF3-ETAS simulation directory&lt;br /&gt;
 -m,--min-mag &amp;lt;arg&amp;gt;               Minimum magnitude of events to fetch (default: 0.0)&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;            Output directory. Must supply either this or --output-parent-dir&lt;br /&gt;
 -opd,--output-parent-dir &amp;lt;arg&amp;gt;   Output parent directory. The directory name will be generated automatically from the&lt;br /&gt;
                                  event name, date, and magnitude. Must supply either this or --output-dir&lt;br /&gt;
 -r,--radius &amp;lt;arg&amp;gt;                Search radius around mainshock for aftershocks. Default is the greater of 10.0 km and&lt;br /&gt;
                                  twice the Wells &amp;amp; Coppersmith (1994) median rupture length for the mainshock magnitude&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In our example, we specify the absolute path to where we cloned the ucerf3-etas-results repository, as well as the absolute path to a UCERF3-ETAS results directory within the ucerf3-etas-results repository. Our results are written into two folders to allow us to filter by either date or Event ID.&lt;br /&gt;
&lt;br /&gt;
If running on Expanse, copy any plots from scratch into your own account, since the generator cannot be run in another user's folder. You can create a tarball and copy the whole folder into your account. The same logic applies to any other HPC system. Ensure you execute the script on an interactive node, not on the head node.&lt;br /&gt;
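The tarball round trip can be sketched as follows. Paths here are illustrative stand-ins; on Expanse the source would be the other user's scratch directory.

```shell
# Stand-in for a results folder sitting in another user's scratch space.
mkdir -p scratch_copy/demo-results
echo "plot data" > scratch_copy/demo-results/plot.txt

# Bundle the whole folder, then unpack it under your own account.
tar -czf demo-results.tar.gz -C scratch_copy demo-results
mkdir -p home_copy
tar -xzf demo-results.tar.gz -C home_copy
ls home_copy/demo-results
```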
&lt;br /&gt;
== Workflow Automation and Potential Challenges ==&lt;br /&gt;
This process of getting our UCERF3-ETAS forecasts into SCEC Event Pages could be automated for jobs run on Quakeworx.&lt;br /&gt;
&lt;br /&gt;
We face the following challenges in doing so:&lt;br /&gt;
* Not all Quakeworx users should have permission to publish results for SCEC Event Pages&lt;br /&gt;
* If a result already exists, should the latest run overwrite it, or should a new UI let the user select which result to use?&lt;br /&gt;
* We would have to start tagging commits to track overwritten events, and update the web service to check out the appropriate revision&lt;br /&gt;
* SCEC Event Page regeneration would need to be triggered from Quakeworx. &lt;br /&gt;
&lt;br /&gt;
Handling permissions for a Quakeworx GitHub account, making structural changes for storing and reading results from the event-reports repository, and implementing a &amp;quot;Write to Event Page&amp;quot; boolean field on job submission are all feasible. Implementing a UI to retroactively change the selected forecast and trigger page regeneration would require further investigation into the capabilities of the Quakeworx framework and into whether there is an API for externally triggering the generator. These changes would improve the user experience and make publishing UCERF3-ETAS forecasts easier, without requiring knowledge of a Linux terminal.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Quakeworx ==&lt;br /&gt;
Results can be published even when generated on Quakeworx, not just when generated directly via the command line.&lt;br /&gt;
Results are stored inside the Quakeworx development (qwxdev) scratch directory, with each user's jobs written under a unique Drupal user ID. Below is a list of known Drupal user IDs (drupaluid) for users who frequently run UCERF3-ETAS on Quakeworx.&lt;br /&gt;
* Phil Maechling: 6&lt;br /&gt;
* Fabio Silva: 7&lt;br /&gt;
* Akash Bhatthal: 20&lt;br /&gt;
* Scott Callaghan: 22&lt;br /&gt;
&lt;br /&gt;
For example, a job Phil ran on Quakeworx labelled &amp;quot;DeepSprings_EQ&amp;quot; would be found at &amp;lt;code&amp;gt;/expanse/lustre/scratch/qwxdev/temp_project/qwx1/users/drupaluid_6/jobs/DeepSprings_EQ/&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
To run ComcatReportPageGen, copy the event directory from scratch into your own home directory. There you can edit the config.json configuration as necessary and execute the generator (see step 6) with an updated etas-dir.&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30580</id>
		<title>SCEC Event Page Troubleshooting</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=SCEC_Event_Page_Troubleshooting&amp;diff=30580"/>
		<updated>2025-11-16T18:18:58Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Create troubleshooting entry&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page outlines all bugs encountered in the process of generating SCEC Event Pages. Each bug should have a header with a succinct summary and the date it was first encountered. Bugs that have been resolved will have &amp;quot;'''(Resolved)'''&amp;quot; appended to their headers.&lt;br /&gt;
&lt;br /&gt;
== Nov 4 2025 - Failure to generate &amp;quot;M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)&amp;quot; '''(Resolved)''' ==&lt;br /&gt;
Failed to generate a SCEC Event Page for the “M3.1 - 10km NNE of Yucaipa, CA 10/23/2025 (ci41113519)” event on the event generation page at https://central.scec.org/earthquakes/eventpage/generate.&lt;br /&gt;
The error messages are ambiguous, reading “error generating page” and simply “error” when using the “Enter USGS ID to Generate Page” and “Recent Earthquakes” Generate buttons.&lt;br /&gt;
The bug is only encountered with this specific event: an event page could be generated for the 10/23/2025 Woodside event, but not for the 10/23/2025 Yucaipa event.&lt;br /&gt;
&lt;br /&gt;
[[Image:Yucaipa-error-msg.png|450px]]&lt;br /&gt;
Chrome Developer Tools allow for analysis of the JSON error message.&lt;br /&gt;
[[Image:Yucaipa-err-json-response.png|450px]]&lt;br /&gt;
&lt;br /&gt;
A match for the error detail is found in the &amp;lt;code&amp;gt;earthquake_event_page.php&amp;lt;/code&amp;gt; script at function &amp;lt;code&amp;gt;scec_earthquake_event_page_generate($event_id)&amp;lt;/code&amp;gt;.&lt;br /&gt;
This error is returned due to a non-zero return value from the execution of &amp;lt;code&amp;gt;update_eq_event_report.sh&amp;lt;/code&amp;gt;.&lt;br /&gt;
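The same check can be reproduced outside PHP by running the script by hand and inspecting its exit status. The stub below stands in for the real update_eq_event_report.sh (its actual arguments and exit codes are not documented here); any non-zero status is what surfaces as the generic error response.

```shell
# Stub standing in for update_eq_event_report.sh (illustrative only).
cat > update_eq_event_report.sh <<'EOF'
#!/bin/sh
exit 3
EOF
chmod +x update_eq_event_report.sh

# Run it the way the PHP wrapper would, then inspect the exit status.
status=0
./update_eq_event_report.sh ci41113519 || status=$?
echo "exit status: $status"
```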
&lt;br /&gt;
'''TODO: Finish this section'''&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Related Entries ==&lt;br /&gt;
* [[Publishing_UCERF3-ETAS_Event_Reports]]&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-err-json-response.png&amp;diff=30579</id>
		<title>File:Yucaipa-err-json-response.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-err-json-response.png&amp;diff=30579"/>
		<updated>2025-11-16T18:14:17Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-error-msg.png&amp;diff=30578</id>
		<title>File:Yucaipa-error-msg.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Yucaipa-error-msg.png&amp;diff=30578"/>
		<updated>2025-11-16T18:11:23Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Main_Page&amp;diff=30577</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Main_Page&amp;diff=30577"/>
		<updated>2025-11-16T18:01:45Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Current Activities */ Create entry `SCEC Event Page Troubleshooting`&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:SCEC_logo-2colors_BlueRed.png|left|150px]][[Image:nsf1.jpg|right|100px]][[Image:usgs-logo-color.jpg|right|150px]]&lt;br /&gt;
&lt;br /&gt;
== Community Modeling Environment (CME) ==&lt;br /&gt;
&lt;br /&gt;
This is a collaborative wiki site for SCEC's Community Modeling Environment (SCEC/CME). The CME is a collaborative, interdisciplinary research group that applies advanced computer science technology to the problem of seismic hazard analysis. This SCEC community wiki is configured to support our distributed research by providing a collection point for information about SCEC scientific computing research projects.&lt;br /&gt;
&lt;br /&gt;
== Collaborative Project Entries ==&lt;br /&gt;
The following link will take you to an alphabetically sorted list of all SCECpedia pages.&lt;br /&gt;
*[https://strike.scec.org/scecpedia/Special:AllPages List of All SCECpedia Pages]&lt;br /&gt;
&lt;br /&gt;
[[File:pathways.png|256px|thumb|right|Fig 1: SCEC/CME computational pathways provide a scientific framework for improving seismic ground motion forecasts. The SCEC/CME Project began as an NSF information technology research (ITR) project in 2001. (Image Credit: Thomas H. Jordan) ]]&lt;br /&gt;
&lt;br /&gt;
== SCEC Scientific Software  ==&lt;br /&gt;
The following table contains links to SCEC open-source scientific software descriptions and distributions.&lt;br /&gt;
*[[SCEC Scientific Software]]&lt;br /&gt;
&lt;br /&gt;
== Current Activities ==&lt;br /&gt;
* [[SCEC Event Page Troubleshooting]]&lt;br /&gt;
* [[Research Computing Continuity Plans]]&lt;br /&gt;
* [[CARC Storage Migration]]&lt;br /&gt;
* [[UCVM Release v25.7]]&lt;br /&gt;
* [[UCVM Install Stampede3]]&lt;br /&gt;
* [[Preferred Rupture Directivity in Hazard Curve Computations]]&lt;br /&gt;
* [[U3ETAS Configurations]]&lt;br /&gt;
* [[HPC Troubleshooting]]&lt;br /&gt;
* [[Software Development Practices]]&lt;br /&gt;
* [[Bhatthal Projects and Presentations]]&lt;br /&gt;
*[[Quakeworx]]&lt;br /&gt;
** [[PdfGen]]&lt;br /&gt;
**[[Quakeworx Workshop]]&lt;br /&gt;
**[[Mendocino_Event]]&lt;br /&gt;
**[[CI3144585]]&lt;br /&gt;
*[[CyberShake]]&lt;br /&gt;
**[[CyberShake_Study 24.8]]&lt;br /&gt;
**[[CyberShake Study 22.12]]&lt;br /&gt;
**[[CyberShake FAIR]]&lt;br /&gt;
**[[NorCal CyberShake SW4 Mesh]]&lt;br /&gt;
**[[SW4 Mesh Development]]&lt;br /&gt;
**[[NCAL Study]]&lt;br /&gt;
**[[CyberShake Basin Ground Motions]]&lt;br /&gt;
**[[Broadband CyberShake aggregate comparisons]]&lt;br /&gt;
*[[OpenSHA]]&lt;br /&gt;
**[[Beta Testing]]&lt;br /&gt;
**[[OpenSHA-Jupyter]]&lt;br /&gt;
**[[UCERF3-ETAS Measurements]]&lt;br /&gt;
**[[UCERF3-ETAS Documentation]]&lt;br /&gt;
**[[GetFile]]&lt;br /&gt;
*[[Publishing UCERF3-ETAS Event Reports]]&lt;br /&gt;
*[[SCEC VDO]]&lt;br /&gt;
*[[CXM Website Review]]&lt;br /&gt;
**[http://moho.scec.org/cvm-explorer/explorer.php Prototype CVM Explorer]&lt;br /&gt;
**[http://moho.scec.org/UCVM_web/web/viewer.php CVM]&lt;br /&gt;
**[[CTM]]&lt;br /&gt;
**[[CGM]]&lt;br /&gt;
**[[GFM]]&lt;br /&gt;
**[[CFM]]&lt;br /&gt;
**[[UCVM]]&lt;br /&gt;
**[http://moho.scec.org/egd-viewer/ EGD Earthquake Geology Database]&lt;br /&gt;
**[[UCVM on Frontier]]&lt;br /&gt;
**[[UCVM Basin Query Tests]]&lt;br /&gt;
**[[SCEC_NetCDF_CVMS]]&lt;br /&gt;
**[[CVM_S4_Testing]]&lt;br /&gt;
**[[CRESCENT CVM]]&lt;br /&gt;
**[[CRM_Query]]&lt;br /&gt;
**[[CEM How to generate tiles for leaflet basemap]]&lt;br /&gt;
*[[SCEC Media]]&lt;br /&gt;
*[[CSEP]]&lt;br /&gt;
**[[floatCSEP]]&lt;br /&gt;
**[[CSEP1 Archives]]&lt;br /&gt;
*[[AWP-ODC Distributions]]&lt;br /&gt;
*[[Scenario ShakeMaps]]&lt;br /&gt;
*[[Dockerized Websites]]&lt;br /&gt;
*[[National Cyberinfrastructure]]&lt;br /&gt;
*[[OLCF Summit]]&lt;br /&gt;
*[[Broadband Platform]]&lt;br /&gt;
**[[BBP Strong Ground Motion Project]]&lt;br /&gt;
**[[BBP_Data_Products]]&lt;br /&gt;
**[[BBP on Discovery]]&lt;br /&gt;
**[[CARC BBP Setup]]&lt;br /&gt;
**[[3D Broadband Platform]]&lt;br /&gt;
*[[Southern California Seismic Velocity Model Vertical Profiles]]&lt;br /&gt;
*[[Magnitude_Versus_Intensity]]&lt;br /&gt;
*[[Workplans]]&lt;br /&gt;
*[[Software Sustainability Project]]&lt;br /&gt;
**[[Release Planning]]&lt;br /&gt;
**[[Hypocenter Replacement]]&lt;br /&gt;
**[[Software Testing References]]&lt;br /&gt;
**[[Software Reproducibility]]&lt;br /&gt;
*[[SCEC Server Migration]]&lt;br /&gt;
**[[SCEC CARC Migration]]&lt;br /&gt;
**[[SCEC ACB Migration]]&lt;br /&gt;
*[[Docker Hub]]&lt;br /&gt;
*[[Forecast Data]]&lt;br /&gt;
*[[Adding iVIP Users]]&lt;br /&gt;
*[[SCEC RC Working Group]]&lt;br /&gt;
*[[SDSC Expanse]]&lt;br /&gt;
*[[ALCF ML Workshop]]&lt;br /&gt;
*[[CyberShake_BBP_Validation]]&lt;br /&gt;
*[[Multi-resolution Meshes]]&lt;br /&gt;
*[[SCEC RC Meeting 2021]]&lt;br /&gt;
*[[Staff Meeting Oct 2021]]&lt;br /&gt;
**[[Software Searches]]&lt;br /&gt;
**[[Open Source Software Questions]]&lt;br /&gt;
*[[CIG Workshop]]&lt;br /&gt;
*[[SCEC CyberInfrastructure White Papers (2017-2020)]]&lt;br /&gt;
*[[IRIS_EMC]]&lt;br /&gt;
*[[Software Licenses]]&lt;br /&gt;
*[[CyberShake Distribution]]&lt;br /&gt;
*[[Testing UCVM Vs Values]]&lt;br /&gt;
**[[Albacore CVM]]&lt;br /&gt;
**[[CVM Webviewer Development]]&lt;br /&gt;
**[[CFM Web Development]]&lt;br /&gt;
*[[SCEC Testing Systems]]&lt;br /&gt;
*[[Staff Meeting]]&lt;br /&gt;
*[[High-F]]&lt;br /&gt;
**[[HighF_2018]]&lt;br /&gt;
*[[Stand-up Meetings]]&lt;br /&gt;
*[[Software Projects]]&lt;br /&gt;
*[[Disk Usage]]&lt;br /&gt;
*[[Magnitude Versus Intensity]]&lt;br /&gt;
*[[CyberShake_Study_20.5]]&lt;br /&gt;
*[[UCVM Verification]]&lt;br /&gt;
*[[CSEP_Working_Group]]&lt;br /&gt;
**[[CSEP_Gitlab]]&lt;br /&gt;
**[[CSEP_Training]]&lt;br /&gt;
**[[CSEP_Computers]]&lt;br /&gt;
**[[CSEP_Results]]&lt;br /&gt;
**[[CSEP_Workflows]]&lt;br /&gt;
**[[CSEP Test Results]]&lt;br /&gt;
**[[CSEP]]&lt;br /&gt;
*[[SCEC_EarthScience_DTS_HPC]]&lt;br /&gt;
*[[Cyberinfrastructure Center of Excellence]]&lt;br /&gt;
*[[Dornsife Technology Services (DTS)]]&lt;br /&gt;
*[[SC19]]&lt;br /&gt;
*[[Open Storage Network]]&lt;br /&gt;
*[[Rupture_Variation_Generator_v5.4.2]]&lt;br /&gt;
*[[Cascadia Simulations]]&lt;br /&gt;
*[[Ridgecrest Simulations]]&lt;br /&gt;
*[https://www.scec.org/workshops/2019/computing Research Computing Meeting]&lt;br /&gt;
*[http://www.scec.org/research/cxm SCEC CXM Inventory]&lt;br /&gt;
*[[ShakeMovies]]&lt;br /&gt;
*[[Brawley Seismic Zone Simulations]]&lt;br /&gt;
*[[SVN]]&lt;br /&gt;
*[[Zenodo and GitHub]]&lt;br /&gt;
*[[Callaghan Presentations]]&lt;br /&gt;
*[[Maechling Presentations]]&lt;br /&gt;
*[[Allocation Planning]]&lt;br /&gt;
&lt;br /&gt;
== Recent Activities ==&lt;br /&gt;
*[[Software At SCEC Responses]]&lt;br /&gt;
*[[Research Computing]]&lt;br /&gt;
*[[SSA Velocity Model Workshop 2019]]&lt;br /&gt;
*[[Software Workshop 2018]]&lt;br /&gt;
*[[Validation_Events]]&lt;br /&gt;
*[[Software]]&lt;br /&gt;
**[[Software Terms]]&lt;br /&gt;
*[[Transient Detection]]&lt;br /&gt;
*[[AGU Fall 2018]]&lt;br /&gt;
*[[SC18]]&lt;br /&gt;
*[[LA Vertical Profiles]]&lt;br /&gt;
*[[CVM-H 15.1 Maps]]&lt;br /&gt;
*[[Wills Map]]&lt;br /&gt;
*[[CyberShake Training]]&lt;br /&gt;
*[[CyberShake_Data_Access]]&lt;br /&gt;
*[[CyberShake Study 17.3]]&lt;br /&gt;
*[[Git]]&lt;br /&gt;
**[[Git Basics]]&lt;br /&gt;
**[[github API examples]]&lt;br /&gt;
*[[GMSV Simulation Datasets]]&lt;br /&gt;
*[[Blue_Waters_Project]]&lt;br /&gt;
**[[Blue Waters Symposium 2017]]&lt;br /&gt;
*[[CME_Projects]]&lt;br /&gt;
**[[SEISM2]]&lt;br /&gt;
*[[HPC Software]]&lt;br /&gt;
**[[Exascale Computing]]&lt;br /&gt;
**[[INCITE_Project]]&lt;br /&gt;
**[[NERSC]]&lt;br /&gt;
**[[Slurm]]&lt;br /&gt;
**[[Seismtools]]&lt;br /&gt;
**[[Build_Tools]]&lt;br /&gt;
*[[CME Software Development Group]]&lt;br /&gt;
**[[Scrum]]&lt;br /&gt;
**[[Staff]]&lt;br /&gt;
*[[Machine Learning]]&lt;br /&gt;
**[[tensorflow]]&lt;br /&gt;
&lt;br /&gt;
*[[BBP Flat File Format]]&lt;br /&gt;
*[[BBP Validation Events]]&lt;br /&gt;
*[[BBP Batch Jobs]]&lt;br /&gt;
&lt;br /&gt;
**[[UCVM Release Planning]]&lt;br /&gt;
* UCVM Tutorial Pages&lt;br /&gt;
**[https://github.com/sceccode/ucvm/wiki/testing UCVM Testing]&lt;br /&gt;
**[https://github.com/sceccode/ucvm_plotting/wiki UCVM Plotting]&lt;br /&gt;
**[https://github.com/sceccode/ucvm/wiki/Examples UCVM Examples]&lt;br /&gt;
**[[README.md Template]]&lt;br /&gt;
**[[UCVM Install]]&lt;br /&gt;
**[[Running UCVM on Discovery]]&lt;br /&gt;
**[[UCVM Basin Query]]&lt;br /&gt;
**[[UCVM on Compute Nodes]]&lt;br /&gt;
**[[Export XWindows to Client]]&lt;br /&gt;
**[[UCVM Plotting on Discovery]]&lt;br /&gt;
**[[UCVM Usage Notes]]&lt;br /&gt;
* UCVM Notes&lt;br /&gt;
**[[UCVM FAQ]]&lt;br /&gt;
**[[UCVM v25.7 with external model data directory CVM_LARGEDATA_DIR]]&lt;br /&gt;
**[[UCVM ucvm with sw4 using cvmsi]]&lt;br /&gt;
**[[UCVM create new model with ucvm2mesh]]&lt;br /&gt;
**[[UCVM cvmsi tapering]]&lt;br /&gt;
**[[UCVM sfcvm geomodelgrid]]&lt;br /&gt;
**[[UCVM install on Frontera]]&lt;br /&gt;
**[[UCVM cvmsi tapering for CyberShake Study 22.12]] &lt;br /&gt;
**[[UCVM cvms, cvmsi near-surface comparison]]&lt;br /&gt;
**[[UCVM cca/cvms5 comparing builtin-gtl vs elygtl:ely]]&lt;br /&gt;
**[[UCVM VS30 tree map(Thompson 2018) ]]&lt;br /&gt;
**[[UCVM VS30 etree map (Wills 2015) UCVM's interpolation]]&lt;br /&gt;
**[[UCVM z1,z2.5 for CCA06 with GTL, CVM-S4.26.M01]]&lt;br /&gt;
**[[UCVM 3D Viz]]&lt;br /&gt;
**[[UCVM svm1d]]&lt;br /&gt;
**[[UCVM svm1d and elygtl]]&lt;br /&gt;
**[[UCVM etree for Garner Valley]]&lt;br /&gt;
**[[UCVM elevation vs depth, model boundary]]&lt;br /&gt;
**[[UCVMC basin depth study, poly tech, Pomona ]]&lt;br /&gt;
**[[UCVMC how to plot cross_section and depth_profile]]&lt;br /&gt;
**[[CVM for CyberShake Study 18.8]]&lt;br /&gt;
**[[UCVMC How to process bin data]]&lt;br /&gt;
**[[UCVMC_CS17.3-H_plots]]&lt;br /&gt;
**[[UCVM Density Formula]]&lt;br /&gt;
**[[UCVM v18.5]]&lt;br /&gt;
**[[UCVM_Vs30]]&lt;br /&gt;
**[[CS173-H]]&lt;br /&gt;
**[[UCVM Review]]&lt;br /&gt;
**[[Registering CS173 into UCVM]]&lt;br /&gt;
**[[CVM Projection Issue Discussion]]&lt;br /&gt;
**[[Mesh Plotting Scripts]]&lt;br /&gt;
**[[ucvm2mesh-mpi]]&lt;br /&gt;
**[[Compare_UCVMC_to_UCVMP]]&lt;br /&gt;
**[[CCA06 Test Points]]&lt;br /&gt;
**[[Bay Area Velocity Model Z2.5 data]]&lt;br /&gt;
&lt;br /&gt;
== CME Outcomes ==&lt;br /&gt;
*[[CME Work Areas]]&lt;br /&gt;
*[[Publications]]&lt;br /&gt;
*[[Press Coverage]]&lt;br /&gt;
&lt;br /&gt;
== Recent Earthquake Information ==&lt;br /&gt;
An important goal of SCEC earthquake research is to develop improved seismic hazard information about future earthquakes by developing physics-based predictive models of earthquake processes. Improved seismic hazard estimates should lead to reduced seismic risks to people and important societal infrastructure.&lt;br /&gt;
*[http://earthquake.usgs.gov/earthquakes/map/ USGS Recent California Earthquakes]&lt;br /&gt;
*[http://earthquake.usgs.gov/earthquakes/map/ USGS Recent Worldwide Earthquakes]&lt;br /&gt;
&lt;br /&gt;
== CME Research Support ==&lt;br /&gt;
[http://www.scec.org Southern California Earthquake Center (SCEC)] and [http://www.scec.org/cme SCEC/CME] research is funded by [http://www.nsf.gov National Science Foundation (NSF)] Cooperative Agreement EAR-0106924, USGS Cooperative Agreement 02HQAG0008, and NSF awards EAR-074493, EAR-0949443, and OCI-0832698. This research is supported by an allocation of advanced computing resources provided by the NSF. Computations are performed at the [http://www.sdsc.edu San Diego Supercomputer Center], the [http://www.tacc.utexas.edu Texas Advanced Computing Center (TACC)] at The University of Texas at Austin, and the [http://www.ncsa.illinois.edu National Center for Supercomputing Applications (NCSA)], which provide HPC resources. Computations are also supported by the [http://www.usc.edu University of Southern California] [http://www.usc.edu/hpc Center for High-Performance Computing (HPC)]. Our research uses HPC resources provided by the [http://www.energy.gov/ U.S. Department of Energy (DOE)] through an [http://www.science.doe.gov/ascr/incite/index.html Innovative and Novel Computational Impact on Theory and Experiment (INCITE)] program allocation award. This research uses resources of the [http://www.alcf.anl.gov/ Argonne Leadership Computing Facility] at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357. This research also used resources of the Oak Ridge Leadership Computing Facility, which is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.&lt;br /&gt;
&lt;br /&gt;
== See Also ==&lt;br /&gt;
Additional information about SCEC earthquake system science research is available on related SCEC web sites including:&lt;br /&gt;
*[http://www.scec.org/ SCEC Home Page]&lt;br /&gt;
*[http://scec.usc.edu/scecpedia/Special:AllPages List of All SCECpedia Pages]&lt;br /&gt;
&lt;br /&gt;
== License ==&lt;br /&gt;
Except as otherwise noted, the contents of this site are licensed under the [http://creativecommons.org/licenses/by/3.0/deed.en_US Creative Commons Attribution 3.0 Unported License], and software distributions are licensed under the [https://opensource.org/ Open Source Initiative] approved licenses including BSD-3 and [http://www.apache.org/licenses/LICENSE-2.0 Apache 2.0 License]. For details, see our [[Site Policies]].&lt;br /&gt;
[[image:Cc3_88x31.png]] &amp;lt;br /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Main_Page&amp;diff=30576</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Main_Page&amp;diff=30576"/>
		<updated>2025-11-16T18:00:56Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Current Activities */ Restructure `Publishing UCERF3-ETAS Event Reports` and `SCEC VDO` out of `OpenSHA` bulletpoint.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;[[Image:SCEC_logo-2colors_BlueRed.png|left|150px]][[Image:nsf1.jpg|right|100px]][[Image:usgs-logo-color.jpg|right|150px]]&lt;br /&gt;
&lt;br /&gt;
== Community Modeling Environment (CME) ==&lt;br /&gt;
&lt;br /&gt;
This is a collaborative wiki site for SCEC's Community Modeling Environment (SCEC/CME). The CME is a collaborative, interdisciplinary research group that applies advanced computer science technology to the problem of seismic hazard analysis. This SCEC community wiki is configured to support our distributed research by providing a collection point for information about SCEC scientific computing research projects.&lt;br /&gt;
&lt;br /&gt;
== Collaborative Project Entries ==&lt;br /&gt;
The following link will take you to an alphabetically sorted list of all SCECpedia pages.&lt;br /&gt;
*[https://strike.scec.org/scecpedia/Special:AllPages List of All SCECpedia Pages]&lt;br /&gt;
&lt;br /&gt;
[[File:pathways.png|256px|thumb|right|Fig 1: SCEC/CME computational pathways provide a scientific framework for improving seismic ground motion forecasts. The SCEC/CME Project began as an NSF information technology research (ITR) project in 2001. (Image Credit: Thomas H. Jordan) ]]&lt;br /&gt;
&lt;br /&gt;
== SCEC Scientific Software  ==&lt;br /&gt;
The following table contains links to SCEC open-source scientific software descriptions and distributions.&lt;br /&gt;
*[[SCEC Scientific Software]]&lt;br /&gt;
&lt;br /&gt;
== Current Activities ==&lt;br /&gt;
* [[Research Computing Continuity Plans]]&lt;br /&gt;
* [[CARC Storage Migration]]&lt;br /&gt;
* [[UCVM Release v25.7]]&lt;br /&gt;
* [[UCVM Install Stampede3]]&lt;br /&gt;
* [[Preferred Rupture Directivity in Hazard Curve Computations]]&lt;br /&gt;
* [[U3ETAS Configurations]]&lt;br /&gt;
* [[HPC Troubleshooting]]&lt;br /&gt;
* [[Software Development Practices]]&lt;br /&gt;
* [[Bhatthal Projects and Presentations]]&lt;br /&gt;
*[[Quakeworx]]&lt;br /&gt;
** [[PdfGen]]&lt;br /&gt;
**[[Quakeworx Workshop]]&lt;br /&gt;
**[[Mendocino_Event]]&lt;br /&gt;
**[[CI3144585]]&lt;br /&gt;
*[[CyberShake]]&lt;br /&gt;
**[[CyberShake_Study 24.8]]&lt;br /&gt;
**[[CyberShake Study 22.12]]&lt;br /&gt;
**[[CyberShake FAIR]]&lt;br /&gt;
**[[NorCal CyberShake SW4 Mesh]]&lt;br /&gt;
**[[SW4 Mesh Development]]&lt;br /&gt;
**[[NCAL Study]]&lt;br /&gt;
**[[CyberShake Basin Ground Motions]]&lt;br /&gt;
**[[Broadband CyberShake aggregate comparisons]]&lt;br /&gt;
*[[OpenSHA]]&lt;br /&gt;
**[[Beta Testing]]&lt;br /&gt;
**[[OpenSHA-Jupyter]]&lt;br /&gt;
**[[UCERF3-ETAS Measurements]]&lt;br /&gt;
**[[UCERF3-ETAS Documentation]]&lt;br /&gt;
**[[GetFile]]&lt;br /&gt;
*[[Publishing UCERF3-ETAS Event Reports]]&lt;br /&gt;
*[[SCEC VDO]]&lt;br /&gt;
*[[CXM Website Review]]&lt;br /&gt;
**[http://moho.scec.org/cvm-explorer/explorer.php Prototype CVM Explorer]&lt;br /&gt;
**[http://moho.scec.org/UCVM_web/web/viewer.php CVM]&lt;br /&gt;
**[[CTM]]&lt;br /&gt;
**[[CGM]]&lt;br /&gt;
**[[GFM]]&lt;br /&gt;
**[[CFM]]&lt;br /&gt;
**[[UCVM]]&lt;br /&gt;
**[http://moho.scec.org/egd-viewer/ EGD Earthquake Geology Database]&lt;br /&gt;
**[[UCVM on Frontier]]&lt;br /&gt;
**[[UCVM Basin Query Tests]]&lt;br /&gt;
**[[SCEC_NetCDF_CVMS]]&lt;br /&gt;
**[[CVM_S4_Testing]]&lt;br /&gt;
**[[CRESCENT CVM]]&lt;br /&gt;
**[[CRM_Query]]&lt;br /&gt;
**[[CEM How to generate tiles for leaflet basemap]]&lt;br /&gt;
*[[SCEC Media]]&lt;br /&gt;
*[[CSEP]]&lt;br /&gt;
**[[floatCSEP]]&lt;br /&gt;
**[[CSEP1 Archives]]&lt;br /&gt;
*[[AWP-ODC Distributions]]&lt;br /&gt;
*[[Scenario ShakeMaps]]&lt;br /&gt;
*[[Dockerized Websites]]&lt;br /&gt;
*[[National Cyberinfrastructure]]&lt;br /&gt;
*[[OLCF Summit]]&lt;br /&gt;
*[[Broadband Platform]]&lt;br /&gt;
**[[BBP Strong Ground Motion Project]]&lt;br /&gt;
**[[BBP_Data_Products]]&lt;br /&gt;
**[[BBP on Discovery]]&lt;br /&gt;
**[[CARC BBP Setup]]&lt;br /&gt;
**[[3D Broadband Platform]]&lt;br /&gt;
*[[Southern California Seismic Velocity Model Vertical Profiles]]&lt;br /&gt;
*[[Magnitude_Versus_Intensity]]&lt;br /&gt;
*[[Workplans]]&lt;br /&gt;
*[[Software Sustainability Project]]&lt;br /&gt;
**[[Release Planning]]&lt;br /&gt;
**[[Hypocenter Replacement]]&lt;br /&gt;
**[[Software Testing References]]&lt;br /&gt;
**[[Software Reproducibility]]&lt;br /&gt;
*[[SCEC Server Migration]]&lt;br /&gt;
**[[SCEC CARC Migration]]&lt;br /&gt;
**[[SCEC ACB Migration]]&lt;br /&gt;
*[[Docker Hub]]&lt;br /&gt;
*[[Forecast Data]]&lt;br /&gt;
*[[Adding iVIP Users]]&lt;br /&gt;
*[[SCEC RC Working Group]]&lt;br /&gt;
*[[SDSC Expanse]]&lt;br /&gt;
*[[ALCF ML Workshop]]&lt;br /&gt;
*[[CyberShake_BBP_Validation]]&lt;br /&gt;
*[[Multi-resolution Meshes]]&lt;br /&gt;
*[[SCEC RC Meeting 2021]]&lt;br /&gt;
*[[Staff Meeting Oct 2021]]&lt;br /&gt;
**[[Software Searches]]&lt;br /&gt;
**[[Open Source Software Questions]]&lt;br /&gt;
*[[CIG Workshop]]&lt;br /&gt;
*[[SCEC CyberInfrastructure White Papers (2017-2020)]]&lt;br /&gt;
*[[IRIS_EMC]]&lt;br /&gt;
*[[Software Licenses]]&lt;br /&gt;
*[[CyberShake Distribution]]&lt;br /&gt;
*[[Testing UCVM Vs Values]]&lt;br /&gt;
**[[Albacore CVM]]&lt;br /&gt;
**[[CVM Webviewer Development]]&lt;br /&gt;
**[[CFM Web Development]]&lt;br /&gt;
*[[SCEC Testing Systems]]&lt;br /&gt;
*[[Staff Meeting]]&lt;br /&gt;
*[[High-F]]&lt;br /&gt;
**[[HighF_2018]]&lt;br /&gt;
*[[Stand-up Meetings]]&lt;br /&gt;
*[[Software Projects]]&lt;br /&gt;
*[[Disk Usage]]&lt;br /&gt;
*[[Magnitude Versus Intensity]]&lt;br /&gt;
*[[CyberShake_Study_20.5]]&lt;br /&gt;
*[[UCVM Verification]]&lt;br /&gt;
*[[CSEP_Working_Group]]&lt;br /&gt;
**[[CSEP_Gitlab]]&lt;br /&gt;
**[[CSEP_Training]]&lt;br /&gt;
**[[CSEP_Computers]]&lt;br /&gt;
**[[CSEP_Results]]&lt;br /&gt;
**[[CSEP_Workflows]]&lt;br /&gt;
**[[CSEP Test Results]]&lt;br /&gt;
**[[CSEP]]&lt;br /&gt;
*[[SCEC_EarthScience_DTS_HPC]]&lt;br /&gt;
*[[Cyberinfrastructure Center of Excellence]]&lt;br /&gt;
*[[Dornsife Technology Services (DTS)]]&lt;br /&gt;
*[[SC19]]&lt;br /&gt;
*[[Open Storage Network]]&lt;br /&gt;
*[[Rupture_Variation_Generator_v5.4.2]]&lt;br /&gt;
*[[Cascadia Simulations]]&lt;br /&gt;
*[[Ridgecrest Simulations]]&lt;br /&gt;
*[https://www.scec.org/workshops/2019/computing Research Computing Meeting]&lt;br /&gt;
*[http://www.scec.org/research/cxm SCEC CXM Inventory]&lt;br /&gt;
*[[ShakeMovies]]&lt;br /&gt;
*[[Brawley Seismic Zone Simulations]]&lt;br /&gt;
*[[SVN]]&lt;br /&gt;
*[[Zenodo and GitHub]]&lt;br /&gt;
*[[Callaghan Presentations]]&lt;br /&gt;
*[[Maechling Presentations]]&lt;br /&gt;
*[[Allocation Planning]]&lt;br /&gt;
&lt;br /&gt;
== Recent Activities ==&lt;br /&gt;
*[[Software At SCEC Responses]]&lt;br /&gt;
*[[Research Computing]]&lt;br /&gt;
*[[SSA Velocity Model Workshop 2019]]&lt;br /&gt;
*[[Software Workshop 2018]]&lt;br /&gt;
*[[Validation_Events]]&lt;br /&gt;
*[[Software]]&lt;br /&gt;
**[[Software Terms]]&lt;br /&gt;
*[[Transient Detection]]&lt;br /&gt;
*[[AGU Fall 2018]]&lt;br /&gt;
*[[SC18]]&lt;br /&gt;
*[[LA Vertical Profiles]]&lt;br /&gt;
*[[CVM-H 15.1 Maps]]&lt;br /&gt;
*[[Wills Map]]&lt;br /&gt;
*[[CyberShake Training]]&lt;br /&gt;
*[[CyberShake_Data_Access]]&lt;br /&gt;
*[[CyberShake Study 17.3]]&lt;br /&gt;
*[[Git]]&lt;br /&gt;
**[[Git Basics]]&lt;br /&gt;
**[[github API examples]]&lt;br /&gt;
*[[GMSV Simulation Datasets]]&lt;br /&gt;
*[[Blue_Waters_Project]]&lt;br /&gt;
**[[Blue Waters Symposium 2017]]&lt;br /&gt;
*[[CME_Projects]]&lt;br /&gt;
**[[SEISM2]]&lt;br /&gt;
*[[HPC Software]]&lt;br /&gt;
**[[Exascale Computing]]&lt;br /&gt;
**[[INCITE_Project]]&lt;br /&gt;
**[[NERSC]]&lt;br /&gt;
**[[Slurm]]&lt;br /&gt;
**[[Seismtools]]&lt;br /&gt;
**[[Build_Tools]]&lt;br /&gt;
*[[CME Software Development Group]]&lt;br /&gt;
**[[Scrum]]&lt;br /&gt;
**[[Staff]]&lt;br /&gt;
*[[Machine Learning]]&lt;br /&gt;
**[[tensorflow]]&lt;br /&gt;
&lt;br /&gt;
*[[BBP Flat File Format]]&lt;br /&gt;
*[[BBP Validation Events]]&lt;br /&gt;
*[[BBP Batch Jobs]]&lt;br /&gt;
&lt;br /&gt;
**[[UCVM Release Planning]]&lt;br /&gt;
* UCVM Tutorial Pages&lt;br /&gt;
**[https://github.com/sceccode/ucvm/wiki/testing UCVM Testing]&lt;br /&gt;
**[https://github.com/sceccode/ucvm_plotting/wiki UCVM Plotting]&lt;br /&gt;
**[https://github.com/sceccode/ucvm/wiki/Examples UCVM Examples]&lt;br /&gt;
**[[README.md Template]]&lt;br /&gt;
**[[UCVM Install]]&lt;br /&gt;
**[[Running UCVM on Discovery]]&lt;br /&gt;
**[[UCVM Basin Query]]&lt;br /&gt;
**[[UCVM on Compute Nodes]]&lt;br /&gt;
**[[Export XWindows to Client]]&lt;br /&gt;
**[[UCVM Plotting on Discovery]]&lt;br /&gt;
**[[UCVM Usage Notes]]&lt;br /&gt;
* UCVM Notes&lt;br /&gt;
**[[UCVM FAQ]]&lt;br /&gt;
**[[UCVM v25.7 with external model data directory CVM_LARGEDATA_DIR]]&lt;br /&gt;
**[[UCVM ucvm with sw4 using cvmsi]]&lt;br /&gt;
**[[UCVM create new model with ucvm2mesh]]&lt;br /&gt;
**[[UCVM cvmsi tapering]]&lt;br /&gt;
**[[UCVM sfcvm geomodelgrid]]&lt;br /&gt;
**[[UCVM install on Frontera]]&lt;br /&gt;
**[[UCVM cvmsi tapering for CyberShake Study 22.12]] &lt;br /&gt;
**[[UCVM cvms, cvmsi near-surface comparison]]&lt;br /&gt;
**[[UCVM cca/cvms5 comparing builtin-gtl vs elygtl:ely]]&lt;br /&gt;
**[[UCVM VS30 tree map(Thompson 2018) ]]&lt;br /&gt;
**[[UCVM VS30 etree map (Wills 2015) UCVM's interpolation]]&lt;br /&gt;
**[[UCVM z1,z2.5 for CCA06 with GTL, CVM-S4.26.M01]]&lt;br /&gt;
**[[UCVM 3D Viz]]&lt;br /&gt;
**[[UCVM svm1d]]&lt;br /&gt;
**[[UCVM svm1d and elygtl]]&lt;br /&gt;
**[[UCVM etree for Garner Valley]]&lt;br /&gt;
**[[UCVM elevation vs depth, model boundary]]&lt;br /&gt;
**[[UCVMC basin depth study, poly tech, Pomona ]]&lt;br /&gt;
**[[UCVMC how to plot cross_section and depth_profile]]&lt;br /&gt;
**[[CVM for CyberShake Study 18.8]]&lt;br /&gt;
**[[UCVMC How to process bin data]]&lt;br /&gt;
**[[UCVMC_CS17.3-H_plots]]&lt;br /&gt;
**[[UCVM Density Formula]]&lt;br /&gt;
**[[UCVM v18.5]]&lt;br /&gt;
**[[UCVM_Vs30]]&lt;br /&gt;
**[[CS173-H]]&lt;br /&gt;
**[[UCVM Review]]&lt;br /&gt;
**[[Registering CS173 into UCVM]]&lt;br /&gt;
**[[CVM Projection Issue Discussion]]&lt;br /&gt;
**[[Mesh Plotting Scripts]]&lt;br /&gt;
**[[ucvm2mesh-mpi]]&lt;br /&gt;
**[[Compare_UCVMC_to_UCVMP]]&lt;br /&gt;
**[[CCA06 Test Points]]&lt;br /&gt;
**[[Bay Area Velocity Model Z2.5 data]]&lt;br /&gt;
&lt;br /&gt;
== CME Outcomes ==&lt;br /&gt;
*[[CME Work Areas]]&lt;br /&gt;
*[[Publications]]&lt;br /&gt;
*[[Press Coverage]]&lt;br /&gt;
&lt;br /&gt;
== Recent Earthquake Information ==&lt;br /&gt;
An important goal of SCEC earthquake research is to develop improved seismic hazard information about future earthquakes by developing physics-based predictive models of earthquake processes. Improved seismic hazard estimates should lead to reduced seismic risk to people and important societal infrastructure.&lt;br /&gt;
*[http://earthquake.usgs.gov/earthquakes/map/ USGS Recent California Earthquakes]&lt;br /&gt;
*[http://earthquake.usgs.gov/earthquakes/map/ USGS Recent Worldwide Earthquakes]&lt;br /&gt;
&lt;br /&gt;
== CME Research Support ==&lt;br /&gt;
[http://www.scec.org Southern California Earthquake Center (SCEC)] and [http://www.scec.org/cme SCEC/CME] research is funded by [http://www.nsf.gov National Science Foundation (NSF)] Cooperative Agreement EAR-0106924, USGS Cooperative Agreement 02HQAG0008, and NSF awards EAR-074493, EAR-0949443, and OCI-0832698. This research is supported by an allocation of advanced computing resources provided by the NSF. Computations are performed at the [http://www.sdsc.edu San Diego Supercomputer Center], the [http://www.tacc.utexas.edu Texas Advanced Computing Center (TACC)] at The University of Texas at Austin, and the [http://www.ncsa.illinois.edu National Center for Supercomputing Applications (NCSA)], which provide HPC resources. Computations are also supported by the [http://www.usc.edu University of Southern California] [http://www.usc.edu/hpc Center for High-Performance Computing (HPC)]. Our research uses HPC resources provided by the [http://www.energy.gov/ U.S. Department of Energy (DOE)] through an [http://www.science.doe.gov/ascr/incite/index.html Innovative and Novel Computational Impact on Theory and Experiment (INCITE)] program allocation award. This research uses resources of the [http://www.alcf.anl.gov/ Argonne Leadership Computing Facility] at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357. This research also used resources of the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.&lt;br /&gt;
&lt;br /&gt;
== See Also ==&lt;br /&gt;
Additional information about SCEC earthquake system science research is available on related SCEC web sites including:&lt;br /&gt;
*[http://www.scec.org/ SCEC Home Page]&lt;br /&gt;
*[http://scec.usc.edu/scecpedia/Special:AllPages List of All SCECpedia Pages]&lt;br /&gt;
&lt;br /&gt;
== License ==&lt;br /&gt;
Except as otherwise noted, the contents of this site are licensed under the [http://creativecommons.org/licenses/by/3.0/deed.en_US Creative Commons Attribution 3.0 Unported License], and software distributions are licensed under the [https://opensource.org/ Open Source Initiative] approved licenses including BSD-3 and [http://www.apache.org/licenses/LICENSE-2.0 Apache 2.0 License]. For details, see our [[Site Policies]].&lt;br /&gt;
[[image:Cc3_88x31.png]] &amp;lt;br /&amp;gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=GetFile&amp;diff=30575</id>
		<title>GetFile</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=GetFile&amp;diff=30575"/>
		<updated>2025-11-16T17:50:57Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: list of all supported hazard models and GetFile server endpoints&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;== OpenSHA Problem ==&lt;br /&gt;
Fault section data and fault system rupture sets are loaded by OpenSHA to compute earthquake rupture forecasts. The geospatial data for several models is stored directly with the OpenSHA code that operates on it. Models for seismic hazard analysis under the OpenSHA framework are becoming progressively larger, and GitHub imposes a 100MB file size limit, which cannot accommodate the new 2023 US National Seismic Hazard Model (NSHM23).&lt;br /&gt;
&lt;br /&gt;
== Current Solution ==&lt;br /&gt;
Smaller models can continue to be hosted on GitHub with the OpenSHA code, but UCERF3 has been moved to a server on the USC campus. Currently, OpenSHA downloads the model from the ASB &amp;quot;cheesegrater&amp;quot; server. This solution is not scalable and risks partial or corrupted downloads of the UCERF3 model. These older servers will be decommissioned soon, so we need to transition to a better long-term solution.&lt;br /&gt;
&lt;br /&gt;
== Proposed Solution ==&lt;br /&gt;
GetFile is a more robust solution for hosting hazard models for use in OpenSHA. It will be used to download and validate the UCERF3 and NSHM23 models, and may also be adopted by other models and by several SCEC projects that need to download and validate files, such as UCVM. Scientific models can be stored on USC CARC and downloaded via the GetFile framework. GetFile provides a sophisticated feature set for data validation, rolling back to older model snapshots, and automatic updates of the GetFile framework itself for seamless deployment of new features and bug fixes.&lt;br /&gt;
&lt;br /&gt;
== Supported Models ==&lt;br /&gt;
OpenSHA hazard models are available for download via GetFile, which tries the following locations in sequential order:&lt;br /&gt;
# CARC /project2/scec_608/public/getfile and&lt;br /&gt;
# CARC /project/scec_608/public/getfile.&lt;br /&gt;
See links to access files via the Globus Connect Service at [[Hypocenter_Replacement]].&lt;br /&gt;
Note that /project is deprecated and may not be available much longer. The first location with which GetFile can connect and validate metadata via MD5 checksums will be used for all subsequent downloads over the lifetime of the GetFile instance.&lt;br /&gt;
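The fallback behavior described above can be sketched as a small shell helper; the &amp;lt;code&amp;gt;meta.json&amp;lt;/code&amp;gt; and &amp;lt;code&amp;gt;.md5&amp;lt;/code&amp;gt; file names and the helper name are illustrative assumptions, not the actual GetFile layout.&lt;br /&gt;

```shell
# Hedged sketch of GetFile's location fallback: try each candidate root in
# order and keep the first whose metadata file passes an MD5 checksum check.
# The meta.json/.md5 names are illustrative, not the real GetFile layout.
pick_getfile_root() {
  for root in "$@"; do
    meta="$root/meta.json"
    if [ -r "$meta" ]; then
      if [ "$(md5sum "$meta" | cut -d' ' -f1)" = "$(cat "$meta.md5" 2>/dev/null)" ]; then
        echo "$root"    # this root is used for all subsequent downloads
        return 0
      fi
    fi
  done
  return 1
}
```

For example, &amp;lt;code&amp;gt;pick_getfile_root /project2/scec_608/public/getfile /project/scec_608/public/getfile&amp;lt;/code&amp;gt; would echo the first root that validates.&lt;br /&gt;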
&lt;br /&gt;
The following ERF models are supported as of this writing (Nov 16, 2025) in OpenSHA v25.12.0-alpha with GetFile v25.11.0:&lt;br /&gt;
* branch_avgs_combined (v24.11.0)  &lt;br /&gt;
* cached_dep100.0_depMean_rakeMean (v24.11.0)  &lt;br /&gt;
* cached_FM3_1_dep100.0_depMean_rakeMean (v24.11.0)  &lt;br /&gt;
* cached_FM3_2_dep100.0_depMean_rakeMean (v24.11.0)  &lt;br /&gt;
* FM3_1_ABM_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_1_branch_averaged_full_modules (v24.11.0)  &lt;br /&gt;
* FM3_1_branch_averaged_with_logic_tree (v24.11.0)  &lt;br /&gt;
* FM3_1_branch_averaged (v24.11.0)  &lt;br /&gt;
* FM3_1_GEOL_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_1_mean_ucerf3_sol (v24.11.0)  &lt;br /&gt;
* FM3_1_NEOK_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_1_SpatSeisU2_branch_averaged_full_modules (v24.11.0)  &lt;br /&gt;
* FM3_1_SpatSeisU2_branch_averaged (v24.11.0)  &lt;br /&gt;
* FM3_1_SpatSeisU3_branch_averaged_full_modules (v24.11.0)  &lt;br /&gt;
* FM3_1_SpatSeisU3_branch_averaged (v24.11.0)  &lt;br /&gt;
* FM3_1_ZENGBB_Shaw09Mod_DsrTap_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_1_ZENGBB_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_2_ABM_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_2_branch_averaged_full_modules (v24.11.0)  &lt;br /&gt;
* FM3_2_branch_averaged_with_logic_tree (v24.11.0)  &lt;br /&gt;
* FM3_2_branch_averaged (v24.11.0)  &lt;br /&gt;
* FM3_2_GEOL_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_2_mean_ucerf3_sol (v24.11.0)  &lt;br /&gt;
* FM3_2_NEOK_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_2_SpatSeisU2_branch_averaged_full_modules (v24.11.0)  &lt;br /&gt;
* FM3_2_SpatSeisU2_branch_averaged (v24.11.0)  &lt;br /&gt;
* FM3_2_SpatSeisU3_branch_averaged_full_modules (v24.11.0)  &lt;br /&gt;
* FM3_2_SpatSeisU3_branch_averaged (v24.11.0)  &lt;br /&gt;
* FM3_2_ZENGBB_Shaw09Mod_DsrTap_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* FM3_2_ZENGBB_Shaw09Mod_DsrUni_CharConst_M5Rate7.9_MMaxOff7.6_NoFix_SpatSeisU3 (v24.11.0)  &lt;br /&gt;
* full_logic_tree_with_gridded (v24.11.0)  &lt;br /&gt;
* full_logic_tree (v24.11.0)  &lt;br /&gt;
* mean_ucerf3_sol_with_mappings (v24.11.0)  &lt;br /&gt;
* mean_ucerf3_sol (v24.11.0)  &lt;br /&gt;
* rake_basis (v24.11.0)  &lt;br /&gt;
* full_ucerf3_compound_sol (v24.12.0)&lt;br /&gt;
* WUS_branch_averaged_gridded_simplified (v25.1.0)&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Docs and Code ==&lt;br /&gt;
The source code and detailed usage and setup documentation: https://github.com/abhatthal/getfile&lt;br /&gt;
&lt;br /&gt;
Demo applications using the GetFile library: https://github.com/abhatthal/getfile-demo&lt;br /&gt;
&lt;br /&gt;
Video demonstration of downloading files with GetFile in HazardCurveApplication. [[File:Getfile-opensha-demo.mp4]]&lt;br /&gt;
&lt;br /&gt;
This video demo verifies the following behavior for file versions and network status.&lt;br /&gt;
# no existing files / download multiple&lt;br /&gt;
# already up to date / don’t download&lt;br /&gt;
# no files + no wifi / fails with error message&lt;br /&gt;
# no files + wifi outage / fails with error message&lt;br /&gt;
# outdated files / update only outdated&lt;br /&gt;
# outdated files + no wifi / silently use outdated files&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Publishing_UCERF3-ETAS_Event_Reports&amp;diff=30505</id>
		<title>Publishing UCERF3-ETAS Event Reports</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Publishing_UCERF3-ETAS_Event_Reports&amp;diff=30505"/>
		<updated>2025-10-16T17:24:26Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Quakeworx */  Add Scott's drupal id&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;After successfully running a UCERF3-ETAS simulation and generating our plots, the generated UCERF3-ETAS forecast results can be published as part of a larger SCEC Event page. This page outlines the process by which results are published.&lt;br /&gt;
&lt;br /&gt;
Refer to [[UCERF3-ETAS Measurements]] for detailed instructions on how to run simulations and generate plots across HPC systems. The following examples allow us to take our generated results directly from the HPC system on which the computation occurs, although results can be published from any system with an internet connection.&lt;br /&gt;
&lt;br /&gt;
== Creating and Updating SCEC Event Pages ==&lt;br /&gt;
SCEC Event Pages detail earthquake events. For a given mainshock, the recorded magnitude, time, location, and aftershock sequence are recorded. Identify a given event by its USGS ID and use this to generate an event on the SCEC.org website at https://central.scec.org/earthquakes/eventpage/generate. (See Figure 1).&lt;br /&gt;
&lt;br /&gt;
After generating an event page, it should be populated and available to view from the SCEC Event Pages list. This page is not yet published and isn't available to the public. Navigate to the &amp;quot;View&amp;quot; link to view the event page. (See Figure 2).&lt;br /&gt;
Under the Table of Contents, you shouldn't see a &amp;quot;UCERF3-ETAS Forecast&amp;quot; yet, but after generating your results and making them available, you will be able to update this event page by selecting the &amp;quot;Regenerate Page with Latest Data&amp;quot; button. (See Figure 3)&lt;br /&gt;
&lt;br /&gt;
[[File:SCEC Event Page Generator.png|400px|thumb|Fig 1. SCEC Event Page Generator available at scec.org]]&lt;br /&gt;
[[File:Malibu EQ.png|400px|thumb|Fig 2. SCEC Event Page for M4 earthquake near Malibu, CA]]&lt;br /&gt;
[[File:UCERF3-ETAS Forecast in Event Page.png|400px|thumb|Fig 3. UCERF3-ETAS Forecast as seen on the Event Page]]&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Git Troubleshooting ==&lt;br /&gt;
&lt;br /&gt;
Pushing your changes upstream requires you to have an added SSH key on Frontera to allow you to make changes with your GitHub account. You must also use a GitHub account that is authorized to push changes directly to the ucerf3-etas-results/master branch.&lt;br /&gt;
&lt;br /&gt;
You can request edit permissions from the repository owner, Akash Bhatthal &amp;lt;[mailto:bhatthal@usc.edu bhatthal@usc.edu]&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Consider the following brief instructions on getting set up with SSH.&lt;br /&gt;
&lt;br /&gt;
1. Check if you already have an SSH key generated.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;ls ~/.ssh/id_rsa.pub&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
If not, then generate one with &amp;lt;code&amp;gt;ssh-keygen&amp;lt;/code&amp;gt;. You may have a non-RSA public key; check for any file ending in &amp;lt;code&amp;gt;.pub&amp;lt;/code&amp;gt;.&lt;br /&gt;
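The check in step 1 can be generalized to any public key with a small helper; the function name is ours, not part of these wiki instructions.&lt;br /&gt;

```shell
# List any public keys in the given .ssh directory (default ~/.ssh); a
# nonzero exit status means none exist and ssh-keygen should be run.
# The helper name is illustrative.
have_ssh_pubkey() {
  dir="${1:-$HOME/.ssh}"
  ls "$dir"/*.pub 2>/dev/null
}
```

&amp;lt;code&amp;gt;have_ssh_pubkey || ssh-keygen&amp;lt;/code&amp;gt; generates a key only when none is found.&lt;br /&gt;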
&lt;br /&gt;
&lt;br /&gt;
2. Copy the contents of this public SSH key to your clipboard.&amp;lt;br/&amp;gt;&lt;br /&gt;
You can either select it directly from your Terminal, or copy the file over SSH.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
3. Add as an authenticated key on your GitHub account.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Go to GitHub Settings -&amp;gt; SSH and GPG Keys -&amp;gt; New SSH key&lt;br /&gt;
and paste the public key you copied. It should start with &amp;lt;code&amp;gt;ssh-rsa&amp;lt;/code&amp;gt;.&lt;br /&gt;
Be careful not to use the private key; the public key is in the file&lt;br /&gt;
ending in &amp;lt;code&amp;gt;.pub&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
4. If you cloned the repository with HTTPS instead of SSH, then you need to update your remote accordingly.&amp;lt;br/&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;git remote set-url origin git@github.com:(your_user_name)/ucerf3-etas-results.git&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
5. If your account has sufficient permissions, you should be able to push directly upstream to ucerf3-etas-results/master.&lt;br /&gt;
&lt;br /&gt;
== Publishing the UCERF3-ETAS Forecast ==&lt;br /&gt;
&lt;br /&gt;
Generate your UCERF3-ETAS forecast for a given earthquake event following instructions available at [[UCERF3-ETAS Measurements]].&lt;br /&gt;
Our generated results are pushed to a private GitHub repository (SCECcode/ucerf3-etas-results) that is directly read by the SCEC Event Page. You must be granted permission to view and contribute to this repository to continue publishing UCERF3-ETAS results for use in SCEC Event Pages.&lt;br /&gt;
&lt;br /&gt;
1) Clone this repository to your home directory with the following command:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;cd $HOME &amp;amp;&amp;amp; git clone git@github.com:SCECcode/ucerf3-etas-results.git&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
2) Get scripts&lt;br /&gt;
&lt;br /&gt;
We can't directly copy results into this repository. The data is prepared using the u3etas_jar_wrapper shell script. You should already have these scripts downloaded and available to execute by adding them to your PATH. If not, clone the repository:&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;git clone https://github.com/opensha/ucerf3-etas-launcher.git&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
3) Ensure that you have selected the correct Event ID and directories. See the following section for an explanation of the parameters used.&lt;br /&gt;
&lt;br /&gt;
# Set environment variables in bash config&lt;br /&gt;
* Ensure you're on an interactive node:&lt;br /&gt;
* Update your bash config with:&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
module load cpu/0.15.4&lt;br /&gt;
&lt;br /&gt;
module load openjdk/11.0.2&lt;br /&gt;
&lt;br /&gt;
export ETAS_JAR_DISABLE_UPDATE=1&lt;br /&gt;
&lt;br /&gt;
ETAS_LAUNCHER=/expanse/lustre/projects/usc143/qwxdev/apps/expanse/rocky8.8/ucerf3-etas/069e27e/ucerf3-etas-launcher&lt;br /&gt;
&lt;br /&gt;
export PATH=$ETAS_LAUNCHER/sbin:$PATH&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
4) Connect to an interactive node&lt;br /&gt;
&amp;lt;code&amp;gt;srun --partition=debug  --pty --account=usc143 --nodes=1 --ntasks-per-node=4 --mem=16G -t 00:30:00 --wait=0 --export=ALL /bin/bash&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
5) Assuming you have reserved an interactive node, ensure that sufficient memory is available for execution on this node.&lt;br /&gt;
Set &amp;lt;code&amp;gt;ETAS_MEM_GB=32&amp;lt;/code&amp;gt;. &lt;br /&gt;
* export ETAS_MEM_GB=32&lt;br /&gt;
This doesn't have to be in your bash config; just execute it directly. Increase the memory if you encounter an OutOfMemoryError during execution.&lt;br /&gt;
&lt;br /&gt;
6) Run ComCatReportPageGen&lt;br /&gt;
The following example assumes the ucerf3-etas-results repository is cloned in our home directory on Expanse. Execution updates the local repository, after which you can &amp;lt;code&amp;gt;git push&amp;lt;/code&amp;gt; the changes upstream given sufficient permissions. Recall that such changes don't update an already generated SCEC Event Page; the page will have to be regenerated.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
u3etas_jar_wrapper.sh org.opensha.commons.data.comcat.plot.ComcatReportPageGen --event-id ci41075584 \&lt;br /&gt;
&lt;br /&gt;
--min-mag 0d --radius 50 --output-parent-dir /home1/10177/bhatthal/ucerf3-etas-results \&lt;br /&gt;
&lt;br /&gt;
--etas-dir $ETAS_SIM_DIR/frontera-comcat-malibu-m3.9-n14-s100000 \&lt;br /&gt;
&lt;br /&gt;
--etas-output-dir /home1/10177/bhatthal/ucerf3-etas-results/ucerf3-etas&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
You will need to update the above command with the relevant absolute paths on your system. If you encounter issues with the specified output directory, remove the conflicting &amp;quot;outputDir&amp;quot; value in your simulation's &amp;lt;code&amp;gt;config.json&amp;lt;/code&amp;gt; and try again.&lt;br /&gt;
&lt;br /&gt;
If you are unable to identify the event-id, try passing it with the shortened parameter, ensuring there is no space, i.e. &amp;lt;code&amp;gt;-eci41075584&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
7) Commit the changes in &amp;lt;code&amp;gt;ucerf3-etas-results&amp;lt;/code&amp;gt; and push upstream to origin/master.&lt;br /&gt;
See the Git Troubleshooting section if you're unable to do this.&lt;br /&gt;
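The commit-and-push step can be sketched as follows; the function name and commit message are illustrative, and this assumes your remote already uses the SSH URL with push access to master.&lt;br /&gt;

```shell
# Hedged sketch of step 7: stage, commit, and push the generated results.
# publish_results is our illustrative wrapper, not a script from the repo.
publish_results() {
  repo="$1"
  msg="$2"
  cd "$repo" || return 1
  git add .
  git commit -m "$msg"
  git push origin master
}
```

For example: &amp;lt;code&amp;gt;publish_results $HOME/ucerf3-etas-results &amp;quot;Add UCERF3-ETAS forecast for ci41075584&amp;quot;&amp;lt;/code&amp;gt;&lt;br /&gt;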
&lt;br /&gt;
== ComcatReportPageGen Usage ==&lt;br /&gt;
The [https://github.com/opensha/ucerf3-etas-launcher/blob/master/sbin/u3etas_jar_wrapper.sh u3etas_jar_wrapper.sh] shell script is used to execute any Java application in the provided OpenSHA Jar. In this case, we're executing the [https://github.com/opensha/opensha/blob/master/src/main/java/org/opensha/commons/data/comcat/plot/ComcatReportPageGen.java ComcatReportPageGen] application.&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
usage: ComcatReportPageGen [-?] [-d &amp;lt;arg&amp;gt;] -e &amp;lt;arg&amp;gt; [-eod &amp;lt;arg&amp;gt;] [-etas &amp;lt;arg&amp;gt;] [-m &amp;lt;arg&amp;gt;] [-o &amp;lt;arg&amp;gt;] [-opd &amp;lt;arg&amp;gt;] [-r &amp;lt;arg&amp;gt;]&lt;br /&gt;
 -?,--help                        Display this message&lt;br /&gt;
 -d,--days-before &amp;lt;arg&amp;gt;           Number of days of events before the mainshock to fetch (default: 3)&lt;br /&gt;
 -e,--event-id &amp;lt;arg&amp;gt;              ComCat event id, e.g. 'ci39126079'&lt;br /&gt;
 -eod,--etas-output-dir &amp;lt;arg&amp;gt;     If supplied, ETAS only results will also be written to &amp;lt;path&amp;gt;/&amp;lt;event-id&amp;gt;&lt;br /&gt;
 -etas,--etas-dir &amp;lt;arg&amp;gt;           Path to a UCERF3-ETAS simulation directory&lt;br /&gt;
 -m,--min-mag &amp;lt;arg&amp;gt;               Minimum magnitude of events to fetch (default: 0.0)&lt;br /&gt;
 -o,--output-dir &amp;lt;arg&amp;gt;            Output directory. Must supply either this or --output-parent-dir&lt;br /&gt;
 -opd,--output-parent-dir &amp;lt;arg&amp;gt;   Output parent directory. The directory name will be generated automatically from the&lt;br /&gt;
                                  event name, date, and magnitude. Must supply either this or --output-dir&lt;br /&gt;
 -r,--radius &amp;lt;arg&amp;gt;                Search radius around mainshock for aftershocks. Default is the greater of 10.0 km and&lt;br /&gt;
                                  twice the Wells &amp;amp; Coppersmith (1994) median rupture length for the mainshock magnitude&lt;br /&gt;
&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
In our example, we specify the absolute path to where we cloned the ucerf3-etas-results repository, as well as the absolute path to a UCERF3-ETAS results directory within the ucerf3-etas-results repository. Our results are written into two folders to allow us to filter by either date or Event ID.&lt;br /&gt;
&lt;br /&gt;
If running on Expanse, copy any plots from scratch into your own account, since you can't run in someone else's folder. You can create a tarball and copy the whole folder into your account. The same logic applies to any other HPC system. Ensure you execute the script on an interactive node, not on the head node.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Workflow Automation and Potential Challenges ==&lt;br /&gt;
This process of getting our UCERF3-ETAS forecasts into SCEC Event Pages could be automated for jobs run on Quakeworx.&lt;br /&gt;
&lt;br /&gt;
We face the following challenges in doing so:&lt;br /&gt;
* Not all Quakeworx users should have permission to publish results for SCEC Event Pages&lt;br /&gt;
* If there's an existing result, does the latest run overwrite? New UI for selecting result to use?&lt;br /&gt;
* We would have to start tagging commits to track overwritten events and update the web service to checkout accordingly&lt;br /&gt;
* SCEC Event Page regeneration would need to be triggered from Quakeworx. &lt;br /&gt;
&lt;br /&gt;
Handling of permissions for a Quakeworx GitHub account, structural changes for storing and reading results from the event-reports repository, and implementing a &amp;quot;Write to Event Page&amp;quot; boolean field on job submission are all feasible. Implementing a UI to retroactively change the selected forecast and trigger page regeneration would require further investigation into the capabilities of the Quakeworx framework and whether there's an API for externally triggering the generator. These changes may improve the user experience and make publishing UCERF3-ETAS forecasts easier without requiring knowledge of a Linux terminal.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Quakeworx ==&lt;br /&gt;
Results can be published even when generated on Quakeworx, not just directly via the command-line.&lt;br /&gt;
Results are stored inside the Quakeworx development (qwxdev) scratch directory, where each user has a unique ID under which their jobs are written. Below is a list of known drupaluid IDs for users who frequently run UCERF3-ETAS on Quakeworx.&lt;br /&gt;
* Phil Maechling: 6&lt;br /&gt;
* Fabio Silva: 7&lt;br /&gt;
* Akash Bhatthal: 20&lt;br /&gt;
* Scott Callaghan: 22&lt;br /&gt;
&lt;br /&gt;
For example, a job Phil ran on Quakeworx labelled &amp;quot;DeepSprings_EQ&amp;quot; would be found at &amp;lt;code&amp;gt;/expanse/lustre/scratch/qwxdev/temp_project/qwx1/users/drupaluid_6/jobs/DeepSprings_EQ/&amp;lt;/code&amp;gt;.&lt;br /&gt;
&lt;br /&gt;
In order to run the ComcatReportPageGen, you should copy this event from scratch into your own home directory. Here you can make changes to the config.json configuration as necessary and execute the generator (See step 6) with an updated etas-dir.&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30502</id>
		<title>Preferred Rupture Directivity in Hazard Curve Computations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30502"/>
		<updated>2025-10-08T04:37:01Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Plots */  Update conclusion for Exp1&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents efforts to investigate the impact that preferred rupture directions would have on hazard as calculated with CyberShake ground motions in OpenSHA.&lt;br /&gt;
&lt;br /&gt;
== Research Plan ==&lt;br /&gt;
&lt;br /&gt;
There is some evidence that some faults have a preferred rupture direction.  To investigate the impact this could have on hazard, we'll identify a handful of faults, modify the probabilities of individual rupture variations to favor those at the preferred end of the fault, and generate new hazard products with the modified probabilities.  No new ground motions will need to be calculated; we'll use Study 22.12 and 24.8 ground motions.&lt;br /&gt;
&lt;br /&gt;
Below is a flow chart representing the steps involved in performing this work.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Preferred_Rupture_Direction_flow_chart.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Implementation Details ==&lt;br /&gt;
&lt;br /&gt;
With the UCERF2 ERF used in Study 22.12 and Study 24.8, probabilities are specified at the rupture level -- that is, for a specific fault segment(s) and magnitude.  We then divide the probability by the number of rupture variations to get the uniform probability of each rupture variation.  For this work, we will create modified rupture variation probabilities and use these to generate a modified hazard curve.&lt;br /&gt;
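As a quick numeric check of that uniform split (the rupture probability and variation count below are made-up values):&lt;br /&gt;

```shell
# Each rupture variation gets rupture_probability / number_of_variations.
# The values here are illustrative, not from UCERF2.
rup_prob=0.01
n_vars=50
awk -v p="$rup_prob" -v n="$n_vars" 'BEGIN { printf "%.6f\n", p/n }'
# prints 0.000200
```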
&lt;br /&gt;
We will add new functionality to the CyberShake-related code in OpenSHA to support this work.  OpenSHA has a defined interface to return a list of probabilities for a given source ID and rupture ID.  We'll specify the source ID, rupture ID, rupture variation ID, probability in a CSV file which will be passed to OpenSHA.  This file will be parsed and used to populate a data structure, which will then be accessed by an implementation of the interface to determine the new probabilities.  For any source ID, rupture ID, rupture variation ID combinations not in the file, we'll use the default UCERF2 probabilities.  Then the new probabilities will be used to create a hazard curve.&lt;br /&gt;
&lt;br /&gt;
Modifying the HazardCurvePlotter to allow us to input custom probabilities for rupture variations allows us to compare hazard expected for persistent directivity vs random rupture direction. We can prescribe directivity based on topography or physics reasons and see if the signature is drowned out by randomness of the network or if it still stands out.&lt;br /&gt;
&lt;br /&gt;
Unlike the UCERF2 ERF in standard PSHA, CyberShake utilizes an extended ERF that supports rupture variation probabilities. Both include ruptures, but by specifying hypocenter, we're able to support directivity.&lt;br /&gt;
&lt;br /&gt;
In standard OpenSHA, the hierarchy runs from sources, which are fault segments, down to ruptures, which are collections of faults.&lt;br /&gt;
CyberShake extended ERFs add another layer, rupture variations. These identify the location of the hypocenter on the fault and define a slip-time history.&lt;br /&gt;
&lt;br /&gt;
== New Functionality ==&lt;br /&gt;
The existing OpenSHA command-line tool, HazardCurvePlotter, is able to compute hazard curves and plot them to images locally on a system. We’ve added a new parameter, &amp;lt;code&amp;gt;rv-probs-csv&amp;lt;/code&amp;gt;, for a Rupture Variation Probabilities Input CSV file.&lt;br /&gt;
&lt;br /&gt;
Passing this file allows a user to specify directivity for specific rupture variations. Previously, all variations had an equal probability, distributed from the probability of the rupture itself. The sum of probabilities of the variations must still sum to the probability of the corresponding rupture.&lt;br /&gt;
&lt;br /&gt;
== CSV Files ==&lt;br /&gt;
&lt;br /&gt;
=== CSV Structure ===&lt;br /&gt;
The input CSV file should have the following columns.&lt;br /&gt;
* Source_ID&lt;br /&gt;
* Rupture_ID&lt;br /&gt;
* Rup_Var_ID&lt;br /&gt;
* Probability&lt;br /&gt;
&lt;br /&gt;
The specified source, rupture, and rupture variation IDs act as a composite key, uniquely identifying a rupture variation. Each rupture at a site comprises a set of rupture variations.&lt;br /&gt;
The CSV file does not need to have an entry for each variation found in the CyberShake database.&lt;br /&gt;
The remaining unspecified variations are given an equal share of the difference between the rupture probability and the sum of the specified rupture variation probabilities.&lt;br /&gt;
The formatting, structure, and sum of probabilities are validated by the HazardCurvePlotter.&lt;br /&gt;
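A minimal input file with this structure might look like the following; the source, rupture, and variation IDs and the probabilities are illustrative, not values from the CyberShake database.&lt;br /&gt;

```shell
# Write a small example rv-probs CSV; all values are illustrative.
printf '%s\n' \
  'Source_ID,Rupture_ID,Rup_Var_ID,Probability' \
  '128,1263,0,0.0002' \
  '128,1263,1,0.0001' > rv_probs_example.csv
```

Variations of the same rupture that are left out of the file share the remaining rupture probability equally.&lt;br /&gt;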
&lt;br /&gt;
=== CSV Generation ===&lt;br /&gt;
&lt;br /&gt;
To generate the CSV files used for these tests, we created a Python script which takes in a configuration file and outputs a CSV with modified probabilities for selected rupture variations.&lt;br /&gt;
&lt;br /&gt;
The Python script is available [https://github.com/SCECcode/cybershake-core/blob/main/db/create_prop_file.py here].&lt;br /&gt;
&lt;br /&gt;
=== Inputs ===&lt;br /&gt;
The inputs take X% of hypocenters in the preferred direction and double their probability, and take X% of hypocenters in the opposite direction and set their probability to 0, to keep the overall probabilities balanced.&lt;br /&gt;
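That modification can be sketched in awk over a CSV with the columns described in this section; which Rup_Var_IDs count as preferred (here 0 and 1) or opposite (here 5 and 6) is purely illustrative, and the helper name is ours.&lt;br /&gt;

```shell
# Double the probability of preferred-direction variations and zero the
# opposite-direction ones. Rup_Var_IDs 0,1 (preferred) and 5,6 (opposite)
# are illustrative placeholders, not real CyberShake IDs.
modify_probs() {
  # $1: input rv-probs CSV, $2: output CSV
  awk -F, -v OFS=, '
    NR == 1            { print; next }   # keep the header row
    $3 == 0 || $3 == 1 { $4 = $4 * 2 }   # preferred direction: double
    $3 == 5 || $3 == 6 { $4 = 0 }        # opposite direction: zero
    { print }' "$1" > "$2"
}
```

&amp;lt;code&amp;gt;modify_probs rv_probs.csv rv_probs_modified.csv&amp;lt;/code&amp;gt;&lt;br /&gt;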
&lt;br /&gt;
Below are the CSV input files used to generate the results found in the Excel spreadsheet and highlighted plots on this page.&lt;br /&gt;
&lt;br /&gt;
==== Initial Tests ====&lt;br /&gt;
Mojave:  Southeast -&amp;gt; Northwest, so the preferred hypocenters are in the southern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
Coachella: Northwest -&amp;gt; Southeast, so the preferred hypocenters are in the northern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_reference_probs.csv USC_reference_probs.csv] : Reference input with same probabilities as fetched from DB for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella.csv USC_mod_probs_Mojave_Coachella.csv] : 10% hypocenter responsible modified probabilities for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_90p.csv USC_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_100p.csv USC_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa25p.csv USC_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa100p.csv USC_mod_probs_Moj_Coach_sa100p.csv] : 100% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/ALP_mod_probs_Mojave_Coachella_90p.csv ALP_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_Moj_Coach_sa25p.csv ALP_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_all_Moj_Coach_100p.csv ALP_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_90p.csv SBSM_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_100p.csv SBSM_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_Moj_Coach_sa25p.csv SBSM_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_all_Moj_Coach_100p.csv SBSM_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SVD_mod_probs_Mojave_Coachella_90p.csv SVD_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_Moj_Coach_sa25p.csv SVD_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_all_Moj_Coach_100p.csv SVD_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/PDE_mod_probs_Mojave_Coachella_90p.csv PDE_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_Moj_Coach_sa25p.csv PDE_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_all_Moj_Coach_100p.csv PDE_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
&lt;br /&gt;
==== Experiment 1 ====&lt;br /&gt;
Both Experiment 1 and Experiment 2 have the southern 20% of hypocenters with 90% of total probability. The remaining 10% is distributed over the remaining 80% of hypocenters.&lt;br /&gt;
&lt;br /&gt;
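The Experiment 1/2 weighting can be sketched as follows. This is a hedged toy example; `experiment_weights` is a hypothetical helper, not the production CSV-generation script.

```python
# Toy sketch of the Experiment 1/2 weighting: the southern 20% of
# hypocenters share 90% of the rupture's total probability; the
# remaining 10% is spread uniformly over the other 80%.
def experiment_weights(n_hypos, total_prob, south_frac=0.2, south_mass=0.9):
    k = max(1, int(n_hypos * south_frac))          # southern hypocenters
    south_p = total_prob * south_mass / k
    north_p = total_prob * (1.0 - south_mass) / (n_hypos - k)
    return [south_p] * k + [north_p] * (n_hypos - k)

w = experiment_weights(10, 0.05)   # 2 southern and 8 remaining hypocenters
```

The per-hypocenter weights always sum back to the rupture's total probability, matching the balance constraint described above.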
4 fault segments (South San Bernardino, North San Bernardino, South Mojave, North Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/PDE_exp1.csv PDE_exp1.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s036_exp1.csv s036_exp1.csv] : 90% modified probability site=s036, run=9402&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s776_exp1.csv s776_exp1.csv] : 90% modified probability site=s776, run=9536&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/USC_exp1.csv USC_exp1.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
==== Experiment 2 ====&lt;br /&gt;
2 middle segments (North San Bernardino and South Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s080_exp2.csv s080_exp2.csv] : 90% modified probability site=s080, run=9409&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s145_exp2.csv s145_exp2.csv] : 90% modified probability site=s145, run=9606&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s732_exp2.csv s732_exp2.csv] : 90% modified probability site=s732, run=9391&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/USC_exp2.csv USC_exp2.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
== Plots ==&lt;br /&gt;
Hazard curves are plotted for the following sites:&lt;br /&gt;
* ALP&lt;br /&gt;
* USC&lt;br /&gt;
* SBSM&lt;br /&gt;
* SVD&lt;br /&gt;
* PDE&lt;br /&gt;
* s036&lt;br /&gt;
* s080&lt;br /&gt;
* s145&lt;br /&gt;
* s732&lt;br /&gt;
* s776&lt;br /&gt;
&lt;br /&gt;
There is a plot for each site at each of the periods 2 s, 3 s, 5 s, and 10 s, for a total of 40 plots.&lt;br /&gt;
Each figure has the following plot lines:&lt;br /&gt;
* The reference using original probabilities (Ref)&lt;br /&gt;
* Where hypocenters are responsible for 90% of the probability (90p, Exp1, Exp2)&lt;br /&gt;
* USC and SBSM have an additional plot line where hypocenters are responsible for 100% (100p)&lt;br /&gt;
* 25% probability over All San Andreas Events (sa25p)&lt;br /&gt;
* 100% probability over All San Andreas Events (sa100p)&lt;br /&gt;
&lt;br /&gt;
The hazard curve plots were generated using the following bash scripts:&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_fetch_curves.sh plot_fetch_curves.sh] fetches curves from database&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_modified_curves.sh plot_modified_curves.sh] generates curves using input biases&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_comparison_curves.sh plot_comparison_curves.sh] builds comparison curves from the results of plot_fetch_curves and plot_modified_curves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
./plot_fetch_curves.sh&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/90p csv/initial_tests 90p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/100p csv/initial_tests 100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa25p csv/initial_tests/all_san_andreas sa25p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa100p csv/initial_tests/all_san_andreas sa100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp1 csv/experiment_1 exp1&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp2 csv/experiment_2 exp2&lt;br /&gt;
&lt;br /&gt;
./plot_comparison_curves.sh comps fetchdb modprob&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For the 10p, 90p, and 100p plots, the results are identical to the reference curves computed from the fetched probabilities. When considering all events across the San Andreas, we see a significant impact at PDE and ALP, some impact at SVD, and no visible significant impact at SBSM and USC.&lt;br /&gt;
&lt;br /&gt;
In our follow-up experiments, Experiment 1 (Exp1) and Experiment 2 (Exp2), we observe a significant change in ground motion (a reduction of 10%) at s776 at a period of 2 s.&lt;br /&gt;
&lt;br /&gt;
For the sake of brevity, we only show one plot per site in the highlighted plots below.&lt;br /&gt;
&lt;br /&gt;
All generated comparison plots are available at [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/comps opensha-cybershake GitHub - Comparison Plots]&lt;br /&gt;
&lt;br /&gt;
Individual plots and X,Y data are available on GitHub at [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/fetchdb fetchdb] for fetched reference data and [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/modprob modprob] for modified probability data.&lt;br /&gt;
&lt;br /&gt;
=== USC 9306 ===&lt;br /&gt;
&lt;br /&gt;
[[File:usc-t=3.png|800px|USC 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: University of Southern California (USC)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 7s (90p), 48s (100p), 1min 12s (sa25p), 1min 23s (sa100p), 6min 4s (Exp1), 48s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== ALP 9542 ===&lt;br /&gt;
[[File:alp-t=5.png|800px|ALP 5s]]&lt;br /&gt;
&lt;br /&gt;
Site: Antelope (ALP)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 12s (sa25p), 1min 29s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SBSM 9320 ===&lt;br /&gt;
[[File:sbsm-t=3.png|800px|SBSM 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: San Bernardino Strong Motion (SBSM)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 59s (90p), 1min 1s (100p), 1min 12s (sa25p), 1min 24s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SVD 9647 ===&lt;br /&gt;
[[File:svd-t=3.png|800px|SVD 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: Seven Oaks Dam (SVD)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 13s (sa25p), 1min 26s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== PDE 9663 ===&lt;br /&gt;
[[File:pde-t=10.png|800px|PDE 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: Pardee (PDE)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 21s (90p), 1min 4s (sa25p), 1min 25s (sa100p), 2min 36s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s036 9402 ===&lt;br /&gt;
[[File:s036-t=2.png|800px|s036 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s036&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 34s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s080 9409 ===&lt;br /&gt;
[[File:s080-t=2.png|800px|s080 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s080&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 28s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s145 9606 ===&lt;br /&gt;
[[File:s145-t=2.png|800px|s145 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s145&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 55s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s732 9391 ===&lt;br /&gt;
[[File:s732-t=2.png|800px|s732 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s732&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 52s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s776 9536 ===&lt;br /&gt;
[[File:s776-t=2.png|800px|s776 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s776&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 29s (Exp1)&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S776-t%3D2.png&amp;diff=30501</id>
		<title>File:S776-t=2.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S776-t%3D2.png&amp;diff=30501"/>
		<updated>2025-10-08T04:34:43Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S732-t%3D2.png&amp;diff=30500</id>
		<title>File:S732-t=2.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S732-t%3D2.png&amp;diff=30500"/>
		<updated>2025-10-08T04:34:17Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S145-t%3D2.png&amp;diff=30499</id>
		<title>File:S145-t=2.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S145-t%3D2.png&amp;diff=30499"/>
		<updated>2025-10-08T04:34:04Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S080-t%3D2.png&amp;diff=30498</id>
		<title>File:S080-t=2.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S080-t%3D2.png&amp;diff=30498"/>
		<updated>2025-10-08T04:33:25Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S036-t%3D2.png&amp;diff=30497</id>
		<title>File:S036-t=2.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S036-t%3D2.png&amp;diff=30497"/>
		<updated>2025-10-08T04:33:05Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30496</id>
		<title>Preferred Rupture Directivity in Hazard Curve Computations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30496"/>
		<updated>2025-10-08T04:32:46Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Plots */  Use t=2 for new sites&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents efforts to investigate the impact that preferred rupture directions would have on hazard as calculated with CyberShake ground motions in OpenSHA.&lt;br /&gt;
&lt;br /&gt;
== Research Plan ==&lt;br /&gt;
&lt;br /&gt;
There is some evidence that some faults have a preferred rupture direction.  To investigate the impact this could have on hazard, we'll identify a handful of faults, modify the probabilities of individual rupture variations to favor those at the preferred end of the fault, and generate new hazard products with the modified probabilities.  No new ground motions will need to be calculated; we'll use Study 22.12 and 24.8 ground motions.&lt;br /&gt;
&lt;br /&gt;
Below is a flow chart representing the steps involved in performing this work.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Preferred_Rupture_Direction_flow_chart.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Implementation Details ==&lt;br /&gt;
&lt;br /&gt;
With the UCERF2 ERF used in Study 22.12 and Study 24.8, probabilities are specified at the rupture level -- that is, for a specific fault segment(s) and magnitude.  We then divide the probability by the number of rupture variations to get the uniform probability of each rupture variation.  For this work, we will create modified rupture variation probabilities and use these to generate a modified hazard curve.&lt;br /&gt;
&lt;br /&gt;
We will add new functionality to the CyberShake-related code in OpenSHA to support this work.  OpenSHA has a defined interface to return a list of probabilities for a given source ID and rupture ID.  We'll specify the source ID, rupture ID, rupture variation ID, and probability in a CSV file which will be passed to OpenSHA.  This file will be parsed and used to populate a data structure, which will then be accessed by an implementation of the interface to determine the new probabilities.  For any source ID, rupture ID, rupture variation ID combination not in the file, we'll use the default UCERF2 probabilities.  Then the new probabilities will be used to create a hazard curve.&lt;br /&gt;
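The lookup-with-fallback scheme above can be sketched in Python. The actual implementation lives in OpenSHA's Java code; the function names here are illustrative only.

```python
import csv

# Hedged Python analogue of the OpenSHA lookup: overrides parsed from
# the CSV take precedence, anything else falls back to the default
# uniform split of the rupture probability.
def load_overrides(path):
    """Parse Source_ID,Rupture_ID,Rup_Var_ID,Probability rows into a dict."""
    overrides = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            key = (int(row["Source_ID"]), int(row["Rupture_ID"]),
                   int(row["Rup_Var_ID"]))
            overrides[key] = float(row["Probability"])
    return overrides

def rv_probability(overrides, src, rup, rv, rup_prob, n_vars):
    """Overridden probability if present, else the default uniform split."""
    return overrides.get((src, rup, rv), rup_prob / n_vars)
```

Keying the dictionary on the (source, rupture, variation) triple means any combination absent from the file transparently receives the default UCERF2-derived probability.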
&lt;br /&gt;
Modifying the HazardCurvePlotter to allow us to input custom probabilities for rupture variations allows us to compare hazard expected for persistent directivity vs random rupture direction. We can prescribe directivity based on topography or physics reasons and see if the signature is drowned out by randomness of the network or if it still stands out.&lt;br /&gt;
&lt;br /&gt;
Unlike the UCERF2 ERF in standard PSHA, CyberShake utilizes an extended ERF that supports rupture variation probabilities. Both include ruptures, but by specifying hypocenter, we're able to support directivity.&lt;br /&gt;
&lt;br /&gt;
In standard OpenSHA, the hierarchy runs from sources, which are fault segments, down to ruptures, which are collections of faults.&lt;br /&gt;
CyberShake extended ERFs add another layer, rupture variations. These identify the location of the hypocenter on the fault and define a slip-time history.&lt;br /&gt;
&lt;br /&gt;
== New Functionality ==&lt;br /&gt;
The existing OpenSHA command-line tool, HazardCurvePlotter, is able to compute hazard curves and plot them to images locally on a system. We’ve added a new parameter, &amp;lt;code&amp;gt;rv-probs-csv&amp;lt;/code&amp;gt;, for a Rupture Variation Probabilities Input CSV file.&lt;br /&gt;
&lt;br /&gt;
Passing this file allows a user to specify directivity for specific rupture variations. Previously, all variations had an equal probability, distributed from the probability of the rupture itself. The probabilities of the variations must still sum to the probability of the corresponding rupture.&lt;br /&gt;
&lt;br /&gt;
== CSV Files ==&lt;br /&gt;
&lt;br /&gt;
=== CSV Structure ===&lt;br /&gt;
The input CSV file should have the following columns.&lt;br /&gt;
* Source_ID&lt;br /&gt;
* Rupture_ID&lt;br /&gt;
* Rup_Var_ID&lt;br /&gt;
* Probability&lt;br /&gt;
&lt;br /&gt;
The Source_ID, Rupture_ID, and Rup_Var_ID columns together act as a composite key, uniquely identifying a rupture variation. Each rupture at a site is comprised of a set of rupture variations.&lt;br /&gt;
The CSV file does not need to have an entry for each variation found in the CyberShake database.&lt;br /&gt;
The remaining unspecified variations equally share the difference between the rupture probability and the sum of the specified rupture variation probabilities.&lt;br /&gt;
The formatting, structure, and sum of probabilities are validated by the HazardCurvePlotter.&lt;br /&gt;
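The redistribution and validation rule for unspecified variations can be sketched as follows. This is a hedged illustration; `fill_and_validate` is a hypothetical helper, not the HazardCurvePlotter internals.

```python
# Sketch of the redistribution rule: variations listed in the CSV keep
# their probabilities; the leftover mass is spread uniformly over the
# unlisted variations, and the total must equal the rupture probability.
def fill_and_validate(rup_prob, n_vars, specified):
    """specified: dict of Rup_Var_ID -> probability taken from the CSV."""
    leftover = rup_prob - sum(specified.values())
    if leftover < -1e-12:
        raise ValueError("specified probabilities exceed rupture probability")
    n_rest = n_vars - len(specified)
    rest_p = leftover / n_rest if n_rest else 0.0
    probs = {rv: specified.get(rv, rest_p) for rv in range(n_vars)}
    # Validation: variation probabilities must sum to the rupture probability.
    assert abs(sum(probs.values()) - rup_prob) < 1e-9
    return probs

probs = fill_and_validate(0.04, 4, {0: 0.02})  # 3 unlisted variations share 0.02
```

Specifying one variation at half the rupture probability leaves the other half to be split evenly among the three unlisted variations.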
&lt;br /&gt;
=== CSV Generation ===&lt;br /&gt;
&lt;br /&gt;
To generate the CSV files used for these tests, we created a Python script which takes in a configuration file and outputs a CSV with modified probabilities for selected rupture variations.&lt;br /&gt;
&lt;br /&gt;
The Python script is available [https://github.com/SCECcode/cybershake-core/blob/main/db/create_prop_file.py here].&lt;br /&gt;
&lt;br /&gt;
=== Inputs ===&lt;br /&gt;
The inputs take X% of hypocenters in the preferred direction and double their probability, and take X% of hypocenters in the opposite direction and set their probability to 0, keeping the overall probabilities balanced.&lt;br /&gt;
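The rebalancing described above can be sketched in Python. This is a hedged toy illustration, not the actual create_prop_file.py; the `rebalance` helper and the uniform toy probabilities are hypothetical, and the total is preserved only because the input is uniform and equal fractions are modified at both ends.

```python
# Toy sketch of the rebalancing idea (not the production script).
# Probabilities are ordered along strike from the preferred end
# to the opposite end.
def rebalance(probs, frac):
    """Double the first frac of hypocenter probabilities and zero the
    last frac, preserving the total when the input is uniform."""
    n = len(probs)
    k = int(n * frac)
    out = list(probs)
    for i in range(k):
        out[i] *= 2.0        # preferred-direction hypocenters: doubled
    for i in range(n - k, n):
        out[i] = 0.0         # opposite-direction hypocenters: zeroed
    return out

orig = [0.01] * 10           # uniform toy probabilities
mod = rebalance(orig, 0.2)   # 20% doubled, 20% zeroed
```

The doubled mass at the preferred end exactly offsets the zeroed mass at the opposite end, so the rupture's total probability is unchanged.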
&lt;br /&gt;
Below are the CSV input files used to generate the results found in the Excel spreadsheet and highlighted plots on this page.&lt;br /&gt;
&lt;br /&gt;
==== Initial Tests ====&lt;br /&gt;
Mojave:  Southeast -&amp;gt; Northwest, so the preferred hypocenters are in the southern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
Coachella: Northwest -&amp;gt; Southeast, so the preferred hypocenters are in the northern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_reference_probs.csv USC_reference_probs.csv] : Reference input with same probabilities as fetched from DB for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella.csv USC_mod_probs_Mojave_Coachella.csv] : 10% hypocenter responsible modified probabilities for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_90p.csv USC_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_100p.csv USC_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa25p.csv USC_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa100p.csv USC_mod_probs_Moj_Coach_sa100p.csv] : 100% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/ALP_mod_probs_Mojave_Coachella_90p.csv ALP_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_Moj_Coach_sa25p.csv ALP_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_all_Moj_Coach_100p.csv ALP_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_90p.csv SBSM_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_100p.csv SBSM_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_Moj_Coach_sa25p.csv SBSM_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_all_Moj_Coach_100p.csv SBSM_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SVD_mod_probs_Mojave_Coachella_90p.csv SVD_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_Moj_Coach_sa25p.csv SVD_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_all_Moj_Coach_100p.csv SVD_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/PDE_mod_probs_Mojave_Coachella_90p.csv PDE_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_Moj_Coach_sa25p.csv PDE_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_all_Moj_Coach_100p.csv PDE_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
&lt;br /&gt;
==== Experiment 1 ====&lt;br /&gt;
Both Experiment 1 and Experiment 2 have the southern 20% of hypocenters with 90% of total probability. The remaining 10% is distributed over the remaining 80% of hypocenters.&lt;br /&gt;
&lt;br /&gt;
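The Experiment 1/2 weighting can be sketched as follows. This is a hedged toy example; `experiment_weights` is a hypothetical helper, not the production CSV-generation script.

```python
# Toy sketch of the Experiment 1/2 weighting: the southern 20% of
# hypocenters share 90% of the rupture's total probability; the
# remaining 10% is spread uniformly over the other 80%.
def experiment_weights(n_hypos, total_prob, south_frac=0.2, south_mass=0.9):
    k = max(1, int(n_hypos * south_frac))          # southern hypocenters
    south_p = total_prob * south_mass / k
    north_p = total_prob * (1.0 - south_mass) / (n_hypos - k)
    return [south_p] * k + [north_p] * (n_hypos - k)

w = experiment_weights(10, 0.05)   # 2 southern and 8 remaining hypocenters
```

The per-hypocenter weights always sum back to the rupture's total probability, matching the balance constraint described above.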
4 fault segments (South San Bernardino, North San Bernardino, South Mojave, North Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/PDE_exp1.csv PDE_exp1.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s036_exp1.csv s036_exp1.csv] : 90% modified probability site=s036, run=9402&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s776_exp1.csv s776_exp1.csv] : 90% modified probability site=s776, run=9536&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/USC_exp1.csv USC_exp1.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
==== Experiment 2 ====&lt;br /&gt;
2 middle segments (North San Bernardino and South Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s080_exp2.csv s080_exp2.csv] : 90% modified probability site=s080, run=9409&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s145_exp2.csv s145_exp2.csv] : 90% modified probability site=s145, run=9606&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s732_exp2.csv s732_exp2.csv] : 90% modified probability site=s732, run=9391&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/USC_exp2.csv USC_exp2.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
== Plots ==&lt;br /&gt;
Hazard curves are plotted for the following sites:&lt;br /&gt;
* ALP&lt;br /&gt;
* USC&lt;br /&gt;
* SBSM&lt;br /&gt;
* SVD&lt;br /&gt;
* PDE&lt;br /&gt;
* s036&lt;br /&gt;
* s080&lt;br /&gt;
* s145&lt;br /&gt;
* s732&lt;br /&gt;
* s776&lt;br /&gt;
&lt;br /&gt;
There is a plot for each site at each of the periods 2 s, 3 s, 5 s, and 10 s, for a total of 40 plots.&lt;br /&gt;
Each figure has the following plot lines:&lt;br /&gt;
* The reference using original probabilities (Ref)&lt;br /&gt;
* Where hypocenters are responsible for 90% of the probability (90p, Exp1, Exp2)&lt;br /&gt;
* USC and SBSM have an additional plot line where hypocenters are responsible for 100% (100p)&lt;br /&gt;
* 25% probability over All San Andreas Events (sa25p)&lt;br /&gt;
* 100% probability over All San Andreas Events (sa100p)&lt;br /&gt;
&lt;br /&gt;
The hazard curve plots were generated using the following bash scripts:&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_fetch_curves.sh plot_fetch_curves.sh] fetches curves from database&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_modified_curves.sh plot_modified_curves.sh] generates curves using input biases&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_comparison_curves.sh plot_comparison_curves.sh] builds comparison curves from the results of plot_fetch_curves and plot_modified_curves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
./plot_fetch_curves.sh&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/90p csv/initial_tests 90p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/100p csv/initial_tests 100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa25p csv/initial_tests/all_san_andreas sa25p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa100p csv/initial_tests/all_san_andreas sa100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp1 csv/experiment_1 exp1&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp2 csv/experiment_2 exp2&lt;br /&gt;
&lt;br /&gt;
./plot_comparison_curves.sh comps fetchdb modprob&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For the 10p, 90p, and 100p plots, the results are identical to the reference curves computed from the fetched probabilities. When considering all events across the San Andreas, we see a significant impact at PDE and ALP, some impact at SVD, and no visible impact at SBSM or USC.&lt;br /&gt;
&lt;br /&gt;
In our follow-up experiments, Experiment 1 (Exp1) and Experiment 2 (Exp2), we did not observe any significant change in the hazard curves, despite selecting CyberShake sites that line up directly with the ends of the faults.&lt;br /&gt;
&lt;br /&gt;
For the sake of brevity, we only show one plot per site in the highlighted plots below.&lt;br /&gt;
&lt;br /&gt;
All generated comparison plots are available at [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/comps opensha-cybershake GitHub - Comparison Plots]&lt;br /&gt;
&lt;br /&gt;
Individual plots and X,Y data are available on GitHub at [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/fetchdb fetchdb] for fetched reference data and [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/modprob modprob] for modified probability data.&lt;br /&gt;
&lt;br /&gt;
=== USC 9306 ===&lt;br /&gt;
&lt;br /&gt;
[[File:usc-t=3.png|800px|USC 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: University of Southern California (USC)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 7s (90p), 48s (100p), 1min 12s (sa25p), 1min 23s (sa100p), 6min 4s (Exp1), 48s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== ALP 9542 ===&lt;br /&gt;
[[File:alp-t=5.png|800px|ALP 5s]]&lt;br /&gt;
&lt;br /&gt;
Site: Antelope (ALP)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 12s (sa25p), 1min 29s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SBSM 9320 ===&lt;br /&gt;
[[File:sbsm-t=3.png|800px|SBSM 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: San Bernardino Strong Motion (SBSM)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 59s (90p), 1min 1s (100p), 1min 12s (sa25p), 1min 24s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SVD 9647 ===&lt;br /&gt;
[[File:svd-t=3.png|800px|SVD 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: Seven Oaks Dam (SVD)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 13s (sa25p), 1min 26s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== PDE 9663 ===&lt;br /&gt;
[[File:pde-t=10.png|800px|PDE 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: Pardee (PDE)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 21s (90p), 1min 4s (sa25p), 1min 25s (sa100p), 2min 36s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s036 9402 ===&lt;br /&gt;
[[File:s036-t=2.png|800px|s036 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s036&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 34s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s080 9409 ===&lt;br /&gt;
[[File:s080-t=2.png|800px|s080 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s080&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 28s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s145 9606 ===&lt;br /&gt;
[[File:s145-t=2.png|800px|s145 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s145&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 55s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s732 9391 ===&lt;br /&gt;
[[File:s732-t=2.png|800px|s732 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s732&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 52s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s776 9536 ===&lt;br /&gt;
[[File:s776-t=2.png|800px|s776 2s]]&lt;br /&gt;
&lt;br /&gt;
Site: s776&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 29s (Exp1)&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Pde-t%3D10.png&amp;diff=30495</id>
		<title>File:Pde-t=10.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Pde-t%3D10.png&amp;diff=30495"/>
		<updated>2025-10-08T04:31:09Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Pde-t=10.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Svd-t%3D3.png&amp;diff=30494</id>
		<title>File:Svd-t=3.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Svd-t%3D3.png&amp;diff=30494"/>
		<updated>2025-10-08T04:30:26Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Svd-t=3.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Sbsm-t%3D3.png&amp;diff=30493</id>
		<title>File:Sbsm-t=3.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Sbsm-t%3D3.png&amp;diff=30493"/>
		<updated>2025-10-08T04:30:07Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Sbsm-t=3.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Alp-t%3D5.png&amp;diff=30492</id>
		<title>File:Alp-t=5.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Alp-t%3D5.png&amp;diff=30492"/>
		<updated>2025-10-08T04:29:43Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Alp-t=5.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Usc-t%3D3.png&amp;diff=30491</id>
		<title>File:Usc-t=3.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Usc-t%3D3.png&amp;diff=30491"/>
		<updated>2025-10-08T04:29:14Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Usc-t=3.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30490</id>
		<title>Preferred Rupture Directivity in Hazard Curve Computations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30490"/>
		<updated>2025-10-08T03:22:38Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* Plots */  Add links to generated plots on GitHub and explain new script. Remove Excel spreadsheet.&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents efforts to investigate the impact that preferred rupture directions would have on hazard as calculated with CyberShake ground motions in OpenSHA.&lt;br /&gt;
&lt;br /&gt;
== Research Plan ==&lt;br /&gt;
&lt;br /&gt;
There is some evidence that some faults have a preferred rupture direction.  To investigate the impact this could have on hazard, we'll identify a handful of faults, modify the probabilities of individual rupture variations to favor those at the preferred end of the fault, and generate new hazard products with the modified probabilities.  No new ground motions will need to be calculated; we'll use Study 22.12 and 24.8 ground motions.&lt;br /&gt;
&lt;br /&gt;
Below is a flow chart representing the steps involved in performing this work.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Preferred_Rupture_Direction_flow_chart.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Implementation Details ==&lt;br /&gt;
&lt;br /&gt;
With the UCERF2 ERF used in Study 22.12 and Study 24.8, probabilities are specified at the rupture level -- that is, for a specific fault segment(s) and magnitude.  We then divide the probability by the number of rupture variations to get the uniform probability of each rupture variation.  For this work, we will create modified rupture variation probabilities and use these to generate a modified hazard curve.&lt;br /&gt;
&lt;br /&gt;
We will add new functionality to the CyberShake-related code in OpenSHA to support this work.  OpenSHA has a defined interface to return a list of probabilities for a given source ID and rupture ID.  We'll specify the source ID, rupture ID, rupture variation ID, probability in a CSV file which will be passed to OpenSHA.  This file will be parsed and used to populate a data structure, which will then be accessed by an implementation of the interface to determine the new probabilities.  For any source ID, rupture ID, rupture variation ID combinations not in the file, we'll use the default UCERF2 probabilities.  Then the new probabilities will be used to create a hazard curve.&lt;br /&gt;
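As a rough sketch of this lookup scheme (the names, IDs, and probabilities below are hypothetical, not the actual OpenSHA code or data), the CSV can be read into a map keyed by source and rupture ID, with a fallback to the default uniform probability:&lt;br /&gt;

```python
import csv
from io import StringIO

# Hypothetical CSV contents; the IDs and probabilities are illustrative only.
CSV_TEXT = """Source_ID,Rupture_ID,Rup_Var_ID,Probability
128,1296,0,0.0002
128,1296,1,0.0
"""

# Map (source_id, rupture_id) to {rup_var_id: probability}.
overrides = {}
for row in csv.DictReader(StringIO(CSV_TEXT)):
    key = (int(row["Source_ID"]), int(row["Rupture_ID"]))
    overrides.setdefault(key, {})[int(row["Rup_Var_ID"])] = float(row["Probability"])

def rv_probability(source_id, rupture_id, rup_var_id, rupture_prob, n_variations):
    """Return the override probability if present, else the default uniform share."""
    rv_map = overrides.get((source_id, rupture_id), {})
    if rup_var_id in rv_map:
        return rv_map[rup_var_id]
    return rupture_prob / n_variations
```

Combinations absent from the CSV fall through to the uniform rupture_prob / n_variations share, mirroring the fallback to the default UCERF2 probabilities described above.&lt;br /&gt;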
&lt;br /&gt;
Modifying the HazardCurvePlotter to accept custom probabilities for rupture variations lets us compare the hazard expected under persistent directivity with the hazard expected under random rupture direction. We can prescribe directivity for topographic or physical reasons and see whether its signature still stands out or is drowned out by the randomness of the network.&lt;br /&gt;
&lt;br /&gt;
Unlike the UCERF2 ERF in standard PSHA, CyberShake utilizes an extended ERF that supports rupture variation probabilities. Both include ruptures, but by specifying hypocenters, the extended ERF can also represent directivity.&lt;br /&gt;
&lt;br /&gt;
In standard OpenSHA, the hierarchy runs from sources, which are fault segments, down to ruptures, which are collections of faults.&lt;br /&gt;
CyberShake extended ERFs add another layer, rupture variations, which identify the location of the hypocenter on the fault and define a slip-time history.&lt;br /&gt;
&lt;br /&gt;
== New Functionality ==&lt;br /&gt;
The existing OpenSHA command-line tool, HazardCurvePlotter, is able to compute hazard curves and plot them to images locally on a system. We’ve added a new parameter, &amp;lt;code&amp;gt;rv-probs-csv&amp;lt;/code&amp;gt;, for a Rupture Variation Probabilities Input CSV file.&lt;br /&gt;
&lt;br /&gt;
Passing this file allows a user to specify directivity for specific rupture variations. Previously, all variations had an equal probability, distributed from the probability of the rupture itself. The probabilities of the variations must still sum to the probability of the corresponding rupture.&lt;br /&gt;
&lt;br /&gt;
== CSV Files ==&lt;br /&gt;
&lt;br /&gt;
=== CSV Structure ===&lt;br /&gt;
The input CSV file should have the following columns.&lt;br /&gt;
* Source_ID&lt;br /&gt;
* Rupture_ID&lt;br /&gt;
* Rup_Var_ID&lt;br /&gt;
* Probability&lt;br /&gt;
&lt;br /&gt;
The specified source, rupture, and rupture variation IDs act as a composite key, uniquely identifying a rupture variation. Each rupture at a site comprises a set of rupture variations.&lt;br /&gt;
The CSV file does not need to have an entry for each variation found in the CyberShake database.&lt;br /&gt;
Any unspecified variations are given an equal share of the difference between the rupture probability and the sum of the specified rupture variation probabilities.&lt;br /&gt;
The formatting, structure, and sum of probabilities are validated by the HazardCurvePlotter.&lt;br /&gt;
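The remainder rule can be sketched as follows; this is an illustrative Python sketch of the rule described above, not the actual HazardCurvePlotter validation code:&lt;br /&gt;

```python
def fill_unspecified(rupture_prob, n_variations, specified):
    """Distribute leftover probability equally over unspecified variations.

    specified: dict mapping rup_var_id to probability, as read from the CSV.
    Illustrative only; the real validation lives in the HazardCurvePlotter.
    """
    remainder = rupture_prob - sum(specified.values())
    n_unspecified = n_variations - len(specified)
    # Each unspecified variation receives an equal share of the remainder.
    share = remainder / n_unspecified if n_unspecified else 0.0
    probs = {}
    for rv_id in range(n_variations):
        probs[rv_id] = specified.get(rv_id, share)
    return probs
```

For example, a rupture probability of 0.01 over 4 variations with variation 0 pinned to 0.004 leaves 0.002 for each of the other three, so the total still sums to 0.01.&lt;br /&gt;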
&lt;br /&gt;
=== CSV Generation ===&lt;br /&gt;
&lt;br /&gt;
To generate the CSV files used for these tests, we created a Python script which takes in a configuration file and outputs a CSV with modified probabilities for selected rupture variations.&lt;br /&gt;
&lt;br /&gt;
The Python script is available [https://github.com/SCECcode/cybershake-core/blob/main/db/create_prop_file.py here].&lt;br /&gt;
&lt;br /&gt;
=== Inputs ===&lt;br /&gt;
The inputs take X% of the hypocenters in the preferred direction and double their probability, then take X% of the hypocenters in the opposite direction and set their probability to 0, keeping the overall probabilities balanced.&lt;br /&gt;
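A minimal sketch of that rebalancing, assuming uniform starting probabilities and equally sized preferred and opposite sets (the actual create_prop_file.py script is driven by a configuration file):&lt;br /&gt;

```python
def rebalance(probs, preferred, opposite):
    """Double preferred-end hypocenter probabilities and zero out an
    equally sized set at the opposite end.

    probs: dict mapping rup_var_id to probability, uniform to start.
    preferred, opposite: equally sized lists of rup_var_ids.
    With uniform inputs, the added mass equals the removed mass,
    so the total probability is unchanged.
    """
    out = dict(probs)
    for rv in preferred:
        out[rv] = out[rv] * 2.0
    for rv in opposite:
        out[rv] = 0.0
    return out
```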
&lt;br /&gt;
Below are the CSV input files used to generate the results shown in the highlighted plots on this page.&lt;br /&gt;
&lt;br /&gt;
==== Initial Tests ====&lt;br /&gt;
Mojave:  Southeast -&amp;gt; Northwest, so the preferred hypocenters are in the southern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
Coachella: Northwest -&amp;gt; Southeast, so the preferred hypocenters are in the northern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_reference_probs.csv USC_reference_probs.csv] : Reference input with same probabilities as fetched from DB for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella.csv USC_mod_probs_Mojave_Coachella.csv] : 10% hypocenter responsible modified probabilities for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_90p.csv USC_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_100p.csv USC_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa25p.csv USC_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa100p.csv USC_mod_probs_Moj_Coach_sa100p.csv] : 100% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/ALP_mod_probs_Mojave_Coachella_90p.csv ALP_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_Moj_Coach_sa25p.csv ALP_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_all_Moj_Coach_100p.csv ALP_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_90p.csv SBSM_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_100p.csv SBSM_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_Moj_Coach_sa25p.csv SBSM_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_all_Moj_Coach_100p.csv SBSM_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SVD_mod_probs_Mojave_Coachella_90p.csv SVD_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_Moj_Coach_sa25p.csv SVD_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_all_Moj_Coach_100p.csv SVD_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/PDE_mod_probs_Mojave_Coachella_90p.csv PDE_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_Moj_Coach_sa25p.csv PDE_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_all_Moj_Coach_100p.csv PDE_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
&lt;br /&gt;
==== Experiment 1 ====&lt;br /&gt;
Both Experiment 1 and Experiment 2 assign 90% of the total probability to the southern 20% of hypocenters. The remaining 10% is distributed over the other 80% of hypocenters.&lt;br /&gt;
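Arithmetically, the split works out as in this sketch (hypothetical variation counts and IDs, not the production script):&lt;br /&gt;

```python
def experiment_probs(rupture_prob, n_variations, southern_ids):
    """Give the southern hypocenters 90% of the rupture probability and
    spread the remaining 10% evenly over the rest (illustrative sketch)."""
    southern = set(southern_ids)
    n_rest = n_variations - len(southern)
    probs = {}
    for rv in range(n_variations):
        if rv in southern:
            probs[rv] = 0.9 * rupture_prob / len(southern)
        else:
            probs[rv] = 0.1 * rupture_prob / n_rest
    return probs
```

With 10 variations and the southern 2 (20%) preferred, each southern hypocenter gets 0.9 * P / 2 and each of the other 8 gets 0.1 * P / 8, so the total is still P.&lt;br /&gt;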
&lt;br /&gt;
4 fault segments (South San Bernardino, North San Bernardino, South Mojave, North Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/PDE_exp1.csv PDE_exp1.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s036_exp1.csv s036_exp1.csv] : 90% modified probability site=s036, run=9402&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s776_exp1.csv s776_exp1.csv] : 90% modified probability site=s776, run=9536&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/USC_exp1.csv USC_exp1.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
==== Experiment 2 ====&lt;br /&gt;
2 middle segments (North San Bernardino and South Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s080_exp2.csv s080_exp2.csv] : 90% modified probability site=s080, run=9409&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s145_exp2.csv s145_exp2.csv] : 90% modified probability site=s145, run=9606&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s732_exp2.csv s732_exp2.csv] : 90% modified probability site=s732, run=9391&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/USC_exp2.csv USC_exp2.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
== Plots ==&lt;br /&gt;
The following plots show hazard curves for these sites:&lt;br /&gt;
* ALP&lt;br /&gt;
* USC&lt;br /&gt;
* SBSM&lt;br /&gt;
* SVD&lt;br /&gt;
* PDE&lt;br /&gt;
* s036&lt;br /&gt;
* s080&lt;br /&gt;
* s145&lt;br /&gt;
* s732&lt;br /&gt;
* s776&lt;br /&gt;
&lt;br /&gt;
There is a plot for each site at periods of 2, 3, 5, and 10 seconds, for a total of 40 plots.&lt;br /&gt;
Each figure has the following plot lines:&lt;br /&gt;
* The reference using original probabilities (Ref)&lt;br /&gt;
* Where hypocenters are responsible for 90% of the probability (90p, Exp1, Exp2)&lt;br /&gt;
* USC and SBSM have an additional plot line where hypocenters are responsible for 100% of the probability (100p)&lt;br /&gt;
* 25% probability over All San Andreas Events (sa25p)&lt;br /&gt;
* 100% probability over All San Andreas Events (sa100p)&lt;br /&gt;
&lt;br /&gt;
The hazard curve plots were generated using the following bash scripts:&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_fetch_curves.sh plot_fetch_curves.sh] fetches curves from database&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_modified_curves.sh plot_modified_curves.sh] generates curves using input biases&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_comparison_curves.sh plot_comparison_curves.sh] builds comparison curves from the results of plot_fetch_curves and plot_modified_curves&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
./plot_fetch_curves.sh&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/90p csv/initial_tests 90p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/100p csv/initial_tests 100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa25p csv/initial_tests/all_san_andreas sa25p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa100p csv/initial_tests/all_san_andreas sa100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp1 csv/experiment_1 exp1&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp2 csv/experiment_2 exp2&lt;br /&gt;
&lt;br /&gt;
./plot_comparison_curves.sh comps fetchdb modprob&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For the 10p, 90p, and 100p plots, the results are identical to the reference curves computed from the fetched probabilities. When considering all events across the San Andreas, we see a significant impact at PDE and ALP, some impact at SVD, and no visible impact at SBSM or USC.&lt;br /&gt;
&lt;br /&gt;
In our follow-up experiments, Experiment 1 (Exp1) and Experiment 2 (Exp2), we did not observe any significant change in the hazard curves, despite selecting CyberShake sites that line up directly with the ends of the faults.&lt;br /&gt;
&lt;br /&gt;
For the sake of brevity, we only show one plot per site in the highlighted plots below.&lt;br /&gt;
&lt;br /&gt;
All generated comparison plots are available at [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/comps opensha-cybershake GitHub - Comparison Plots]&lt;br /&gt;
&lt;br /&gt;
Individual plots and X,Y data are available on GitHub at [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/fetchdb fetchdb] for fetched reference data and [https://github.com/opensha/opensha-cybershake/tree/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/modprob modprob] for modified probability data.&lt;br /&gt;
&lt;br /&gt;
=== USC 9306 ===&lt;br /&gt;
&lt;br /&gt;
[[File:usc-t=3.png|800px|USC 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: University of Southern California (USC)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 7s (90p), 48s (100p), 1min 12s (sa25p), 1min 23s (sa100p), 6min 4s (Exp1), 48s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== ALP 9542 ===&lt;br /&gt;
[[File:alp-t=5.png|800px|ALP 5s]]&lt;br /&gt;
&lt;br /&gt;
Site: Antelope (ALP)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 12s (sa25p), 1min 29s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SBSM 9320 ===&lt;br /&gt;
[[File:sbsm-t=3.png|800px|SBSM 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: San Bernardino Strong Motion (SBSM)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 59s (90p), 1min 1s (100p), 1min 12s (sa25p), 1min 24s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SVD 9647 ===&lt;br /&gt;
[[File:svd-t=3.png|800px|SVD 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: Seven Oaks Dam (SVD)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 13s (sa25p), 1min 26s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== PDE 9663 ===&lt;br /&gt;
[[File:pde-t=10.png|800px|PDE 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: Pardee (PDE)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 21s (90p), 1min 4s (sa25p), 1min 25s (sa100p), 2min 36s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s036 9402 ===&lt;br /&gt;
[[File:s036-t=10.png|800px|s036 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s036&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 34s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s080 9409 ===&lt;br /&gt;
[[File:s080-t=10.png|800px|s080 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s080&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 28s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s145 9606 ===&lt;br /&gt;
[[File:s145-t=10.png|800px|s145 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s145&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 55s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s732 9391 ===&lt;br /&gt;
[[File:s732-t=10.png|800px|s732 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s732&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 52s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s776 9536 ===&lt;br /&gt;
[[File:s776-t=10.png|800px|s776 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s776&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 29s (Exp1)&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Hazard_Curve_Comparison_Plots.xlsx&amp;diff=30485</id>
		<title>File:Hazard Curve Comparison Plots.xlsx</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Hazard_Curve_Comparison_Plots.xlsx&amp;diff=30485"/>
		<updated>2025-10-07T06:24:11Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Hazard Curve Comparison Plots.xlsx&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30484</id>
		<title>Preferred Rupture Directivity in Hazard Curve Computations</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=Preferred_Rupture_Directivity_in_Hazard_Curve_Computations&amp;diff=30484"/>
		<updated>2025-10-07T05:35:07Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: /* s776 9536 */ Add compute time for s776&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;This page documents efforts to investigate the impact that preferred rupture directions would have on hazard as calculated with CyberShake ground motions in OpenSHA.&lt;br /&gt;
&lt;br /&gt;
== Research Plan ==&lt;br /&gt;
&lt;br /&gt;
There is some evidence that some faults have a preferred rupture direction.  To investigate the impact this could have on hazard, we'll identify a handful of faults, modify the probabilities of individual rupture variations to favor those at the preferred end of the fault, and generate new hazard products with the modified probabilities.  No new ground motions will need to be calculated; we'll use Study 22.12 and 24.8 ground motions.&lt;br /&gt;
&lt;br /&gt;
Below is a flow chart representing the steps involved in performing this work.&lt;br /&gt;
&lt;br /&gt;
{|&lt;br /&gt;
| [[File:Preferred_Rupture_Direction_flow_chart.png|thumb|600px]]&lt;br /&gt;
|}&lt;br /&gt;
&lt;br /&gt;
== Implementation Details ==&lt;br /&gt;
&lt;br /&gt;
With the UCERF2 ERF used in Study 22.12 and Study 24.8, probabilities are specified at the rupture level -- that is, for a specific fault segment(s) and magnitude.  We then divide the probability by the number of rupture variations to get the uniform probability of each rupture variation.  For this work, we will create modified rupture variation probabilities and use these to generate a modified hazard curve.&lt;br /&gt;
&lt;br /&gt;
We will add new functionality to the CyberShake-related code in OpenSHA to support this work.  OpenSHA has a defined interface to return a list of probabilities for a given source ID and rupture ID.  We'll specify the source ID, rupture ID, rupture variation ID, probability in a CSV file which will be passed to OpenSHA.  This file will be parsed and used to populate a data structure, which will then be accessed by an implementation of the interface to determine the new probabilities.  For any source ID, rupture ID, rupture variation ID combinations not in the file, we'll use the default UCERF2 probabilities.  Then the new probabilities will be used to create a hazard curve.&lt;br /&gt;
&lt;br /&gt;
Modifying the HazardCurvePlotter to accept custom probabilities for rupture variations lets us compare the hazard expected under persistent directivity with the hazard expected under random rupture direction. We can prescribe directivity for topographic or physical reasons and see whether its signature still stands out or is drowned out by the randomness of the network.&lt;br /&gt;
&lt;br /&gt;
Unlike the UCERF2 ERF in standard PSHA, CyberShake utilizes an extended ERF that supports rupture variation probabilities. Both include ruptures, but by specifying hypocenters, the extended ERF can also represent directivity.&lt;br /&gt;
&lt;br /&gt;
In standard OpenSHA, the hierarchy runs from sources, which are fault segments, down to ruptures, which are collections of faults.&lt;br /&gt;
CyberShake extended ERFs add another layer, rupture variations, which identify the location of the hypocenter on the fault and define a slip-time history.&lt;br /&gt;
&lt;br /&gt;
== New Functionality ==&lt;br /&gt;
The existing OpenSHA command-line tool, HazardCurvePlotter, is able to compute hazard curves and plot them to images locally on a system. We’ve added a new parameter, &amp;lt;code&amp;gt;rv-probs-csv&amp;lt;/code&amp;gt;, for a Rupture Variation Probabilities Input CSV file.&lt;br /&gt;
&lt;br /&gt;
Passing this file allows a user to specify directivity for specific rupture variations. Previously, all variations had an equal probability, distributed from the probability of the rupture itself. The probabilities of the variations must still sum to the probability of the corresponding rupture.&lt;br /&gt;
&lt;br /&gt;
== CSV Files ==&lt;br /&gt;
&lt;br /&gt;
=== CSV Structure ===&lt;br /&gt;
The input CSV file should have the following columns.&lt;br /&gt;
* Source_ID&lt;br /&gt;
* Rupture_ID&lt;br /&gt;
* Rup_Var_ID&lt;br /&gt;
* Probability&lt;br /&gt;
&lt;br /&gt;
The Source_ID, Rupture_ID, and Rup_Var_ID columns together act as a composite key, uniquely identifying a rupture variation. Each rupture at a site comprises a set of rupture variations.&lt;br /&gt;
The CSV file does not need to have an entry for each variation found in the CyberShake database.&lt;br /&gt;
The remaining unspecified variations are given an equal share of the difference between the rupture probability and the sum of the specified rupture variation probabilities.&lt;br /&gt;
The formatting, structure, and sum of probabilities are validated in the HazardCurvePlotter.&lt;br /&gt;
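As a sketch of the redistribution rule above (a hypothetical Python helper, not the actual HazardCurvePlotter validation code; it assumes rupture variation IDs run from 0 to N-1):&lt;br /&gt;

```python
# Hypothetical sketch of the redistribution rule; the real validation
# is implemented in the Java HazardCurvePlotter.
def redistribute(rup_prob, specified, total_variations):
    """Spread leftover probability equally over unspecified variations.

    rup_prob         -- total probability of the rupture
    specified        -- dict of Rup_Var_ID to its CSV probability
    total_variations -- number of variations the rupture has in the DB
    """
    leftover = rup_prob - sum(specified.values())
    assert leftover >= -1e-9, "specified probabilities exceed rupture probability"
    probs = dict(specified)
    unspecified = total_variations - len(specified)
    if unspecified > 0:
        share = leftover / unspecified
        for rv_id in range(total_variations):
            if rv_id not in probs:
                probs[rv_id] = share
    return probs
```

For example, a rupture with probability 0.01 and four variations, two of which are pinned at 0.004 in the CSV, leaves 0.001 each for the other two.&lt;br /&gt;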
&lt;br /&gt;
=== CSV Generation ===&lt;br /&gt;
&lt;br /&gt;
To generate the CSV files used for these tests, we created a Python script which takes in a configuration file and outputs a CSV with modified probabilities for selected rupture variations.&lt;br /&gt;
&lt;br /&gt;
The Python script is available [https://github.com/SCECcode/cybershake-core/blob/main/db/create_prop_file.py here].&lt;br /&gt;
&lt;br /&gt;
=== Inputs ===&lt;br /&gt;
The inputs take X% of the hypocenters in the preferred direction and double their probability, and take X% of the hypocenters in the opposite direction and set their probability to 0, keeping the overall probabilities balanced.&lt;br /&gt;
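The doubling/zeroing scheme can be sketched as follows (a simplified stand-in for create_prop_file.py; it assumes the rupture variation IDs are ordered along strike and all start with equal probability):&lt;br /&gt;

```python
# Sketch of the biasing scheme: double the first fraction of hypocenters
# (preferred direction) and zero the same fraction at the opposite end,
# so the rupture's total probability stays balanced. Assumes rv_ids are
# ordered along strike and all start with equal probability.
def bias_probs(rv_ids, rup_prob, fraction):
    n = len(rv_ids)
    k = int(round(n * fraction))
    base = rup_prob / n
    probs = {}
    for i, rv in enumerate(rv_ids):
        if i >= n - k:      # last k hypocenters: opposite end, zeroed
            probs[rv] = 0.0
        elif k > i:         # first k hypocenters: preferred end, doubled
            probs[rv] = 2.0 * base
        else:
            probs[rv] = base
    return probs
```

For a rupture with 10 equally likely hypocenters and probability 0.01, a fraction of 0.1 doubles the first hypocenter to 0.002 and zeroes the last, leaving the total at 0.01.&lt;br /&gt;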
&lt;br /&gt;
Below are the CSV input files used to generate the results found in the Excel spreadsheet and highlighted plots on this page.&lt;br /&gt;
&lt;br /&gt;
==== Initial Tests ====&lt;br /&gt;
Mojave:  Southeast -&amp;gt; Northwest, so the preferred hypocenters are in the southern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
Coachella: Northwest -&amp;gt; Southeast, so the preferred hypocenters are in the northern portion of the fault.&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_reference_probs.csv USC_reference_probs.csv] : Reference input with same probabilities as fetched from DB for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella.csv USC_mod_probs_Mojave_Coachella.csv] : 10% hypocenter responsible modified probabilities for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_90p.csv USC_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/USC_mod_probs_Mojave_Coachella_100p.csv USC_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa25p.csv USC_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/USC_mod_probs_Moj_Coach_sa100p.csv USC_mod_probs_Moj_Coach_sa100p.csv] : 100% modified prob, all San Andreas Events for site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/ALP_mod_probs_Mojave_Coachella_90p.csv ALP_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_Moj_Coach_sa25p.csv ALP_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/ALP_mod_probs_all_Moj_Coach_100p.csv ALP_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=ALP, run=9542&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_90p.csv SBSM_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SBSM_mod_probs_Mojave_Coachella_100p.csv SBSM_mod_probs_Mojave_Coachella_100p.csv] : 100% modified probability site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_Moj_Coach_sa25p.csv SBSM_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SBSM_mod_probs_all_Moj_Coach_100p.csv SBSM_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SBSM, run=9320&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/SVD_mod_probs_Mojave_Coachella_90p.csv SVD_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_Moj_Coach_sa25p.csv SVD_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/SVD_mod_probs_all_Moj_Coach_100p.csv SVD_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=SVD, run=9647&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/PDE_mod_probs_Mojave_Coachella_90p.csv PDE_mod_probs_Mojave_Coachella_90p.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_Moj_Coach_sa25p.csv PDE_mod_probs_Moj_Coach_sa25p.csv] : 25% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/initial_tests/all_san_andreas/PDE_mod_probs_all_Moj_Coach_100p.csv PDE_mod_probs_all_Moj_Coach_100p.csv] : 100% modified prob, all San Andreas Events for site=PDE, run=9663&lt;br /&gt;
&lt;br /&gt;
==== Experiment 1 ====&lt;br /&gt;
Both Experiment 1 and Experiment 2 have the southern 20% of hypocenters with 90% of total probability. The remaining 10% is distributed over the remaining 80% of hypocenters.&lt;br /&gt;
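As a worked example of this weighting (a hypothetical helper; the hypocenter count is illustrative):&lt;br /&gt;

```python
# Worked example of the Experiment 1/2 weighting: the southern 20% of
# hypocenters share 90% of the rupture probability; the remaining 80%
# of hypocenters share the other 10%.
def experiment_weights(n_hypos, rup_prob, south_frac=0.2, south_share=0.9):
    n_south = int(round(n_hypos * south_frac))
    south_w = south_share * rup_prob / n_south
    other_w = (1.0 - south_share) * rup_prob / (n_hypos - n_south)
    return south_w, other_w
```

With 50 hypocenters and a rupture probability of 1.0, each of the 10 southern hypocenters carries 0.09 and each of the remaining 40 carries 0.0025.&lt;br /&gt;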
&lt;br /&gt;
4 fault segments (South San Bernardino, North San Bernardino, South Mojave, North Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/PDE_exp1.csv PDE_exp1.csv] : 90% modified probability site=PDE, run=9663&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s036_exp1.csv s036_exp1.csv] : 90% modified probability site=s036, run=9402&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/s776_exp1.csv s776_exp1.csv] : 90% modified probability site=s776, run=9536&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_1/USC_exp1.csv USC_exp1.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
==== Experiment 2 ====&lt;br /&gt;
2 middle segments (North San Bernardino and South Mojave)&lt;br /&gt;
&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s080_exp2.csv s080_exp2.csv] : 90% modified probability site=s080, run=9409&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s145_exp2.csv s145_exp2.csv] : 90% modified probability site=s145, run=9606&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/s732_exp2.csv s732_exp2.csv] : 90% modified probability site=s732, run=9391&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/csv/experiment_2/USC_exp2.csv USC_exp2.csv] : 90% modified probability site=USC, run=9306&lt;br /&gt;
&lt;br /&gt;
== Plots ==&lt;br /&gt;
The following plots show curves for the following sites:&lt;br /&gt;
* ALP&lt;br /&gt;
* USC&lt;br /&gt;
* SBSM&lt;br /&gt;
* SVD&lt;br /&gt;
* PDE&lt;br /&gt;
* s036&lt;br /&gt;
* s080&lt;br /&gt;
* s145&lt;br /&gt;
* s732&lt;br /&gt;
* s776&lt;br /&gt;
&lt;br /&gt;
There is a plot for each site at periods of 2, 3, 5, and 10 seconds, for a total of 40 plots.&lt;br /&gt;
Each figure has the following plot lines:&lt;br /&gt;
* The reference using original probabilities (Ref)&lt;br /&gt;
* Where hypocenters are responsible for 90% of the probability (90p, Exp1, Exp2)&lt;br /&gt;
* USC and SBSM have an additional plot line where hypocenters are responsible for 100% (100p)&lt;br /&gt;
* 25% probability over All San Andreas Events (sa25p)&lt;br /&gt;
* 100% probability over All San Andreas Events (sa100p)&lt;br /&gt;
&lt;br /&gt;
The hazard curve plots were generated using the following bash scripts:&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_fetch_curves.sh plot_fetch_curves.sh] fetches curves from database&lt;br /&gt;
* [https://github.com/opensha/opensha-cybershake/blob/master/src/test/resources/org/opensha/sha/cybershake/plot/HazardCurvePlotter/plot_modified_curves.sh plot_modified_curves.sh] generates curves using input biases&lt;br /&gt;
&lt;br /&gt;
&amp;lt;code&amp;gt;&lt;br /&gt;
./plot_fetch_curves.sh&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/90p csv/initial_tests 90p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/100p csv/initial_tests 100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa25p csv/initial_tests/all_san_andreas sa25p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/sa100p csv/initial_tests/all_san_andreas sa100p&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp1 csv/experiment_1 exp1&lt;br /&gt;
&lt;br /&gt;
./plot_modified_curves.sh modprob/exp2 csv/experiment_2 exp2&lt;br /&gt;
&amp;lt;/code&amp;gt;&lt;br /&gt;
&lt;br /&gt;
For the 10p, 90p, and 100p plots, the results are identical to those computed from the fetched probabilities. When considering all events across the San Andreas, we see a significant impact at PDE and ALP, some impact at SVD, and no visible significant impact at SBSM and USC.&lt;br /&gt;
&lt;br /&gt;
In our follow-up experiments, Experiment 1 (Exp1) and Experiment 2 (Exp2), we did not observe any significant change in ground motion hazard. This is in spite of selecting CyberShake sites that directly line up with the ends of the faults.&lt;br /&gt;
&lt;br /&gt;
For the sake of brevity, we only show one plot per site in the highlighted plots below.&lt;br /&gt;
&lt;br /&gt;
The X,Y data points and all plots generated are available in the Excel spreadsheet : [[File:Hazard_Curve_Comparison_Plots.xlsx]]&lt;br /&gt;
&lt;br /&gt;
Comparison plots were made by copying the X,Y data points from the TXT output into Excel. We may expand the existing OpenSHA comparison-plot functionality for this purpose; currently we're able to compare a new run against results fetched from a database, but not multiple modified runs.&lt;br /&gt;
&lt;br /&gt;
=== USC 9306 ===&lt;br /&gt;
&lt;br /&gt;
[[File:usc-t=3.png|800px|USC 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: University of Southern California (USC)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 7s (90p), 48s (100p), 1min 12s (sa25p), 1min 23s (sa100p), 6min 4s (Exp1), 48s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== ALP 9542 ===&lt;br /&gt;
[[File:alp-t=5.png|800px|ALP 5s]]&lt;br /&gt;
&lt;br /&gt;
Site: Antelope (ALP)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 12s (sa25p), 1min 29s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SBSM 9320 ===&lt;br /&gt;
[[File:sbsm-t=3.png|800px|SBSM 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: San Bernardino Strong Motion (SBSM)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 59s (90p), 1min 1s (100p), 1min 12s (sa25p), 1min 24s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== SVD 9647 ===&lt;br /&gt;
[[File:svd-t=3.png|800px|SVD 3s]]&lt;br /&gt;
&lt;br /&gt;
Site: Seven Oaks Dam (SVD)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 16s (90p), 1min 13s (sa25p), 1min 26s (sa100p)&lt;br /&gt;
&lt;br /&gt;
=== PDE 9663 ===&lt;br /&gt;
[[File:pde-t=10.png|800px|PDE 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: Pardee (PDE)&lt;br /&gt;
&lt;br /&gt;
Compute Time: 1min 21s (90p), 1min 4s (sa25p), 1min 25s (sa100p), 2min 36s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s036 9402 ===&lt;br /&gt;
[[File:s036-t=10.png|800px|s036 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s036&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 34s (Exp1)&lt;br /&gt;
&lt;br /&gt;
=== s080 9409 ===&lt;br /&gt;
[[File:s080-t=10.png|800px|s080 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s080&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 28s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s145 9606 ===&lt;br /&gt;
[[File:s145-t=10.png|800px|s145 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s145&lt;br /&gt;
&lt;br /&gt;
Compute Time: 5min 55s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s732 9391 ===&lt;br /&gt;
[[File:s732-t=10.png|800px|s732 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s732&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 52s (Exp2)&lt;br /&gt;
&lt;br /&gt;
=== s776 9536 ===&lt;br /&gt;
[[File:s776-t=10.png|800px|s776 10s]]&lt;br /&gt;
&lt;br /&gt;
Site: s776&lt;br /&gt;
&lt;br /&gt;
Compute Time: 7min 29s (Exp1)&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S776-t%3D10.png&amp;diff=30483</id>
		<title>File:S776-t=10.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S776-t%3D10.png&amp;diff=30483"/>
		<updated>2025-10-07T05:34:08Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S732-t%3D10.png&amp;diff=30482</id>
		<title>File:S732-t=10.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S732-t%3D10.png&amp;diff=30482"/>
		<updated>2025-10-07T05:33:55Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S145-t%3D10.png&amp;diff=30481</id>
		<title>File:S145-t=10.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S145-t%3D10.png&amp;diff=30481"/>
		<updated>2025-10-07T05:33:40Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S080-t%3D10.png&amp;diff=30480</id>
		<title>File:S080-t=10.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S080-t%3D10.png&amp;diff=30480"/>
		<updated>2025-10-07T05:33:25Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:S036-t%3D10.png&amp;diff=30479</id>
		<title>File:S036-t=10.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:S036-t%3D10.png&amp;diff=30479"/>
		<updated>2025-10-07T05:33:05Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Pde-t%3D10.png&amp;diff=30478</id>
		<title>File:Pde-t=10.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Pde-t%3D10.png&amp;diff=30478"/>
		<updated>2025-10-07T05:31:23Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Pde-t=10.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
	<entry>
		<id>https://strike.scec.org/scecwiki/index.php?title=File:Usc-t%3D3.png&amp;diff=30477</id>
		<title>File:Usc-t=3.png</title>
		<link rel="alternate" type="text/html" href="https://strike.scec.org/scecwiki/index.php?title=File:Usc-t%3D3.png&amp;diff=30477"/>
		<updated>2025-10-07T05:29:50Z</updated>

		<summary type="html">&lt;p&gt;Bhatthal: Bhatthal uploaded a new version of File:Usc-t=3.png&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;/div&gt;</summary>
		<author><name>Bhatthal</name></author>
		
	</entry>
</feed>