CSEP 2.0 Developments
From SCECpedia
Latest revision as of 23:37, 28 June 2018
Proposed software developments needed to test UCERF3 -- CSEP2.0 as a workflow
- abstract goes here.
- summary statement of proposal
- clearly state proposed plans for putting UCERF3 under test
- [under development until remainder of proposal is written]
Introduction
- state reasons for transitioning from CSEP1 -> CSEP2
- introduce technical/scientific challenges of testing UCERF3-ETAS
- introduce the csep problem as a workflow
- describe remaining software infrastructure
- 'on-demand' prospective/retrospective experiments
- Guiding Principles
- Controlled Environment
- Transparency
- Comparability
- Reproducibility
Testing the Next Generation of Earthquake Forecasts
- cover technical/scientific challenges of testing UCERF3-ETAS
- fault and stochastic event sets
- potential models under forecast
- make point that software developments are required to test CSEP2.0
- software developments
- first step to evaluating a forecast is to compute it
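The bullets above note that the first step in evaluating a forecast is to compute it. For a simulation-based model such as UCERF3-ETAS, one common reduction is to average event counts over many stochastic catalogs to obtain a gridded rate forecast. Below is a minimal sketch of that reduction; the function name, bin scheme, and toy catalogs are illustrative only and not part of any CSEP code.

```python
from collections import Counter

def gridded_rates(catalogs, bin_size=0.1):
    """Average per-bin event counts over stochastic event sets.

    catalogs: list of catalogs; each catalog is a list of (lon, lat) epicenters.
    Returns a dict mapping (lon_bin, lat_bin) -> expected events per catalog.
    """
    counts = Counter()
    for catalog in catalogs:
        for lon, lat in catalog:
            # Assign each event to a spatial bin by integer division.
            key = (int(lon // bin_size), int(lat // bin_size))
            counts[key] += 1
    n_catalogs = len(catalogs)
    return {key: count / n_catalogs for key, count in counts.items()}

# Two toy stochastic catalogs over the same small region.
cats = [
    [(-116.05, 34.02), (-116.07, 34.01)],
    [(-116.04, 34.03)],
]
rates = gridded_rates(cats, bin_size=0.1)
```

The per-bin values sum to the mean number of events per catalog, so the reduced forecast conserves the simulations' total rate.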
CSEP2.0 as a Workflow
- proposed software and hardware to test CSEP2.0
- describe tech used to achieve guiding principles
- controlled environment -> containers
- transparency -> web application (results viewer)
- comparability -> open-sourced evaluations
- reproducibility -> containers and workflow
- decoupling of forecasts and evaluations
- proposal to open-source experiments to community
- versioning of catalog data
- introduce the minimum viable product to run OpenSHA from a container in Pegasus
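The container step above can be sketched as building the `docker run` invocation that would launch one forecast job inside a controlled environment. The image name, mount paths, and launcher command below are placeholders, not the actual OpenSHA or Pegasus configuration.

```python
import shlex

def docker_run_argv(image, command, mounts=None):
    """Build an argv list that runs one workflow step inside a container."""
    argv = ["docker", "run", "--rm"]
    for host_path, container_path in (mounts or {}).items():
        # Bind-mount input/output directories so results survive the container.
        argv += ["-v", f"{host_path}:{container_path}"]
    argv.append(image)
    argv += shlex.split(command)
    return argv

# Hypothetical image and launcher names, for illustration only.
argv = docker_run_argv(
    "scec/ucerf3-etas:latest",
    "u3etas_launcher --config /inputs/config.json",
    mounts={"/data/run1": "/inputs"},
)
# A workflow engine such as Pegasus would schedule this argv as one job.
```

Because every job runs from the same pinned image, the same workflow description replayed later reproduces the same environment, which is the link between the containers and the reproducibility principle above.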
Proposed Scientific Questions for Testing New Forecasting Models
- do faults provide improved forecasting skill?
- retrospective experiments for past earthquakes
- El Mayor-Cucapah
- Landers
- Napa Valley
- Northridge (unknown fault... hinterlands)
- simple versus complex models
- comparison against CSEP1 simulations (use case for the CSEP1 dataset)
- need a method for comparing rate-based forecasts with fault-based, simulation-based forecasts
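One simple common ground for comparing a rate-based forecast with a simulation-based one is to reduce both to expected counts per spatial bin and score each against the same observed catalog with a joint Poisson log-likelihood. This is only a sketch of that idea, not the method proposed here; all bin keys and numbers are toy values.

```python
import math

def poisson_log_likelihood(expected, observed):
    """Joint Poisson log-likelihood of observed bin counts given expected rates.

    expected, observed: dicts mapping spatial-bin keys to rates / counts.
    Bins missing from `expected` get a small floor rate to avoid log(0).
    """
    floor = 1e-10
    score = 0.0
    for key in set(expected) | set(observed):
        lam = max(expected.get(key, 0.0), floor)
        n = observed.get(key, 0)
        # log Poisson pmf: n*log(lam) - lam - log(n!)
        score += n * math.log(lam) - lam - math.lgamma(n + 1)
    return score

# Toy example: a rate-based forecast vs. rates reduced from simulations,
# both scored against the same observed bin counts.
observed = {"A": 2, "B": 0}
rate_based = {"A": 1.5, "B": 0.5}
sim_based = {"A": 2.2, "B": 0.1}
scores = {"rate": poisson_log_likelihood(rate_based, observed),
          "sim": poisson_log_likelihood(sim_based, observed)}
```

A higher score means the observed counts are more probable under that forecast; in this toy case the simulation-derived rates fit the observations better.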
Roadmap toward testing UCERF3
- describe specific steps required for testing UCERF3
- essentially, the previous sections rewritten as actionable statements
- include timeline
Conclusion
- wrap up proposal with brief paragraph reiterating main points