CSEP Minutes 06-27-2018
Revision as of 20:28, 27 June 2018
Participants: M. Werner, D. Rhoades, and W. Savran
Minutes
- extract daily rates for each forecast and plot to verify integrity (first step)
- verify where the W, T, and R tests are not counted over the entire time period
- decipher the timestamp on the evaluation file
- how does the system put together the cumulative tests?
- are there cumulative analogs for each evaluation?
- curated CSEP 1.0 data might be needed to compare against CSEP 2.0 forecasts
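The first action item above (extract daily rates per forecast and plot them as an integrity check) could be sketched roughly as below. The CSV layout with `date` and `rate` columns is an assumption for illustration; the actual CSEP forecast file format is not specified in these minutes.

```python
import csv
from collections import defaultdict

def daily_rates(forecast_csv_path):
    """Sum per-cell expected rates by forecast day.

    Assumes a hypothetical CSV layout with 'date' (YYYY-MM-DD) and
    'rate' columns -- the real CSEP forecast format may differ.
    """
    totals = defaultdict(float)
    with open(forecast_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["date"]] += float(row["rate"])
    # Plotting the sorted daily series (e.g. with matplotlib) then
    # makes gaps or implausible jumps easy to spot by eye.
    return dict(sorted(totals.items()))
```

Plotting the returned series against the archived CSEP 1.0 results would give the quick first-pass integrity check described above.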
- verification exercises:
- overlapping window: for ETAS, extract the first month of forecasts, observed catalogs, and evaluations
- RELM tests (N, L, CL, M, S)
- start at the beginning of the testing period to verify the cumulative tests
- write scripts for this
- scripts to extract evaluation results
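Of the RELM consistency tests listed above, the N (number) test is the simplest to sketch: it compares the observed event count against the forecast's expected count, assuming a Poisson-distributed number of events. This is a minimal illustration, not the production CSEP implementation.

```python
import math

def poisson_cdf(n, mu):
    # P(N <= n) for a Poisson distribution with mean mu
    return math.exp(-mu) * sum(mu**k / math.factorial(k) for k in range(n + 1))

def n_test(n_obs, n_forecast):
    """RELM-style number test: quantile scores for the observed
    event count n_obs against the forecast's expected count."""
    delta1 = 1.0 - poisson_cdf(n_obs - 1, n_forecast)  # P(N >= n_obs)
    delta2 = poisson_cdf(n_obs, n_forecast)            # P(N <= n_obs)
    return delta1, delta2
```

A forecast is rejected at a chosen significance level when either quantile is too small, i.e. when the observed count is implausibly high or low under the forecast.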
- journal for publishing geoscience data (https://www.earth-system-science-data.net/)
- check existence of the catalog data
- we would need to flag evaluations where the catalog could not be downloaded
- competition for fault-based forecasts?
- some hazard models have the notion of seismic regions
- New Zealand has the issue of faults
- UCERF3 is the most advanced
- WGCEP 88 report model (D. Jackson paper in the SRL issue; a good example for assessing a fault-based source model)