CTC Results

CISN Testing Center evaluation results are presented as reference performance metrics. An important goal of a collaborative testing capability is the ability to compare equivalent results.

Testing Center Design Goals

CTC is designed to meet the scientific forecast testing goals described in Schorlemmer and Gerstenberger (2005):

  1. Transparency
  2. Controlled Environment
  3. Comparability
  4. Reproducibility

For transparency, CTC evaluation summaries are identified with a CTC software version. When a CTC result is presented, all input data sets used in the calculation, all authorized data sets (observational data), and all source code are accessible to researchers, so every calculation can be checked.
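
As a concrete illustration, the sketch below shows one way such a provenance record could be assembled: a summary that pairs an assumed CTC version string with a SHA-256 checksum of every input file, so a reader can confirm they are rechecking the same calculation. The file names, version string, and function names are hypothetical and not taken from the actual CTC code base.

  # Hypothetical provenance sketch; names, paths, and the version string
  # are illustrative, not part of the actual CTC software.
  import hashlib
  import json
  from pathlib import Path

  CTC_VERSION = "1.0.0"  # assumed software version identifier

  def sha256_of(path):
      # Hash the file in chunks so large data sets are handled safely.
      digest = hashlib.sha256()
      with Path(path).open("rb") as f:
          for chunk in iter(lambda: f.read(65536), b""):
              digest.update(chunk)
      return digest.hexdigest()

  def build_manifest(input_files):
      # Record the software version and a checksum for every input,
      # so any published calculation can be independently rechecked.
      return {
          "ctc_version": CTC_VERSION,
          "inputs": {str(p): sha256_of(p) for p in input_files},
      }

  inputs = [Path("forecast.xml"), Path("anss_catalog.csv")]
  for p in inputs:
      p.write_text("placeholder data\n")  # stand-ins so the sketch runs
  print(json.dumps(build_manifest(inputs), indent=2))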

For a controlled environment, CTC evaluation summaries are based on EEW forecasts created during earthquake processing and transmitted to the CTC testing center with no human intervention. Performance summaries are produced by the CTC testing group, independent of the algorithm developers. The observational data used are obtained from approved and authorized data sources (the ANSS catalog and ShakeMap).
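
A minimal sketch of this idea, assuming hypothetical source names and a caller-supplied fetch function: observational data may only be loaded from an explicit allow-list of authorized sources, so an evaluation cannot quietly depend on unapproved observations.

  # Allow-list sketch; the source names and fetch callback are assumptions.
  AUTHORIZED_SOURCES = {"ANSS_CATALOG", "SHAKEMAP"}

  def load_observations(source, fetch):
      # Refuse any data source that has not been approved in advance.
      if source not in AUTHORIZED_SOURCES:
          raise ValueError(f"{source!r} is not an authorized data source")
      return fetch(source)

  print(load_observations("ANSS_CATALOG", fetch=lambda s: f"records from {s}"))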

For comparability, CTC evaluation summaries integrate information from multiple algorithms, producing comparable performance information for different algorithms for the same earthquakes.
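
The sketch below illustrates the kind of join this implies, grouping per-algorithm performance records by earthquake so algorithms can be read side by side; the record fields, event IDs, and scores are made up for the example.

  # Illustrative join of per-algorithm records; event IDs, algorithm
  # names, and scores are invented for the example.
  from collections import defaultdict

  def combine_by_event(records):
      # Group {event_id, algorithm, score} records by earthquake so each
      # event's algorithms appear together in one comparable row.
      table = defaultdict(dict)
      for r in records:
          table[r["event_id"]][r["algorithm"]] = r["score"]
      return dict(table)

  records = [
      {"event_id": "ci12345", "algorithm": "ElarmS", "score": 0.82},
      {"event_id": "ci12345", "algorithm": "OnSite", "score": 0.77},
  ]
  print(combine_by_event(records))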

For reproducibility, CTC results can be reproduced by retrieving the CTC software version used to produce the summaries. The CTC software can be run interactively, which enables any user to reproduce CTC results.
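
A hedged sketch of that check, with a hypothetical version constant, summary layout, and evaluation callback: before rerunning, confirm that the locally installed software matches the version recorded with the published summary.

  # Version-pinning sketch; the version constant, summary layout, and
  # evaluation callback are hypothetical.
  CTC_VERSION = "1.0.0"  # version of the locally installed CTC software

  def reproduce(summary, run_evaluation):
      # Rerun an evaluation only when the local software matches the
      # version recorded alongside the published summary.
      recorded = summary["ctc_version"]
      if recorded != CTC_VERSION:
          raise RuntimeError(
              f"install CTC {recorded} to reproduce this result "
              f"(running {CTC_VERSION})"
          )
      return run_evaluation(summary["inputs"])

  summary = {"ctc_version": "1.0.0", "inputs": ["forecast.xml"]}
  print(reproduce(summary, run_evaluation=lambda ins: f"re-ran with {ins}"))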

CTC Results

See Also