Production Run Readiness Review


Prior to production research simulations, we conduct scientific and computational readiness reviews. Outlines of the material we cover during each review are shown below.

Each production run should be defined as a Study, meaning that we have sufficient detail to identify the total output files. When we think we are ready to start a study, we conduct two reviews:

Scientific Readiness Review

  1. Goal of study:
  • Expected outcome
  • Statement of why this is a priority
  2. Problem statement inputs:
  • ERF
  • CVM
  • Rupture variations (rupture generator)
  • Frequencies of simulation
  • SGT codes
  • Sites to be calculated
  3. Outputs:
  • Identification of expected outputs
  • Types of output files and results
  • Number of each
  • Expected size of each (see the sizing sketch after this outline)
  • Existing partial results
  4. Verification/Validation Work:
  • Evidence that the computational system produces correct answers

  5. Reproducibility of results:
  • What we have done to implement reproducibility of results (a manifest sketch follows this outline)
  • What is required to re-run this study 1 year from now?

  6. Expected analysis:
  • What analysis will be conducted with the results?
  • Who will help with the scientific analysis?

  7. Metrics Review:
  • What metrics will be calculated?

  8. Open Issues/Risk Analysis
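
The output sizing in item 3 reduces to simple arithmetic once the per-site counts and per-file sizes are known. The sketch below shows the calculation with hypothetical placeholder numbers; they are not measured CyberShake values and should be replaced with the actual study parameters before a review.

<pre>
# Minimal sizing sketch for item 3. Every number is a hypothetical
# placeholder; substitute the actual study parameters.
num_sites = 300                  # sites to be calculated
variations_per_site = 400_000    # rupture variations per site
seismogram_bytes = 48_000        # size of one seismogram file
psa_bytes = 1_500                # size of one intensity-measure record

seismograms = num_sites * variations_per_site
total_bytes = seismograms * (seismogram_bytes + psa_bytes)
print(f"Expected seismograms: {seismograms:,}")
print(f"Expected output size: {total_bytes / 1e12:.1f} TB")
</pre>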
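For the reproducibility questions in item 5, one concrete practice is to record a machine-readable manifest of the code versions and inputs when the study starts. The sketch below is illustrative only; the field names and values are hypothetical, not an existing CyberShake format.

<pre>
# Sketch of a study manifest for item 5. Recording the exact code versions
# and inputs is one way to make a re-run possible a year later. All field
# names and values below are hypothetical examples.
import json

manifest = {
    "study": "example_study",          # hypothetical study name
    "cybershake_tag": "v1.0-example",  # tagged codebase version used
    "erf": "ERF-example",              # earthquake rupture forecast
    "cvm": "CVM-example",              # community velocity model
    "sites_file": "sites.txt",         # list of sites calculated
    "max_frequency_hz": 1.0,           # maximum simulated frequency
}

with open("study_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
</pre>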

Computational Readiness Review

  1. Computational goal for study:
  • Description of deliverable outputs

  2. Total input files and parameters:
  • Input files and sizes

  3. Total output files:
  • Expected output files in number and size
  • Identification of files that will be archived

  4. Study archives, both files and database:
  • Where will file-based archives be located? How much storage is needed? Confirm we have the storage.
  • Where will database-based archives be located? What is the current storage, and how much new storage is expected to be needed?
  • How long will the archives be saved?

  5. User access to results:
  • How will users access database archives?
  • How will users access file-based archives?

  6. Description of expected computing environment:
  • What system?
  • What version of Pegasus?
  • Description of codes to be used
  • Description of workflow environment
  • What versions of PMC?
  • What tagged version of the CyberShake codebase?

  7. Estimated computational requirements for study:
  • Do we have computing time available?
  • Comparison of available computing time to estimated computing time needed (see the budget sketch at the end of this outline)

  8. Study start and duration estimates

  9. Personnel involved in running the study

  10. Reproducibility of results

  11. Metrics Review:
  • Computational metrics that will be collected

  12. Open Issues/Risk Analysis
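
The computing-time comparison in item 7 can be expressed as a quick budget check. The figures below are hypothetical placeholders, not measured CyberShake costs; substitute the study's actual per-site estimate and the time remaining in the allocation.

<pre>
# Minimal budget check for item 7. All figures are hypothetical
# placeholders, not measured CyberShake costs.
num_sites = 300                  # sites to be calculated
node_hours_per_site = 1_200      # SGT + post-processing cost per site
allocation_node_hours = 500_000  # time remaining in the allocation

estimated = num_sites * node_hours_per_site
margin = allocation_node_hours - estimated
print(f"Estimated need: {estimated:,} node-hours")
print(f"Available:      {allocation_node_hours:,} node-hours")
if margin < 0:
    print(f"Shortfall of {-margin:,} node-hours; request more time first.")
else:
    print(f"Headroom of {margin:,} node-hours ({margin / estimated:.0%}).")
</pre>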