CyberShake Testing
Latest revision as of 23:42, 21 February 2011
The computational scale and complexity of the SCEC CyberShake system require automated and repeatable system-level testing capabilities. The CyberShake testing must be capable of end-to-end testing, showing that all elements (inputs, earth models, computational codes, and data processing and reduction codes) work together.
The CyberShake Testing system combines a distributed workflow-based HPC software testing harness together with a database of reference problems and expected solutions.
CyberShake Testing Requirements
- Must support a broad range of HPC codes, including calculations too large to run on SCEC computers
- Must perform end-to-end calculations, and be capable of performing a series of chained calculations
- Must support multiple test evaluations, including file-based comparisons, integer comparisons, floating-point comparisons with tolerance, and relational database entry comparisons
- Must be modular, capable of testing the performance of alternative codes on equivalent calculations
- Must support and help manage a test repository that contains reference problems and reference results
- Must support provisioning of, and job submission to, multiple HPC resource providers including USC HPCC, TeraGrid, and possibly DOE computing resources
- Must provide a well-defined metadata description of every test result that describes the code under test, the input parameters used, the reference results used in comparisons, and the final test results
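The tolerance-based and file-based evaluations in the requirements above can be sketched as simple comparison helpers. This is an illustrative sketch, not actual CyberShake code; the function names and tolerance defaults are assumptions.

```python
import math

def compare_values(actual, expected, rel_tol=1e-6, abs_tol=0.0):
    """Exact comparison for integers, tolerance comparison for floats."""
    if isinstance(expected, int) and isinstance(actual, int):
        return actual == expected
    # Relative/absolute tolerance check for floating-point results.
    return math.isclose(actual, expected, rel_tol=rel_tol, abs_tol=abs_tol)

def compare_files(actual_path, expected_path):
    """Byte-for-byte file comparison for file-based test evaluations."""
    with open(actual_path, "rb") as a, open(expected_path, "rb") as b:
        return a.read() == b.read()
```

A relational-database entry comparison would follow the same pattern, fetching a row by key and applying `compare_values` field by field.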
Required Evaluation Tests
- Rupture Generator
- SGT Calculation
- Mesh Maker
- Distance Calculation
- Site-Rupture Set Determination
- Ten moderate earthquakes distributed around California
- List of Sites
CyberShake Automated Testing Framework
CyberShake 2.0 development requires a workflow-based system capable of automating multiple CyberShake HPC calculations.
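The idea of automating multiple chained HPC calculations can be illustrated as a minimal staged pipeline, where each stage consumes the previous stage's output, as a workflow engine would arrange. The stage names are hypothetical placeholders borrowed from the evaluation-test list above, not the real CyberShake interfaces.

```python
# Hypothetical stand-ins for two CyberShake stages.
def mesh_maker(site):
    return f"mesh({site})"

def sgt_calculation(mesh):
    return f"sgt({mesh})"

def run_workflow(initial_input, stages):
    """Run a linear chain of stages, feeding each output to the next."""
    data = initial_input
    for stage in stages:
        data = stage(data)
    return data
```

In practice a workflow system such as Pegasus expresses these dependencies as a DAG and handles staging and job submission to remote HPC resources.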
In 2011, Wikipedia described a test harness in the following terms: "In software testing, a test harness or automated test framework is a collection of software and test data configured to test a program unit by running it under varying conditions and monitoring its behavior and outputs. It has two main parts: the test execution engine and the test script repository.

Test harnesses allow for the automation of tests. They can call functions with supplied parameters and print out and compare the results to the desired value. The test harness is a hook to the developed code, which can be tested using an automation framework.

A test harness should allow specific tests to run (this helps in optimising), orchestrate a runtime environment, and provide a capability to analyse results."

An alternative definition of a test harness is software constructed to facilitate integration testing.
The typical objectives of a test harness are to:
- Automate the testing process.
- Execute test suites of test cases.
- Generate associated test reports.
A test harness may provide some of the following benefits:
- Increased probability that regression testing will occur.
- Assurance that subsequent test runs are exact duplicates of previous ones.
- Increased productivity due to automation of the testing process.
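The two parts named above, a test execution engine and a test script repository, can be sketched in a few lines. All names here are illustrative assumptions, not CyberShake code: the repository holds the function under test, its parameters, and the expected value, and the engine produces a pass/fail report.

```python
# Hypothetical function under test.
def square(x):
    return x * x

# Test script repository: each entry supplies a callable, its
# parameters, and the expected result.
TEST_CASES = [
    {"name": "square_of_3", "func": square, "args": (3,), "expected": 9},
    {"name": "square_of_4", "func": square, "args": (4,), "expected": 16},
]

def run_suite(cases):
    """Test execution engine: run every case, compare against the
    expected value, and generate a (name, passed) report."""
    report = []
    for case in cases:
        result = case["func"](*case["args"])
        report.append((case["name"], result == case["expected"]))
    return report
```

For CyberShake the "functions" would instead be workflow jobs on HPC resources, but the engine/repository split is the same.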
CyberShake Test Oracle
We require a reference database that describes specific test problems and their input and output files, and that defines the expected results for each problem.
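One way to picture such an oracle is a keyed collection of reference-problem records, each listing input files, output files, and expected results. This is a hypothetical sketch of the record structure only; the problem identifiers, file names, and fields are invented for illustration.

```python
# Hypothetical test-oracle records keyed by reference-problem id.
REFERENCE_PROBLEMS = {
    "rupture_generator_basic": {
        "inputs": ["rupture_params.txt"],
        "outputs": ["rupture_variation.bin"],
        "expected": {"num_variations": 12},
    },
}

def lookup_expected(problem_id):
    """Return the expected results for a reference problem, or None
    if the problem is not in the oracle."""
    record = REFERENCE_PROBLEMS.get(problem_id)
    return record["expected"] if record else None
```

In the real system these records would live in a relational database so that test-result metadata (code under test, input parameters, reference results) can be stored alongside them, per the requirements above.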
References
- Maechling, P., Deelman, E., Cui, Y. (2009), Implementing Software Acceptance Tests as Scientific Workflows, Proceedings of the International Conference on Parallel and Distributed Processing Techniques and Applications, Hamid R. Arabnia (Ed.): PDPTA 2009, Las Vegas, Nevada, USA, July 13-17, 2009, 2 Volumes. CSREA Press 2009, ISBN 1-60132-123-6, pp. 317-323