Difference between revisions of "Release Planning"

From SCECpedia
* ssh -o "ServerAliveInterval 60" username@discovery.usc.edu
* cd /project/scec_608/maechlin/dev
* git clone https://github.com/sceccode/ucvm.git
* adding codemeta.json file.
* add Contributors.MD file.
Latest revision as of 06:09, 6 November 2021

Fork and Pull Development

Using the fork-and-pull method, start by forking sceccode/ucvm.git into a personal repo. Then, for development on CARC, clone the personal fork, pjmaechling/ucvm.git. Set the git upstream to the original repo, sceccode/ucvm.git, to keep in sync with it. Follow the instructions for setting the upstream here:
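The setup just described can be sketched with git commands. To keep the sketch self-contained and runnable, local bare repositories stand in below for the GitHub repos; on CARC you would use the real https://github.com/... URLs for the fork and for sceccode/ucvm.git.

```shell
# Fork-and-pull remote layout sketch. "upstream.git" stands in for the
# shared sceccode/ucvm.git repo, "fork.git" for your personal fork.
set -e
work=$(mktemp -d) && cd "$work"
git init -q --bare upstream.git          # stand-in for sceccode/ucvm.git
git init -q --bare fork.git              # stand-in for <username>/ucvm.git

git clone -q fork.git ucvm && cd ucvm    # clone your personal fork
git remote add upstream "$work/upstream.git"  # track the shared repo
git fetch -q upstream                    # keep in sync with upstream
git remote -v                            # origin -> fork, upstream -> shared
```

With this layout, `git fetch upstream` pulls the shared repo's history into your working clone without pushing anything to it.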

Development on CARC

Compiling

  • Source directory: /project/scec_608/maechlin/dev/ucvm
  • Installation directory: /project/scec_608/maechlin/ucvm_bin


When someone is looking at your project, they want to know:

  1. What is it?
  2. How good is the code?
  3. How much support is available?
  4. What’s included?
  5. What does it look like?
  6. How do I set it up?

Science Code Manifesto Elements:

  1. Code
  2. Copyright
  3. Citation
  4. Credit
  5. Curation


Steps To Software Product:

  1. Create a citable, definitive version of the software with a DOI, license, and repository.
  2. Define a reference publication used to cite the software.
  3. Define the software as a reference implementation of a method, and define a set of approved acceptance/regression tests that can be used to establish that a software implementation realizes that “method”.
  4. Create a software maintenance organization with commit authority for pull requests, an approval process for change requests, and a process for approving new releases.
  5. Establish a software community through registrations, newsletters, activity, regular calls, and regular meetings, and define the community and its roles.

Adoption of Fork and Pull Git Repo Model

  • Use the model used by the majority of open-source projects (including pyCSEP).
  • The “maintainer” of the shared repo assigns rights to “Collaborators”
  • Collaborators do not have push access to main (upstream) repo
  • The core development team accepts pull requests (PRs) from collaborators, reviews them, then merges them into the main repo

Contributor Process:

Working with shared projects on GitHub

  1. Fork the repository
  2. Clone your forked copy
  3. Sync your personal repo with shared repo
  4. Git merge/git rebase
  5. Make a contribution
  6. Pull request
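Steps 3–6 above can be sketched as a runnable git sequence. A local repository is used here so the sketch is self-contained; on GitHub, "main" would be the shared repo's branch tracked via the upstream remote, and the final push would go to your fork.

```shell
# Contributor-workflow sketch: branch, commit, rebase, then push for a PR.
set -e
cd "$(mktemp -d)"
git init -q -b main .
git config user.email "you@example.com" && git config user.name "You"
echo "base" > file.txt && git add file.txt && git commit -qm "base commit"

git checkout -q -b my-feature          # work on a feature branch
echo "change" >> file.txt
git commit -qam "make a contribution"  # step 5: make a contribution
# step 3 on GitHub would be: git fetch upstream
git rebase -q main                     # step 4: rebase onto the latest main
# step 6: push the branch and open a pull request, e.g.:
#   git push -u origin my-feature
```

Rebasing before pushing keeps the feature branch's history linear on top of the shared branch, which simplifies the maintainer's review and merge.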


How We Want It Cited:

  • Example Citation:
  • Example Acknowledgements:
  • Example Reference:

Basic Recommendations:

  1. Make source code publicly accessible
  2. Make software easy to discover by providing software metadata via a popular community registry. (Examples of community registries of software metadata are bio.tools (Ison et al., 2016), biojs.io (Corpas et al., 2014; Gómez et al., 2013), and OmicTools (Henry et al., 2014) in the life sciences, and DataCite (Brase, n.d.) as a generic metadata registry for software as well as data.)
  3. Adopt a license and comply with the license of third party dependencies
  4. Define clear and transparent contribution, governance and communications processes (For instance the Galaxy project’s website describes the team’s structure, how to be part of the community, and their communication channels.)
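A codemeta.json file is one way to provide the software metadata in recommendation 2. A minimal sketch follows; the description, license, and version values are placeholders, not UCVM's actual metadata:

```json
{
  "@context": "https://doi.org/10.5063/schema/codemeta-2.0",
  "@type": "SoftwareSourceCode",
  "name": "ucvm",
  "description": "Placeholder one-line description of the software.",
  "codeRepository": "https://github.com/sceccode/ucvm",
  "license": "https://spdx.org/licenses/<project-license>",
  "version": "x.y.z"
}
```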

Types of Documentation, along two axes:

  • help learning – help working
  • theoretical knowledge – practical knowledge
  1. tutorials - learning oriented
  2. how-to guides – task-oriented
  3. Background/Concept explanations – understanding-oriented
  4. technical reference – information-oriented

DOCUMENTATION TYPES

  • CODE DOCUMENTATION - Semantic identifiers, comments, API, engineering, dependencies, requirements
  • USER DOCUMENTATION - How to get, run, use the software; parameters, data model, etc.; license
  • MAINTENANCE DOCUMENTATION - How to build, release, review code, publish
  • DEVELOPER DOCUMENTATION - How to contribute, contribution templates (issues, pull/merge requests)
  • METADATA - Software metadata (CodeMeta), Citation File (CFF), "references" (dependencies)
  • PROJECT DOCUMENTATION - Rationale, teams, governance, community (contact, code of conduct)

Where Documentation Lives

Documentation lives where the source code lives! (This is never in an email, chat, or similar!)

Conceptual Documentation:

  • Requirements
  • Projects

Hands-on documentation:

  • How-tos, getting started
  • Templates for issues and pull/merge requests
  • Contribution guidelines

Reference documentation:

  • API
  • Tests
  • Metadata

Toolbox Documentation:

Toolbox documentation should describe the steps of an analysis in a pedagogical, narrative fashion, with example data that users can load so they can follow along and understand the documentation.


UCVM Implements Multiple Test Types:

  1. Functional tests – unit tests of essential core UCVM functions
  2. Integration tests – tests of utilities including meshing, layer searches, GTLs, and performance
  3. Model tests – each velocity model has tests showing expected results for some points
  4. Example programs – example programs and scripts showing working examples of UCVM capabilities
  5. Acceptance tests – confirm results on a user's system; possibly the union of the functional, integration, and model tests
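The model tests and acceptance tests above share one pattern: compare queried values against expected results within a tolerance. A small shell sketch of that check follows; the one-value-per-line file format and the tolerance are illustrative, not part of UCVM.

```shell
# Compare actual query results against expected values within a tolerance.
check_model_points() {
  # usage: check_model_points expected.txt actual.txt tolerance
  awk -v tol="$3" '
    NR==FNR { want[FNR] = $1; next }        # first file: expected values
    { d = $1 - want[FNR]; if (d < 0) d = -d
      if (d > tol) { print "FAIL line " FNR; bad = 1 } }
    END { exit bad }' "$1" "$2"
}
```

A model could ship an expected.txt alongside the query points, and the acceptance test would run the queries on the user's system and call this check on the output.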


Standardized Testing Plan

  1. Unit Testing – test small functional units
  2. Regression Testing – ensure a fixed bug remains fixed
    1. Core system regression tests
    2. Model-specific regression tests
  3. Integration Testing – CI test suites
    1. Tests run after each commit
    2. Tests run after each pull request
  4. Performance and Benchmarking Tests
  5. Acceptance Testing
    1. Tests provided with models that show expected results
    2. Tests provided with a pull request that confirm the changes
    3. Tests that a new release must pass
    4. Tests that confirm the software performs correctly in the user's environment
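The integration-testing items above (tests run after each commit and after each pull request) map naturally onto a hosted CI service. A minimal GitHub Actions sketch follows, assuming a `make test` entry point; the actual UCVM test command may differ:

```yaml
# Hypothetical .github/workflows/ci.yml: run the test suite on every
# commit and on every pull request.
name: ci
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run unit and regression tests
        run: make test   # assumed entry point; adjust to the real test runner
```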

Recommended Basic Software Practices:

  1. Training on Software Practices
  2. Code in a Code Repo
  3. Automated Testing
  4. Persistent ID for software versions