System Integration, Test, and Verification (ITV)

ITV encompasses three separate yet related concerns:

  • Integration, which ties all the components of the system together
  • Test, which checks the quality of the product being built
  • Verification, which confirms that the built product addresses the requirements identified through user discussions

See System Integration for details on CI Integration.

Test and Verification for the Cyber Infrastructure can be broken down into a number of elements:

  • Unit Testing
  • Regression Testing
  • Integration Testing and System Integration Testing
  • Specialized Testing
  • System Verification Testing
  • System Acceptance Testing

See CI Integration, Test and Verification Plan for details of the overall CI testing plan.

ITV Activities (by topic)

Unit Testing

Unit (or component) tests assess the functionality of specific pieces of code. By definition, a unit is the smallest testable piece of a larger software module. The subsystem development teams are responsible for unit testing all the code that is developed. To gauge how thoroughly the unit tests cover the subsystem code, a code coverage tool will be used to provide the appropriate metrics. Based on those metrics, action can be taken to improve coverage in specific areas that appear lower than expected. The internal target is for more than 80% of the code to be executed by the tests.
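
For illustration only, the sketch below shows what a unit test and the coverage gate might look like; the checksum module and its behavior are invented, not part of the CI codebase:

    # checksum.py -- hypothetical component under test (invented for illustration)
    def checksum(data: bytes) -> int:
        """Return a simple 8-bit additive checksum."""
        return sum(data) % 256

    # test_checksum.py -- unit tests exercising the component in isolation
    import unittest
    from checksum import checksum

    class TestChecksum(unittest.TestCase):
        def test_empty_input_is_zero(self):
            self.assertEqual(checksum(b""), 0)

        def test_known_value(self):
            self.assertEqual(checksum(b"\x01\x02\x03"), 6)

        def test_wraps_at_256(self):
            self.assertEqual(checksum(bytes([255, 2])), 1)

    if __name__ == "__main__":
        unittest.main()

Coverage could then be measured with a tool such as coverage.py, e.g. coverage run -m unittest discover followed by coverage report --fail-under=80, which fails when less than 80% of the code was executed.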

Regression Testing

Regression testing is a continuing activity used to uncover defects that occur as a result of changes to code. The term software regression refers to old bugs that recur as a result of ongoing code development. For the CI development, regression testing is implicit in the automated suite of tests (see Unit Testing above) that executes at regular intervals. Additionally, once formal tracking of bugs has begun as part of the Release 1 Transition phase, a new regression test will be added for each bug fix to verify that the bug has been fixed and does not recur in the future. The subsystem development teams are responsible for regression testing all bug fixes.
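
A minimal sketch of the pattern follows, assuming a hypothetical parsing bug and an invented issue key; in practice each such test would reference the actual Jira issue it guards against:

    # test_regression_oo_123.py -- regression test pinned to a fixed bug
    # Hypothetical scenario: issue OO-123 (invented key) reported that
    # parse_depth() crashed on inputs with surrounding whitespace.
    import unittest

    def parse_depth(raw: str) -> float:
        """Parse a depth reading in meters; the fix added .strip()."""
        return float(raw.strip())

    class TestRegressionOO123(unittest.TestCase):
        def test_whitespace_input_does_not_crash(self):
            # The exact input from the original (invented) bug report.
            self.assertEqual(parse_depth("  42.5 \n"), 42.5)

    if __name__ == "__main__":
        unittest.main()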

Integration Testing and System Integration Testing

The CI software will be integrated iteratively, building from the unit level up to the subsystem level and then to the CI system level. Integration testing is the evaluation of the proper functionality and performance of two or more software modules (i.e., components, services, or subsystems). Its purpose is to expose defects in software interfaces and to confirm proper interactions between software modules. The subsystem development teams are responsible for integration testing the different components or services within their subsystems. The ITV and subsystem teams are jointly responsible for System Integration Testing, which exercises multiple subsystems together.
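
As a sketch, the test below wires together two hypothetical components, a queue-backed publisher and a consumer, and verifies the interface between them rather than either component in isolation; all names are illustrative:

    # test_integration_pubsub.py -- exercises two components through their shared interface
    import queue
    import unittest

    class Publisher:
        """Hypothetical component A: pushes events onto a shared queue."""
        def __init__(self, channel: queue.Queue):
            self.channel = channel
        def publish(self, event: dict) -> None:
            self.channel.put(event)

    class Consumer:
        """Hypothetical component B: drains the queue and records events."""
        def __init__(self, channel: queue.Queue):
            self.channel = channel
            self.received = []
        def drain(self) -> None:
            while not self.channel.empty():
                self.received.append(self.channel.get())

    class TestPublisherConsumerIntegration(unittest.TestCase):
        def test_events_cross_the_interface_in_order(self):
            channel = queue.Queue()
            pub, con = Publisher(channel), Consumer(channel)
            pub.publish({"id": 1})
            pub.publish({"id": 2})
            con.drain()
            self.assertEqual([e["id"] for e in con.received], [1, 2])

    if __name__ == "__main__":
        unittest.main()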

Specialized Testing

Some specialized testing will be performed as part of the overall CI testing effort, covering a number of non-functional test types such as:

  • Security Testing - Evaluates identity and policy management features related to authentication, authorization, and auditing capabilities.
  • Stress and Volume Testing - Evaluates reliability, availability, and scalability, and should be performed as part of the overall system testing effort.
  • Stability Testing - Evaluates the ability of the code to run continuously under a normal load without problems. This should detect growth in memory and CPU utilization and should be performed as part of the overall system testing effort (a minimal sketch follows this list).

Specialized testing is the responsibility of the ITV team.
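
As an illustration of stability testing, the sketch below drives a workload repeatedly while sampling process memory and flags steady growth that might indicate a leak. It assumes the third-party psutil package; the workload, iteration count, and threshold are placeholders:

    # stability_check.py -- sketch of a long-running stability probe
    # Assumes: pip install psutil. Workload and threshold are placeholders.
    import psutil

    def workload() -> None:
        """Placeholder for the operation under test (e.g., one service request)."""
        _ = [i * i for i in range(10_000)]

    def run_stability_check(iterations: int = 1_000,
                            max_growth_bytes: int = 10 * 1024 * 1024) -> None:
        proc = psutil.Process()
        baseline = proc.memory_info().rss          # resident set size at start
        for i in range(iterations):
            workload()
            if i % 100 == 0:
                growth = proc.memory_info().rss - baseline
                print(f"iteration {i}: memory growth {growth} bytes")
        final_growth = proc.memory_info().rss - baseline
        assert final_growth < max_growth_bytes, (
            f"possible leak: memory grew by {final_growth} bytes")

    if __name__ == "__main__":
        run_stability_check()

A production probe would run for hours under realistic load rather than a fixed loop, and could additionally sample proc.cpu_percent() to watch for CPU growth.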

System Verification Testing

System Verification is the assessment of the integrated and tested CI for a given release against the L3 and L4 requirements that ultimately drive the product's design. It asks the question "was the system built right?". In essence, System Verification is a specialized form of testing against the requirements that is the responsibility of the System Engineering team, in collaboration with the ITV (and development) team(s).

Verification Procedures are written for each supported L3/L4 requirement that needs to be verified. These procedures are organized by subsystem and grouped by use case so that, where possible, one set of procedures can verify multiple requirements. System verification starts during construction and may continue through transition. A requirements verification traceability matrix is a key tool for tracking the test procedures against the associated requirement(s). A number of test methods are used for verifying requirements:

  • Testing - Functional and interface requirements are usually verified by actual tests written in an appropriate programming language.
  • Demonstration - Behavioral and performance requirements are usually verified through use of the product in some operation that can be qualitatively evaluated or shown (i.e., success is demonstrated by observation alone or by simple measurements).
  • Analysis - Reliability, Maintainability, and Availability (RMA) requirements are typically verified through the analysis of mathematical, narrative, graphical, or physical models. This could include estimation of execution times or system resources.
  • Inspection - Quality and constraint requirements are usually verified by a visual examination of the product's physical characteristics, processes, or documentation.

See Verification Methods for a detailed explanation and examples of each verification method.
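
A requirements verification traceability matrix can be as simple as a table mapping each requirement to its procedures, method, and status. The sketch below models one in Python; all requirement IDs, procedure names, and statuses are invented examples:

    # rvtm.py -- sketch of a requirements verification traceability matrix
    # All requirement IDs, procedures, and statuses are invented examples.
    from dataclasses import dataclass

    @dataclass
    class TraceEntry:
        requirement: str      # L3/L4 requirement identifier
        procedure: str        # verification procedure that covers it
        method: str           # Testing | Demonstration | Analysis | Inspection
        status: str           # Not Run | Pass | Fail

    MATRIX = [
        TraceEntry("L4-CI-DM-042", "VP-DM-007", "Testing", "Pass"),
        TraceEntry("L4-CI-SA-013", "VP-SA-002", "Demonstration", "Not Run"),
        TraceEntry("L3-CI-RQ-101", "VP-SYS-001", "Analysis", "Fail"),
    ]

    def unverified(matrix: list[TraceEntry]) -> list[str]:
        """Return requirement IDs whose procedures have not passed."""
        return [e.requirement for e in matrix if e.status != "Pass"]

    if __name__ == "__main__":
        print("Outstanding requirements:", unverified(MATRIX))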

System Acceptance Testing

System Acceptance Testing confirms that the CI Release is ready for acceptance by, and ownership transfer to, Ocean Leadership. This effort includes both the Verification and Validation of Release Requirements and is the joint responsibility of Ocean Leadership System Engineering and CI System Engineering, in collaboration with the CI ITV team.

  • Verification is accomplished through a number of items:
    • Verification Reports for System Verification execution (referenced above) have been completed and posted to the appropriate location.
    • Test Procedures for System Verification have been posted to the appropriate location.
    • All bugs associated with System Verification have been entered into JIRA and associated with the release.
    • All bugs of severity 1, 2, or 3 associated with System Verification have been resolved and marked as resolved in JIRA (see the query sketch after this list).
    • Evidence has been provided that the appropriate regression testing was performed after bugs associated with System Verification were resolved.
  • Validation encompasses a separate set of test procedures to demonstrate that the "right system was built" (i.e., that the system stakeholders' needs are correctly addressed by the delivered system). This is essentially a high-level, user-focused Use Case validation.
  • Additional testing is performed as part of System Acceptance Testing, including: Security, Performance and Load, Stability, Error Handling, and Documentation.
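
For instance, the check that all severity 1-3 System Verification bugs are resolved could be scripted against Jira. The sketch below uses the third-party jira package; the server URL, project key, and the convention that severity is tracked in the priority field are all assumptions made for illustration:

    # acceptance_bug_check.py -- sketch of the "no open severity 1-3 bugs" gate
    # Assumes: pip install jira. Server URL, project key, and the use of the
    # priority field for severity are invented for illustration.
    from jira import JIRA

    def open_blocking_bugs(server: str, project: str, release: str) -> list:
        client = JIRA(server=server)
        jql = (
            f'project = {project} AND fixVersion = "{release}" '
            'AND type = Bug AND priority in (Blocker, Critical, Major) '
            'AND resolution = Unresolved'
        )
        return client.search_issues(jql)

    if __name__ == "__main__":
        bugs = open_blocking_bugs("https://jira.example.org", "CIDEV", "Release 1")
        assert not bugs, f"{len(bugs)} blocking bugs still open: {[b.key for b in bugs]}"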

ITV Activities (by time)

Release 1 Construction 2

Process

Responsibilities

  • Development Team
    • Develop software to specification, requirements, use cases, agreements
    • Develop unit and integration tests
    • Document major architecture and design decisions and critical steps on the Wiki
    • Fix defects assigned to developers for targeted releases
    • Develop regression tests for defects that are fixed
    • Run existing tests frequently
    • Before release, run existing tests on release candidate
    • Fix all blocker and critical defects before release
    • Document all open issues in Jira issue tracking system
    • Deliver the release candidate to ITV for testing and keep it untouched during the test period
    • After ITV testing, fix blocker and critical issues; do not introduce new code
    • Report on software/release/defect status in development team meeting
  • Integration, Test and Verification Team (ITV)
    • Work with development teams to produce system integration tests
    • Work with development teams to produce security, stress and volume, and scalability tests
    • Assist System Engineering to produce test procedures for System Verification and System Acceptance Testing
    • Produce System Verification and System Acceptance test code
    • Receive release candidate from development team
    • Perform System Verification and System Acceptance Testing
    • Document all open issues in Jira issue tracking system
    • Report on System Acceptance Test status in development team lead meeting
    • Issue improvement and new feature requests as needed to maintain a sustainable architecture and to address user requests
  • Development Team Lead
    • Conduct weekly development team meetings
    • Keep track of assigned action items
    • Make release and defect fix decisions
    • Develop statements of work, assign tasks
    • Facilitate information exchange across team

Conventions

  • Issue Tracking using Jira
    • Fill in component, version and other available attributes
    • Describe the defect as clearly and as specifically as possible
    • Use consistent and clearly defined terminology. If in doubt, refer to a definition or provide one; for example:
    • Crash: The system or one of its components fails, stops, and is subsequently unavailable.
    • Failure: An error occurs when executing a certain function.

R1 Schedule

Currently for CI Release 1, the following schedule is being targeted:

  • R1 C3: The end of Release 1 Construction Iteration 3 is targeted for 4/29/11
  • IOC Review: After R1 C3, there is one week to prepare and one week to present the Initial Operational Capability (IOC) review. Completion of IOC Review is targeted for 5/13/11
  • Transition: After IOC Review, there is an 8 week transition phase to complete testing and perform bug fixing. Completion of Transition is targeted for 7/08/11
  • PRR: After transition, there is one week to prepare and one week to present the Product Readiness Review (PRR). Completion of PRR is targeted for 7/22/11