
Overview of "Produce Real-Time Calibrated Data" Use Case

Process arriving sensor data into a calibrated, quality-controlled data product.


Tip: Key Points
UC Priority = 4 or 5: critical, included in R2
Only boldface steps are required
<#> before a step indicates lower priority
(optional) indicates a run-time option


Metadata

Refer to the Product Description and Product Description Release 2 pages for metadata definitions.

Actors: Data Process Programmer
References: CIAD SA OV Data Stream Processing
Uses: (none)
Is Used By: (none)
Extends: UC.R2.21 Transform Data in Workflow
Is Extended By: (none)
In Acceptance Scenarios: AS.R2.02C Instrument Life Cycle Support, AS.R2.04A Data Product Leads Drive Core Data Product Creation
Technical Notes: This is an exemplar use case. It does not mandate setting up a workflow capability for calibration and QC; it is intended only to demonstrate the viability of transforming input data automatically as it arrives. This use case generates a new data record rather than annotating an existing data record with comments.
Lead Team: SA
Primary Service: Data Processing Services
Version: 2.3.1
UC Priority: 4
UC Status: Mapped + Ready
UX Exposure: PGM

Summary

This information summarizes the Use Case functionality.

Use the data processing capabilities of the system to apply a calibration and/or quality control algorithm to a data set, and create a new 'quality controlled and/or calibrated data set' on the fly as data arrives.
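As a minimal sketch of what such a transformation might look like (the function and field names below are illustrative assumptions, not ION APIs), a single step could apply calibration coefficients and a range-check QC flag to each arriving sample, producing a new calibrated record:

```python
# Hypothetical sketch: calibrate a raw sample and attach a QC flag,
# yielding a new record rather than annotating the original.

def calibrate(raw_value, coefficients):
    """Apply polynomial calibration coefficients (lowest order first)."""
    return sum(c * raw_value ** i for i, c in enumerate(coefficients))

def qc_range_check(value, lo, hi):
    """Return a QC flag: 1 = pass, 4 = fail (out of valid range)."""
    return 1 if lo <= value <= hi else 4

def process_sample(raw_value, coefficients, lo, hi):
    """Produce a new calibrated, quality-controlled data record."""
    value = calibrate(raw_value, coefficients)
    return {"raw": raw_value,
            "calibrated": value,
            "qc_flag": qc_range_check(value, lo, hi)}

# Example: linear calibration y = 0.5 + 2*x, valid range 0..50
record = process_sample(10.0, [0.5, 2.0], 0.0, 50.0)
# record["calibrated"] == 20.5, record["qc_flag"] == 1
```

The actual coefficients, QC tests, and output schema would come from the governing Data Product Specification.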

Assumptions

  • The assumptions of UC.R2.21 Transform Data in Workflow apply.
  • A Level 1 or Level 2 data set will result, since the process produces new data types (the calibrated, quality controlled data) not present in the original.
  • The form of the calibrated, quality controlled product is prototypical, not representative of similar products to be produced in Release 3.
  • The specification for the transformation is provided in a Data Product Specification.

Initial State

A suitably authorized user is logged in and ready to enter the data process definition to perform the calibration and QC transformations.

Scenario for "Produce Real-Time Calibrated Data" Use Case

The italicized steps are adapted from UC.R2.21 Transform Data in Workflow; the second-level (non-normative) comments in the outline are unique to this scenario.

  1. Data Process Programmer (DPP) creates a data process definition specifying each required calibration and quality control transformation.
    1. Application of calibration coefficients may be one step, and each quality control process a separate step; these are expected to produce separate data products (hence, different definitions).
    2. This script can be tested and validated off-line, requiring minimal changes for use in the system.
    3. The data process definition is based on a Data Product Specification provided by OOI scientists.
  2. DPP defines the actual data transformations.
    1. Again, one transformation for each definition.
  3. The DPP configures the data process definition's execution frequencies.
    1. This should be set to execute whenever the data arrives from the previous transformation (the sequence of transformations is data-driven).
  4. The DPP defines the resulting output(s) from each transformation.
    1. In some cases transformation outputs will not be persisted; in other cases the Data Product Specification may call for them to be persisted.
    2. The outputs from the last transformation in the chain should be kept available for a period, if the data will be accessed by anyone other than the real-time subscribers.
  5. The DPP activates each data transformation in the chain.
  6. The arrival of new data causes each data transformation process in turn to be executed, creating the calibrated, quality controlled data.
  7. Real-time metadata for the produced resources are updated automatically by the data processes.
    1. Static metadata were updated when the resources were first registered.
  8. The transformation process issues an event (in addition to publishing resulting data) after each execution's completion.
  9. The DPP inspects any persisted outputs to ensure they are appropriately transformed and described.
    1. This check occurs outside of the routine ION processing — it can be performed manually.
    2. An example verification: Were the right data outputs created, with appropriate metadata attached? Were the values correct?
  10. Each data process continues operating on newly arriving data supplements, until it is deactivated.
    1. Confirm that supplements are correctly processed.
    2. Confirm that processing stops when a data process is deactivated.
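The data-driven chaining described in steps 5, 6, 8, and 10 can be sketched as follows. This is a hypothetical illustration, not the ION implementation: each data process executes whenever data arrives from the previous transformation, records its output and a completion event, and stops processing once deactivated.

```python
class DataProcess:
    """One step in a data-driven transformation chain (illustrative only)."""

    def __init__(self, name, transform):
        self.name = name
        self.transform = transform
        self.subscribers = []   # downstream data processes
        self.outputs = []       # persisted outputs (step 4)
        self.events = []        # completion events (step 8)
        self.active = False

    def activate(self):         # step 5: activate the transformation
        self.active = True

    def deactivate(self):       # step 10: processing stops when deactivated
        self.active = False

    def on_data(self, data):    # step 6: executed on data arrival
        if not self.active:
            return
        result = self.transform(data)
        self.outputs.append(result)
        self.events.append(f"{self.name}: execution complete")
        for sub in self.subscribers:
            sub.on_data(result)

# Chain calibration into QC; execution is driven by arriving data.
cal = DataProcess("calibration", lambda x: 0.5 + 2.0 * x)
qc = DataProcess("qc", lambda x: {"value": x, "flag": 1 if 0 <= x <= 50 else 4})
cal.subscribers.append(qc)
cal.activate()
qc.activate()
cal.on_data(10.0)   # calibration yields 20.5; QC flags it as a pass
```

Deactivating the calibration process afterward would leave subsequent arrivals unprocessed, matching the expected behavior in step 10.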

Final State

The data process definition is registered, is executing successfully according to the prescribed schedule, and is producing the desired products.

Comments

These comments provide additional context (usually quite technical) for editors of the use case.

The definition of the comprehensive data quality control algorithms, and of the calibration algorithms, will occur as part of the activation of each instrument type (during driver and QC procedure development and configuration). However, the QC framework to execute them will only be in place at the end of R3 (S&A). So this use case serves simply to develop an example and identify issues.

