Instrument driver development is a complex process that requires interaction among many people. The process can be broken down into three phases, each driven by different system engineers and lead developers: Specification, Development, and Integration. During the Specification phase the IDD document is authored and verified; code is written and verified (through qualification testing) in the Development phase; and the driver is verified within the larger system during a post-development Integration phase.
Current MI Dataset Roles
MI Driver Development Systems Team Lead
MI Driver Development Software Team Lead
MI Software Engineer Lead
MI Integration Team Lead
Verifying New Instruments Against Developed Drivers
It is quite possible that a driver will be usable for instruments on multiple platforms. When developing a driver, it is important to indicate which instrument(s) generated the data files we are developing against; the integration team will then verify and integrate against the instrument(s) indicated. It is also possible that not all instruments will be commissioned and producing data when the driver is developed. Those instruments will have to be verified against the driver in the future and integrated independently of the original driver development. However, we should follow the same development process for these instruments: an IDD author verifies the files are the same, SE ensures the instrument has stream and particle definition data in Confluence, driver developers add automated tests, and finally the work transitions to integration. Using this method we can have high confidence that the driver will function as expected and should integrate easily.
The entire dataset agent development process will be tracked using Jira. Each dataset agent will have one Jira ticket. Depending on the stage in the lifecycle, the task will be assigned to the person currently responsible for that stage. It is the process owner's responsibility to ensure work is getting done and tickets aren't going stale. Each development phase has a different process owner, as detailed below.
When a task stage is completed the current owner should assign that ticket to the owner of the following stage, as detailed below.
All tasks should be created in the Marine Integration Development Queue (CIDEVMI).
Tasks should be named <Identifier> Dataset Agent Development, where <Identifier> is replaced by the agent's unique identifier.
If we want to track IDD development and review, then separate Jira tickets will be needed because the IDD process runs at least partially in parallel with code development. SGaul
Life Cycle Phases
This process should be owned by the MI Dataset Team Lead System Engineer. The review should be managed by the Dataset Team, not by the MI Integration Team. The MI integration team should participate in the review, as should the Data Product Agent team and the Dataset team coders. The "identifying or creating record parameters and getting them added into the preload google doc spreadsheet" task should be completed by the integration team with support from the dataset agent team. What are "network map pages"? SGaul
Process Owner: IDD Author, MI Integration Team Lead
IDD Authoring and Verification
Estimated Duration: 4 days
This task includes characterizing a dataset and writing the IDD for that dataset. This includes identifying or creating record parameters, getting them added to the preload Google spreadsheet, and ensuring that the records created will provide the variables required by the data product specification. It also includes identifying a sample set of data in order to verify that the dataset characterization agrees with the sample data.
When the IDD document is complete the ticket should be assigned to the MI Integration Team Lead and reviewed by:
MI Dataset System Engineer - verify specification. Ensure all network map pages intended for verification have sample data files generated from the associated platform.
Assigned MI Dataset Agent Software Developer - verify IDD contains enough detail for the developer
Data Product Algorithm Development team - cross check that parameters required by the DPAs are provided by the Dataset Agent
MI Lead Software Engineer - identify any potential architectural changes required
MI Integration Team - verify IDD stream definitions align with OOIN standards and are correctly defined
Should any changes be required, they should be coordinated in this ticket.
Transition: After the review is complete the parent task should be assigned to the MI Dataset Agent Software Lead, who will reassign it to the appropriate developer.
Process Owner: Code Developer
Requirement Review and Design
Estimated Duration: 2 days
The developer will work with the MI Dataset Agent Lead and MI Software Engineer Lead to:
Understand the IDD and verify completeness.
Plan development to understand time stamping, inheritance structure, configuration and any other design features.
Identify any architectural limitations and plan implementation.
Parser Development
Estimated Duration: 5 days
This task involves writing the parser code and writing tests that verify the parser can parse a sample set of data. All tests should pass, and the MI Dataset Agent Lead should verify that the tests are complete.
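To illustrate the expected shape of parser tests, a minimal sketch follows. The function and field names here are hypothetical, not taken from the actual mi-dataset code base; it only shows the pattern of pairing a positive test with a negative one.

```python
# Hypothetical sketch of a parser and its unit tests; names are
# illustrative and not from the real mi-dataset framework.

def parse_record(line):
    """Split one comma-separated sample line into a particle-like dict."""
    fields = line.strip().split(",")
    if len(fields) != 3:
        raise ValueError("expected 3 fields, got %d" % len(fields))
    timestamp, temperature, conductivity = fields
    return {
        "timestamp": float(timestamp),
        "temperature": float(temperature),
        "conductivity": float(conductivity),
    }

def test_parse_record():
    # Positive test: a well-formed line produces the expected particle.
    particle = parse_record("3634521600.0,21.37,4.915\n")
    assert particle["temperature"] == 21.37

def test_parse_record_rejects_malformed_line():
    # Negative test: a truncated line should raise, not yield bad data.
    try:
        parse_record("21.37\n")
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError for malformed line")

if __name__ == "__main__":
    test_parse_record()
    test_parse_record_rejects_malformed_line()
```

Tests written in this style are discoverable by nosetests, which is the runner referenced elsewhere in this process.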
Harvester Development
Estimated Duration: 0-4 days
This task may not be needed for all dataset drivers, and the need won't be identified until after the IDD is written and understood by the dataset driver developer. It involves writing a new harvester and writing the tests to verify it is functioning, then running them with nosetests. All tests should pass, and the MI Dataset Agent Lead should verify that the tests are complete.
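The core harvester idea can be sketched as below. This is a hedged illustration, not the real MI harvester class hierarchy: it polls one directory for files matching a pattern and reports only those not seen before.

```python
# Hypothetical single-directory harvester sketch. The real MI harvester
# classes are richer (callbacks, state persistence); this shows the core
# "detect new files" behavior a harvester test would exercise.
import fnmatch
import os

def harvest_new_files(directory, pattern, seen):
    """Return sorted new files matching `pattern`; record them in `seen`."""
    found = sorted(
        name for name in os.listdir(directory)
        if fnmatch.fnmatch(name, pattern) and name not in seen
    )
    seen.update(found)
    return found
```

Each call returns only files that appeared since the previous call, which is the behavior the harvester tests would assert.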
Driver Development
Estimated Duration: 6 days
The driver task consists of writing the driver code and writing tests to verify the driver is functioning, using the dataset driver test framework. Integration tests, which verify standalone driver functionality, and qualification tests, which verify driver functionality within the COI capability container, must both be completed. All tests should pass, and the MI Dataset Agent Software Lead should verify that the tests are complete.
Transition: Assign the dataset driver task to the MI Dataset Agent Lead.
Review and Publish
Process Owner: MI Dataset Agent Software Lead
Final Driver Review
Estimated Duration: 4 days
The final driver review is assigned to the driver developer, but will involve at least one other person to review the code. Ideally one common person will review the drivers for consistency. This task is not complete until a code review is held, any bugs to fix are identified, and those bugs are fixed.
Review should include, but is not limited to:
Ensure data particles produced align with the IDD
Verify tests are comprehensive, including both positive and negative tests
Verify code is reused appropriately
Should issues arise that require code rework, this ticket should be reassigned to the original developer with a note on what corrective action should be taken. Once fixed, the ticket should be reassigned back to the reviewer and the review process should continue.
Reviews are handled at https://crucfish.oceanobservatories.org. The driver developer should create the review, including all modified files as well as all sample and resource files. Reviews should include the MI Dataset Agent Software Lead and MI Software Engineer Lead. A link should be emailed to the reviewers notifying them of the review request. Reviewers will complete the review within 24 hours of submission.
Driver Packaging
Estimated Duration: 1/2 days
This task involves running the package driver script to generate the egg file for a new version of the driver, and copying that egg file into the main driver repository.
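For context on what the packaging step produces: an egg is simply a zip archive containing the code plus EGG-INFO metadata. The sketch below is a hedged illustration of that structure; the project's actual package-driver script and its naming/version conventions will differ.

```python
# Hedged illustration of egg structure, not the project's real packaging
# script: zip one module together with minimal PKG-INFO metadata.
import os
import zipfile

def package_egg(module_path, version, out_dir):
    """Write <name>-<version>.egg containing the module and PKG-INFO."""
    name = os.path.splitext(os.path.basename(module_path))[0]
    egg_path = os.path.join(out_dir, "%s-%s.egg" % (name, version))
    with zipfile.ZipFile(egg_path, "w") as egg:
        # The module itself, stored at the archive root.
        egg.write(module_path, os.path.basename(module_path))
        # Minimal metadata so tools can identify name and version.
        egg.writestr(
            "EGG-INFO/PKG-INFO",
            "Metadata-Version: 1.0\nName: %s\nVersion: %s\n" % (name, version),
        )
    return egg_path
```

The resulting .egg file is what gets copied into the main driver repository in this step.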
Transition: Assign the dataset driver task to the MI Integration Lead who will reassign the task to the appropriate integrator.
Integration and Verification
Process Owner: MI Integration Lead, Integrator
Driver Integration and Verification
Estimated Duration: 3 days
Driver integration requires that all of the steps of agent integration are performed as described in Agent Integration Steps.
Transition: Resolve this ticket and notify Ocean Leadership. Update the Confluence network map page to indicate verification status.