Details
- Feature
- Could have
- None
- Data Processing
- Inter Program
- 5
- 5
- 2
- 0.4
- Team_YANDA
- Sprint 5
- 15.1
- Stories Completed, Solution Intent Updated, Outcomes Reviewed, NFRs met, Accepted by FO
SPO-1591
Description
Who?
- For SDP workflow developers and AA operators
What?
- Improved developer experience for managing SDP workflows and integrating them into SDP
- Minimal consideration of workflow provenance information
Why?
- As we start adding more workflows or workflow variants to the SDP system, we need a well-established and easy-to-use way of adding workflows to it.
Notes
- Improve developer experience for adding and managing SDP workflows
- Consideration of minimal workflow provenance data
- TBD: Extend to revisiting workflow execution and characterisation, with a change of title to "SDP workflow management and execution"
---
In particular, reproducibility of data products can be enabled and improved through systematic use of repositories as well as versioning and release functions. Furthermore, collaborative functions allow the gradual build-up of an ecosystem of software components. See also https://confluence.skatelescope.org/display/SWSI/DALiuGE+Workflow+Management
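To make the "minimal workflow provenance" idea concrete, here is a small sketch of the kind of record that could be captured per workflow run to support reproducibility. All field names here (`workflow_id`, `inputs`, `outputs`, etc.) are illustrative assumptions for discussion, not an agreed SDP schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

# Hypothetical minimal provenance record for one workflow execution.
# The shape is an assumption; the point is that even a small, versioned
# record ties data products back to the workflow that produced them.
@dataclass
class WorkflowProvenance:
    workflow_id: str          # which workflow ran
    workflow_version: str     # which released version of it
    started_at: str           # ISO-8601 UTC timestamp
    inputs: list = field(default_factory=list)   # input data product IDs
    outputs: list = field(default_factory=list)  # output data product IDs

# Example usage with made-up identifiers:
record = WorkflowProvenance(
    workflow_id="test_batch",
    workflow_version="0.2.0",
    started_at=datetime.now(timezone.utc).isoformat(),
    inputs=["measurement_set_001"],
)
record.outputs.append("image_cube_001")
print(asdict(record)["workflow_id"])  # test_batch
```

A serialisable record like this could be stored alongside each data product in a repository, which is what makes the versioning and release functions above pay off for reproducibility.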
---
Elevating Peter's input from the comment section:
Note that the applicable architectural view here would be the science pipeline management use case view: http://ska-sdp.org/sites/default/files/attachments/ska-tel-sdp-0000013_07_sdparchitecture_usecase_views_0.pdf
I.e. this should make progress on implementing the use case of introducing, validating and deploying a (new) workflow. It is okay if we do this for a particular kind of workflow; however, it should iterate from the existing design in https://gitlab.com/ska-telescope/sdp/ska-sdp-science-pipelines, with the workflow list https://gitlab.com/ska-telescope/sdp/ska-sdp-science-pipelines/-/blob/master/workflows.json as the main connection point to the SDP LMC.
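Since workflows.json is named as the main connection point to the SDP LMC, the following sketch shows one way LMC-side code might look up a workflow definition from such a file. The JSON structure and field names (`workflows`, `type`, `id`, `version`, `image`) are assumptions made for illustration; the actual schema in the ska-sdp-science-pipelines repository should be checked before relying on them.

```python
import json

# Illustrative, assumed structure for a workflows.json-style file; not
# the confirmed schema of the real file in ska-sdp-science-pipelines.
EXAMPLE_WORKFLOWS_JSON = """
{
  "workflows": [
    {"type": "batch", "id": "test_batch", "version": "0.2.0",
     "image": "sdp-workflow-test-batch"}
  ]
}
"""

def find_workflow(definitions, wf_type, wf_id, version):
    """Return the matching workflow entry, or None if it is not listed.

    Rejecting unlisted (type, id, version) combinations is one way the
    list could gate which workflows the LMC is allowed to deploy.
    """
    for wf in definitions.get("workflows", []):
        if (wf["type"], wf["id"], wf["version"]) == (wf_type, wf_id, version):
            return wf
    return None

defs = json.loads(EXAMPLE_WORKFLOWS_JSON)
wf = find_workflow(defs, "batch", "test_batch", "0.2.0")
print(wf["image"])  # sdp-workflow-test-batch
```

Treating the list as the single source of truth like this would let the introduce/validate/deploy use case iterate on workflow packaging without changing the LMC interface.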