Details
- Epic
- None
- None
- None
- Workflow Scalability Prototype
- Data Processing
- 0
Description
Demonstrate scaling of Data Processing workflows towards full-array SKA1 operations (for both SKA1-Mid and SKA1-Low). This will focus on prototyping one or more I/O-intensive workflows using high-level Execution Engines. This should test:
- interactions between Execution Frameworks and Processing Components,
- Data Island creation and usage, and
- the role and performance of data queues (e.g. used for communication of QA and calibration solutions).
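As a minimal sketch of the data-queue pattern named above, the following uses Python's standard-library `queue` and `threading` to pass calibration solutions from a producer (standing in for a Processing Component) to a QA consumer. The names and payload fields are illustrative assumptions, not the actual SDP interfaces or Execution Engine APIs.

```python
import queue
import threading

def producer(q: queue.Queue, n_solutions: int) -> None:
    """Stand-in for a Processing Component emitting calibration solutions."""
    for i in range(n_solutions):
        # Payload shape is hypothetical; a real solution would carry
        # antenna/polarisation gain tables, timestamps, etc.
        q.put({"solution_id": i, "gain": 1.0 + 0.01 * i})
    q.put(None)  # sentinel: no more solutions

def consumer(q: queue.Queue) -> list:
    """Stand-in for a QA component draining the data queue."""
    received = []
    while True:
        item = q.get()
        if item is None:
            break
        received.append(item)
    return received

q: queue.Queue = queue.Queue()
results: list = []
t = threading.Thread(target=lambda: results.extend(consumer(q)))
t.start()
producer(q, 5)
t.join()
print(len(results))  # → 5
```

A real prototype would replace the in-process queue with a distributed one and measure latency and throughput under load, which is exactly what the benchmarks here are meant to assess.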
Epic Hypothesis Statement (https://www.scaledagileframework.com/epic/)
For | platform operators, execution engine and workflow developers |
who | run workflows involving intensive storage access and internal communication, especially reading and distributing visibilities |
the | Scalability Prototype |
is a | set of targeted benchmarks |
that | enable assessment of the I/O capabilities, reliability and flexibility of hardware and software systems (especially execution engines) for SKA use cases, in particular data transfers to and from accelerators |
Unlike | synthetic benchmarks and tests using existing software |
our solution | simulates the stress of reading an observation back from the buffer at roughly 10 times observation speed, including the data replication needed to implement faceting for large images, and provides detailed performance data to help identify bottlenecks in executing workflows. |
Outcome hypothesis | |
Leading Indicators | |
NFRs | |
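The read-back stress described in the hypothesis reduces to a simple bandwidth calculation, sketched below. The function and all the numeric values are placeholders chosen for illustration, not SKA1 figures.

```python
def readback_bandwidth_gbs(observation_rate_gbs: float,
                           speedup: float = 10.0,
                           replication_factor: float = 1.0) -> float:
    """Sustained read bandwidth needed to replay an observation from the
    buffer `speedup` times faster than real time, including any data
    replication (e.g. duplicating visibilities to implement faceting)."""
    return observation_rate_gbs * speedup * replication_factor

# Placeholder example: a 5 GB/s ingest rate, replayed at 10x
# observation speed with 2x replication for faceting.
print(readback_bandwidth_gbs(5.0, 10.0, 2.0))  # → 100.0
```

Even with placeholder inputs, this makes the scaling pressure concrete: the benchmarks must show that the buffer and execution engines sustain an order of magnitude more read bandwidth than the ingest rate.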
Attachments
Issue Links
- Child Of: SS-5 Evolutionary Prototype/SKAMPI (Done)