Data Analysis Workflows (DAWs) for scientific problems are often developed only for a specific problem, at a specific institution, and for a specific infrastructure. As a result, DAWs contain many hard-coded decisions that encode implicit expectations about their input data and execution environment. However, DAWs are also frequently reused for different problems (the research question evolves), for different data (the data to be analyzed changes), or on different infrastructures (the execution environment changes). Such reuse is painful when hard-coded, undocumented decisions strongly influence the DAW execution [1]. Running a DAW with such properties under different circumstances may lead to incorrect results, sudden aborts, meaningless log entries, non-terminating executions, buffer overflows (memory, disk, log space), and so on. Very often, the execution simply stops with an arbitrary, undocumented low-level error (e.g., “core dump”).
One way to prevent, or at least reduce, such failures is to enhance DAW execution engines with the ability to manage and control validity constraints. A validity constraint (VC) is a constraint on some property of the input data or the execution environment. We differentiate between hard VCs, which must be fulfilled to ensure correct execution of a DAW (e.g., the result file size must be larger than 0 bytes), and soft VCs, which indicate deviations from expected behavior that may or may not be fatal (e.g., the result file size is typically between 120 MB and 180 MB). VCs of both types may be detected automatically from log file analysis or specified by a DAW developer as part of the DAW specification (similar to integrity constraints in databases or assertions in programming languages). They can target any level of abstraction of a DAW, i.e., the abstract, logical, or physical level. Once uncovered, VCs can be controlled automatically by a DAW engine; typically, a violation of a hard VC leads to abortion of the DAW execution with a meaningful error message, whereas a violation of a soft VC results in a warning to the developer. Depending on the specific VC, it may also be checkable prior to executing the DAW, which can substantially reduce computation and wait times.
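To make this concrete, the following Python sketch shows how the two example VCs above could be checked at run time, with a hard VC aborting the execution and a soft VC only issuing a warning. It is a minimal illustration, not the actual T3 implementation; all names (check_result_file, HardVCViolation, the file path) are hypothetical.

```python
# Minimal sketch, not the actual T3 library: checking the two example VCs
# from the text against a result file.
from pathlib import Path


class HardVCViolation(Exception):
    """Violation of a hard validity constraint; the DAW engine would abort
    the execution and surface this message instead of a low-level error."""


def check_result_file(path: str) -> None:
    size = Path(path).stat().st_size

    # Hard VC: the result file must be larger than 0 bytes.
    if size == 0:
        raise HardVCViolation(f"{path} is empty (0 bytes)")

    # Soft VC: the result file size is typically between 120 MB and 180 MB;
    # a deviation is reported as a warning, not as an error.
    if not 120e6 <= size <= 180e6:
        print(f"warning: {path} is {size / 1e6:.1f} MB, expected 120-180 MB")


# Example usage (the path is hypothetical):
# check_result_file("results/step3_output.bam")
```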
Accordingly, the goal of team T3 is to jointly develop a framework for managing VCs throughout the DAW life cycle of creation, execution/control, and adaptation. We will create a software library that can be integrated into DAW specification languages, DAW execution engines, and resource managers. The framework will model the most important aspects of DAWs in terms of VCs (i.e., the properties and attributes to be constrained), allow for the registration and efficient search of VCs before and during execution of a DAW, and provide a driver to check VCs during DAW execution.
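As an illustration of such a library interface, the Python sketch below registers VCs together with their abstraction level and lets a driver evaluate them before or during execution. All class and function names (ValidityConstraint, VCRegistry, evaluate) are assumptions for the purpose of this example, not the framework's actual API.

```python
# Illustrative sketch under assumed names: registering validity constraints
# and evaluating them with a simple driver.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ValidityConstraint:
    name: str
    level: str                      # "abstract", "logical", or "physical"
    hard: bool                      # hard VCs abort, soft VCs only warn
    check: Callable[[Dict], bool]   # receives the current execution context


class VCRegistry:
    def __init__(self) -> None:
        self._vcs: List[ValidityConstraint] = []

    def register(self, vc: ValidityConstraint) -> None:
        self._vcs.append(vc)

    def evaluate(self, context: Dict) -> None:
        for vc in self._vcs:
            if vc.check(context):
                continue
            message = f"validity constraint '{vc.name}' ({vc.level}) violated"
            if vc.hard:
                raise RuntimeError(message)   # abort with a meaningful error
            print("warning:", message)        # soft VC: warn the developer


# Example: a physical-level VC that can be checked before execution starts,
# avoiding wasted computation if the environment cannot satisfy it.
registry = VCRegistry()
registry.register(ValidityConstraint(
    name="enough scratch space",
    level="physical",
    hard=True,
    check=lambda ctx: ctx["free_disk_gb"] >= 200,
))
registry.evaluate({"free_disk_gb": 512})
```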
Collaborations
- A1 What is a valid trace of log information?
- A3 What is a valid analysis result?
- A5 What is a valid classification model?
- B3 What is a valid execution?
- B4 What are valid file placements?
- B5 What are valid file properties?
Related Publications
2024
Schintke, Florian; Belhajjame, Khalid; De Mecquenem, Ninon; Frantz, David; Guarino, Vanessa Emanuela; Hilbrich, Marcus; Lehmann, Fabian; Missier, Paolo; Sattler, Rebecca; Sparka, Jan Arne; Speckhard, Daniel T.; Stolte, Hermann; Vu, Anh Duc; Leser, Ulf
Validity constraints for data analysis workflows (Journal Article)
In: Future Generation Computer Systems, vol. 157, pp. 82–97, 2024, ISSN: 0167-739X, doi: 10.1016/j.future.2024.03.037.
Porting a scientific data analysis workflow (DAW) to a cluster infrastructure, a new software stack, or even only a new dataset with some notably different properties is often challenging. Despite the structured definition of the steps (tasks) and their interdependencies during a complex data analysis in the DAW specification, relevant assumptions may remain unspecified and implicit. Such hidden assumptions often lead to crashing tasks without a reasonable error message, poor performance in general, non-terminating executions, or silent wrong results of the DAW, to name only a few possible consequences. Searching for the causes of such errors and drawbacks in a distributed compute cluster managed by a complex infrastructure stack, where DAWs for large datasets typically are executed, can be tedious and time-consuming. We propose validity constraints (VCs) as a new concept for DAW languages to alleviate this situation. A VC is a constraint specifying logical conditions that must be fulfilled at certain times for DAW executions to be valid. When defined together with a DAW, VCs help to improve the portability, adaptability, and reusability of DAWs by making implicit assumptions explicit. Once specified, VCs can be controlled automatically by the DAW infrastructure, and violations can lead to meaningful error messages and graceful behaviour (e.g., termination or invocation of repair mechanisms). We provide a broad list of possible VCs, classify them along multiple dimensions, and compare them to similar concepts one can find in related fields. We also provide a proof-of-concept implementation for the workflow system Nextflow.
2023
Schintke, Florian; De Mecquenem, Ninon; Frantz, David; Guarino, Vanessa Emanuela; Hilbrich, Marcus; Lehmann, Fabian; Sattler, Rebecca; Sparka, Jan Arne; Speckhard, Daniel T.; Stolte, Hermann; Vu, Anh Duc; Leser, Ulf
Validity Constraints for Data Analysis Workflows (Miscellaneous)
2023.
References
[1] Kanwal, Sehrish; Khan, Farah Zaib; Lonie, Andrew; Sinnott, Richard O.: Investigating reproducibility and tracking provenance – A genomic workflow case study. In: BMC Bioinformatics, vol. 18, 2017.