Welcome to WORKS06
In recent years, workflows have emerged as a key technology enabling large-scale computations on distributed resources. Workflows allow scientists to design complex applications composed of individual application components or services. Often, these components and services are designed, developed, and tested collaboratively. Because of the size of the data and the complexity of the analysis, large amounts of shared resources such as clusters and storage systems are used to store the data sets and execute the workflows. The process of workflow design and execution in a distributed environment can be very complex, involving the mapping of high-level workflow descriptions onto the available resources, as well as the monitoring and debugging of the subsequent execution. Because computations and data access operations are performed on shared resources, there is increased interest in the fair allocation and management of those resources at the workflow level.
Adequate workflow descriptions are needed to support the complex workflow management process, which includes workflow creation, workflow reuse, and modifications made to the workflow over time, such as changes to individual workflow components. Additional workflow annotations may provide guidelines and requirements for resource mapping and execution.
Large-scale scientific applications place several requirements on workflow systems. Beyond handling the sheer magnitude of data processed by the workflow components, the resulting and intermediate data need to be annotated with provenance information and any other information needed to evaluate the quality of the data and to support the repeatability of the analysis.
The Workshop on Workflows in Support of Large-Scale Science focuses on the entire workflow lifecycle, including workflow composition, mapping, and robust execution. The workshop also welcomes contributions in the applications area, from which requirements on workflow management systems can be derived.