In a computer-science context, the term 'provenance' refers to the lineage of digital data or of information processing, which can be recorded for quality control, auditing, process improvement, or to afford outputs a level of trust. Research over the last decade and a half has established standardised provenance models comprising data artefacts, the processes that act on that data, and the agents responsible for those processes.
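The artefact/process/agent triple described above can be sketched as a small data structure. This is a minimal illustration only: the class and field names below are hypothetical, loosely echoing the vocabulary of standardised provenance models rather than following any particular library's API.

```python
from dataclasses import dataclass, field

# Illustrative sketch of the three core provenance concepts:
# data artefacts (Entity), processes (Activity), agents (Agent).
# All names here are assumptions for illustration, not a real API.

@dataclass(frozen=True)
class Agent:
    name: str  # who is responsible: a person, organisation, or software

@dataclass(frozen=True)
class Entity:
    identifier: str  # the data artefact being tracked

@dataclass
class Activity:
    name: str                                            # the process that acted on data
    used: list = field(default_factory=list)             # input Entities
    generated: list = field(default_factory=list)        # output Entities
    associated_with: list = field(default_factory=list)  # responsible Agents

# Record that a cleaning step, run by an analyst, turned raw data into a report.
raw = Entity("dataset/raw.csv")
report = Entity("dataset/report.csv")
analyst = Agent("analyst@example.org")

clean = Activity("clean-and-summarise")
clean.used.append(raw)
clean.generated.append(report)
clean.associated_with.append(analyst)

# A later audit can walk these links to answer
# "where did report.csv come from, and who was responsible?"
lineage = [(e.identifier, clean.name, a.name)
           for e in clean.used for a in clean.associated_with]
```

Linking records this way lets an auditor traverse from any output back through the activities and agents that produced it, which is the basis for the trust and quality-control uses mentioned above.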
Provenance capture and management can use custom or common tools and may occur at any level: deep within computer architectures, at network junctures, within a single system, or across a system of systems. One person may implement a provenance regime, but so may a whole organisation or even a consortium.
Of most interest to this conference is the development of new provenance theory, or the application of established theory, to any modelling, simulation, or data-management scenario.
Key topics: Provenance, Lineage, Trust, Quality assessment