Information security depends on the quality of software systems and on an excellent understanding of their functionality. Only if this functionality is guaranteed to be correct and safe can customers and information be protected against adversarial attacks. Thus, to make communication and computation secure against catastrophic failure and malicious interference, it is essential to develop correct and safe software systems. A scientific foundation of software engineering provides models that enable us to capture application domains and their requirements, but also to understand the structure and working of software systems, software architectures, and programs. Software and systems safety, as a consequence of rigorous software engineering methods, rests on an excellent command of both a clear and precise description of the intended behaviour (specification) and the proof of correct behaviour (verification).
The 31st International Summer School Marktoberdorf 2010 on Software and Systems Safety: Specification and Verification assembled a group of internationally leading researchers and experts (rising stars as well as honoured ones), who presented their experience in the specification and verification of software systems, accompanied by corresponding tools and methods. Our lecturers, representing excellent research groups from industry and academia, gave a detailed overview of current research results, with special emphasis on software and information security.
In his lectures on Model-based Testing, ED BRINKSMA presented a theory of model-based testing for reactive systems and its application context. He gave a general introduction to, and a framework for, the concepts of control-oriented testing. He discussed a number of implementation relations and in particular introduced the ioco relation as the basis for practical model-based testing. His course dealt with the principles of test generation from specification models and with the TorX tool for the automated generation, execution, and evaluation of test runs. Moreover, he discussed a concrete application example and sketched the extension of the theory to real-time systems.
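As a brief illustration (a standard formulation of ioco from the model-based testing literature, not a verbatim excerpt from the lectures), an implementation i conforms to a specification s under ioco if, after every suspension trace of the specification, the implementation produces only outputs (including quiescence) that the specification allows:

\[ i \;\textbf{ioco}\; s \;\iff\; \forall \sigma \in \mathit{Straces}(s):\ \mathit{out}(i \;\mathbf{after}\; \sigma) \;\subseteq\; \mathit{out}(s \;\mathbf{after}\; \sigma). \]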
In his lectures on Towards a Theory of Architectural Contracts: Schemes and Patterns of Assumption/Promise Based System Specification, MANFRED BROY presented basic concepts and paradigms for the specification of, and reasoning about, systems, their interfaces and architectures as well as their properties in terms of contracts based on the idea of assumptions and promises. The key idea is that an assumption represents a specification of properties of the environment in which a system can be used, and that the promise describes the specification of properties of the system guaranteed under that assumption. Based on this idea, he derived explicit system specifications forming implicative interface assertions and analysed more specific aspects of the specification of, and reasoning about, properties of systems and their architectures in terms of assumptions and promises. In particular, he studied the fact that system assertions exhibit a specific structure, for instance in terms of liveness and safety properties as well as causality and realizability. This structure influences the meaning of assume/promise specifications and imposes healthiness conditions on them.
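In its simplest reading (sketched here in a generic notation that may differ from the one used in the lectures), an assumption A about the environment and a promise P about the system are combined into the implicative interface assertion

\[ A \;\Rightarrow\; P, \]

stating that the system is guaranteed to fulfil the promise P in every context that satisfies the assumption A.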
With Engineering Evolving and Self-adaptive Systems: An Overview, CARLO GHEZZI gave an introduction to software evolution from a historical perspective. He characterized the history as progressing from a closed to an open world setting. The open world setting requires continuous evolution and, when possible, self-adapting capabilities in the software to continue to satisfy the requirements in the presence of environmental changes. He dealt with software architectures that support evolution and adaptation. Moreover, he surveyed the main characteristics of publish-subscribe, repository-based, and service-based architectures and focused on architectures supporting Web service composition. A methodology based on the assume-guarantee pattern was proposed as well.
Abstraction for System Verification was the topic of the lectures of SUSANNE GRAF. The use of appropriate abstraction is key to the successful verification of properties of programs and systems. Solving a general verification problem M $\models$ φ is of high complexity even if the representations of M and φ are small – this is known as the state explosion problem. In an abstraction-based approach, one has to compute, at reasonable cost, an appropriate abstraction function α and an abstract interpretation α(M) of a program or system such that (1) checking α(M) $\models$ φ can be done with reasonable effort and (2) one can prove a priori that positive results of α(M) $\models$ φ are preserved from the abstract to the concrete setting. Strong preservation requires that negative results, too, can be transferred from the abstract to the concrete domain. Her lectures presented the general framework of abstraction based on Galois connections and applied these general results to show property preservation and to construct abstractions for transition systems extended with data.
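The preservation requirement underlying this approach can be summarised as follows (a standard formulation for over-approximating abstractions, stated here as an illustration rather than as the exact form used in the lectures): for the considered class of properties φ,

\[ \alpha(M) \models \varphi \;\Longrightarrow\; M \models \varphi, \]

while strong preservation additionally demands the converse implication, so that a negative verdict on the abstract model also carries over to the concrete one.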
JOHN HARRISON's lecture on Formal Verification gave a survey of the main results in automated reasoning that are applicable to formal verification. He started with the most basic case, propositional logic (SAT), which has experienced a resurgence of interest in the last decade or so. He then explained first-order logic and the decidability questions for various arithmetical first-order theories, as well as how such decision procedures can be combined (SMT) and certified. Finally, he demonstrated the construction of interactive proof assistants for situations where complete automation is not feasible.
In Requirements Models for Critical Systems, CONSTANCE HEITMEYER explained that to capture software system requirements one must elicit, document, and analyze the required externally visible behaviour of the system within some physical context. Moreover, the system requirements are modelled and specified in terms of required system services and environmental assumptions. These lectures provided a systematic formal approach to the modelling, specification, validation, and verification of the requirements of critical systems, focusing on systems that must satisfy critical properties, especially safety and security properties. For a given system, the objective of this approach is to produce a complete, consistent, and correct set of requirements, relying on formal specifications to achieve precision, to avoid ambiguity, and to serve as a basis for many kinds of analyses. Various techniques supporting the creation, analysis, and validation of requirements to improve their quality were discussed, and the presentations were illustrated with representative examples from real-world systems.
From Concurrency Models to Numbers: Performance, Dependability, Energy was taught by HOLGER HERMANNS. Compositional model construction and model checking are two aspects of a modern strategy to guarantee correctness of a system design. In this lecture series, he painted the landscape of behavioural models for probability, time, and cost, with a mild focus on concurrent Markov models and process calculi. He discussed foundational and practical aspects of compositional modelling and model checking of such systems.
TONY HOARE held a lecture on Unifying Models of Data Flow. A unifying model is one that generalises a diverse range of more specific theories, applicable to a range of phenomena in the real world. The original models turn out to be special cases, which are (and will remain) useful in their special areas of application. The similarities among the specific theories are codified by a set of algebraic laws which they all share, and the theories are precisely and concisely differentiated by counterexamples. He showed how such a unifying model can be used to describe many computational phenomena such as memory (weakly or strongly consistent), communication (buffered or unbuffered), sequentiality, concurrency, atomicity, etc.
In his course on Model-based Verification and Analysis for Real-Time Systems, KIM LARSEN explained how model-driven development has been proposed as a way to deal with the increasing complexity of real-time and embedded systems while reducing time and cost to market. The use of models should permit early assessment of the functional correctness of a given design as well as of its resource requirements (e.g. energy, memory, and bandwidth) and its real-time and performance guarantees. He offered a thorough account of the formalism of timed automata due to Alur and Dill as well as of several recent extensions.
DORON PELED held a lecture on Model Checking, a collection of techniques for the algorithmic verification of software and hardware systems. He discussed the modelling of systems as transition systems and the formal specification of properties using linear temporal logic (LTL), computation tree logic (CTL), and Büchi automata. He presented two techniques for model checking: explicit search through the state space, and symbolic model checking based on binary decision diagrams.
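To give a flavour of such specifications (an illustrative textbook example, not taken from the lectures), a typical LTL liveness requirement states that every request is eventually followed by a grant:

\[ \mathbf{G}\,(\mathit{request} \;\rightarrow\; \mathbf{F}\,\mathit{grant}), \]

which a model checker verifies against all executions of the transition system, for instance by constructing a Büchi automaton for the negated formula and checking emptiness of the product.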
JOHN RUSHBY lectured on Formal Methods and Argument-Based Safety Cases. Safety-critical systems must be supplied with strong assurance that they are, indeed, safe. Traditionally, assurance has been based on adherence to standards, but recently the idea of a safety case has been gaining acceptance. A safety case provides an explicit argument that a system is safe to deploy; the notion of "safe" is made precise in suitable claims or goals for the system and its context of deployment, and the argument is intended to substantiate these claims, based on evidence concerning the system and its design and construction. The approach can be applied recursively, so that substantiated claims about a subsystem can be used as evidence in a parent case. Evaluators examine the case and may certify the system if they are persuaded that the claims are appropriate, the evidence is valid, and the argument is correct.
The contributions presented in this volume have emerged from these lectures of the 31st International Summer School held in Marktoberdorf from August 3rd to August 15th, 2010. 103 participants from 30 countries attended – including students, lecturers and staff. The Summer School provided two weeks of learning, discussion, and development of new ideas and was a fruitful event – both at the professional and social level.
We thank all lecturers, staff, and hosts in Marktoberdorf. Special thanks go to Dr. Katharina Spies, Silke Müller, and Katjana Stark for their great support.
The Marktoberdorf Summer School was arranged as an Advanced Study Institute of the NATO Science for Peace and Security Programme with support from the Deutscher Akademischer Austausch Dienst (DAAD), Microsoft Research, and the town and county of Marktoberdorf. We thank all authorities involved.
THE EDITORS