To cover the broad range of topics in engineering dependable software systems, we assembled a group of internationally leading researchers and experts for our NATO Advanced Study Institute Summer School Marktoberdorf 2014. They shared their profound knowledge of and experience in the software engineering of dependable software systems. Our lecturers, representing outstanding research groups from industry as well as universities, presented a detailed overview of current research results, with special emphasis on solving current problems. To convey this big picture in the well-established setting of the ASI Summer School Marktoberdorf, the course was designed as an in-depth presentation and teaching of state-of-the-art scientific techniques and methods, covering research and industrial practice as well as scientific foundations.
RAJEEV ALUR lectured on the topic Syntax-Guided Synthesis. The classical formulation of the program-synthesis problem is to find a program that meets a correctness specification given as a logical formula. Recent work on program synthesis and program optimization illustrates many potential benefits of allowing the user to supplement the logical specification with a syntactic template that constrains the space of allowed implementations. The goal was to identify, in a logical framework, the core computational problem common to these proposals. He described three different instantiations of the counterexample-guided inductive synthesis (CEGIS) strategy for solving the synthesis problem, reported on prototype implementations, and presented experimental results on an initial set of benchmarks.

The lecture held by THOMAS BALL was on Automated Test Generation via SAT Solvers. Satisfiability (SAT) and Satisfiability Modulo Theories (SMT) solvers power a new generation of modelling, verification, and testing tools. In these lectures, he focussed on the use of SAT and SMT solvers in automated test-generation tools, the applications of these tools in industry (specifically the SAGE and Pex tools from Microsoft), and how interpolating SMT solvers can be used to bridge the gap between testing and verification via the process of lazy annotation.
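The CEGIS loop at the core of Alur's lectures can be made concrete with a small sketch. The following toy example is our own illustration, not material from the lectures: it synthesizes a constant c such that f(x) = x + c satisfies the specification f(x) > x, replacing the SMT queries of a real implementation with brute-force search so that the example is self-contained.

```python
# Minimal CEGIS sketch: learn a candidate from the counterexamples seen
# so far, verify it, and feed any new counterexample back into synthesis.

CANDIDATES = range(-10, 11)      # syntactic template: f(x) = x + c
DOMAIN = range(-100, 101)        # finite domain used by the verifier

def spec(x, y):
    """Correctness specification: the output must exceed the input."""
    return y > x

def synthesize(counterexamples):
    """Find a candidate consistent with all counterexamples so far."""
    for c in CANDIDATES:
        if all(spec(x, x + c) for x in counterexamples):
            return c
    return None                  # template space exhausted

def verify(c):
    """Return a counterexample input, or None if the candidate works."""
    for x in DOMAIN:
        if not spec(x, x + c):
            return x
    return None

def cegis():
    counterexamples = []
    while True:
        c = synthesize(counterexamples)
        if c is None:
            return None          # no program fits the template
        cex = verify(c)
        if cex is None:
            return c             # verified on the whole domain
        counterexamples.append(cex)

print(cegis())                   # prints 1 with the ranges above
```

Note that each counterexample the verifier returns is precisely an automatically generated test input that unveils a fault in the current candidate, which is also the perspective of Ball's lectures on solver-based test generation.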
From Actions, Transactions, and Processes to Services was the lecture series by MANFRED BROY. For the problem domain of business process engineering and the solution domain of service-oriented architectures (SOA), he introduced, modelled, and formalized key notions of business processes, such as action, actor, event, business process, business transaction, and service, in a systematic way in terms of a rigorous mathematical system model. The goal was to provide a minimal set of formal modelling concepts that is nevertheless expressive enough to formalize key notions and concepts in business process engineering and service-oriented architectures, capturing also their mutual relationships. The purpose of the approach is manifold: a clear definition of terminology, concepts, terms, and models in business process modelling and SOA; a rigorous formal basis for the specification, design, and development of business processes and, in particular, SOAs; and a strictly formal concept for the development steps leading from business process models to services as part of a SOA-oriented development process.
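To give a flavour of the kind of mathematical system model meant here, one sketch along the lines of Broy's stream-based FOCUS framework (our illustration; the lectures' exact formalization may differ) models an interface behaviour as a function

\[ F : \vec{I} \to \mathcal{P}(\vec{O}) \]

from communication histories $\vec{I}$ on the input channels to sets of communication histories $\vec{O}$ on the output channels; a service can then be understood as such a behaviour that is defined only for those input histories that satisfy the service's assumptions.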
MICHAEL JACKSON lectured on System Behaviours and Problem Frames. A cyber-physical system comprises the machine, executing the software, and the problem world in which the system behaviour is evoked by the machine and will be observed and experienced by the stakeholders. The configuration of these elements constitutes a problem with two aspects: design of a behaviour to satisfy the stakeholders' requirements; and specification of a machine that will ensure that behaviour in the problem world. Realistic systems present two major challenges. First, complexity, due to the proliferation and interaction of features and the related need, in critical systems, for a wide operational envelope. Second, the physicality of the problem world. At the relevant granularities the physical world is non-formal, presenting severe difficulties for reliable operation and for the formal reasoning necessary for system development.
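A compact formulation of these two aspects, standard in Jackson's and Zave's published work on requirements and offered here as background rather than as the lectures' exact notation, is the adequacy argument

\[ W, S \vdash R \]

where $R$ is the required behaviour in the problem world, $W$ expresses the given properties of the problem world, and $S$ is the specification of the machine: the developer must justify that a machine satisfying $S$, operating in a world satisfying $W$, guarantees $R$. The non-formality of the physical world makes the warrant for $W$ the delicate part of any such argument.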
AXEL VAN LAMSWEERDE held a lecture on Dependable Human-Intensive Systems. Human-intensive systems involve complex cooperation processes among software applications, devices, and people, where the human contributions require domain expertise and have a significant impact on the success or failure of the overall mission. Such systems are in general safety-critical, as exemplified by flight control, disaster management, or medical process support systems. The course focussed on medical human-intensive systems, where decisions regulate safety-critical processes within the system. It presented a variety of formal techniques for building and analyzing models of human-intensive systems. A formal language supporting both event-based and state-based specifications was introduced as a process-modelling language enabling suitable analyses. The course discussed formal operators for composing process-model views in a coherent way or, conversely, for decomposing process models into specific views that abstract from details irrelevant to a specific dimension. It also showed how goal-oriented techniques can be applied to declarative models of decision-making tasks to support risk analysis and model restructuring.
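As an illustration of the goal-oriented style referred to here, goals in van Lamsweerde's KAOS framework are typically formalized in a linear temporal logic using patterns such as

\[ \textit{Achieve:}\; C \Rightarrow \Diamond T \qquad \textit{Maintain:}\; C \Rightarrow \Box T \qquad \textit{Avoid:}\; C \Rightarrow \Box \neg T \]

where $C$ is a current condition and $T$ a target condition; risk analysis then asks which obstacles may obstruct such goals. This is background from the published KAOS literature, not necessarily the notation used in the course.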
The topic of JOHN McDERMID's lecture was Development of Safety and Dependability Cases. Many of the systems on which society now depends have the ability to affect safety, either positively or negatively. It is difficult to make such systems safe; it is also difficult to demonstrate that they are safe. The focus of these talks was on demonstrating safety through the use of safety cases, and on broader issues such as dependability. Development activities produce designs; safety analysis provides information about the potential for unsafe behaviour of a design; effective safety management uses the results of analysis to drive development to produce systems that are "safe enough"; safety cases provide the argument (rationale) why (it is believed) the system is "safe enough" to use.

Evolving Software Ecosystems was the title of the lecture given by TOM MENS. Software ecosystems are collections of interacting and communicating software projects developed by the same developer community. Managing their complex interactions and dynamics is an important problem that raises additional challenges compared to the maintenance of individual software systems. He explored the state of the art in the empirical analysis of evolving software ecosystems, with a focus on the evolutionary aspect, drew the analogy between software ecosystems and natural ecosystems, and explored what can be learnt from ecology and biological evolution in order to increase understanding of software ecosystem evolution.
The lecture series given by BERTRAND MEYER was on Automatic Alias Analysis and Frame Inference. Take two expressions e and f, denoting references (pointers), and a program location p. When an execution reaches p, can e and f ever be attached (i.e., point) to a single object? This is the aliasing problem, undecidable like anything else of interest in computer science, but for which we may nevertheless expect good enough solutions. Applications abound, from the general verification of object-oriented programs to frame analysis and even deadlock analysis. The lectures described various forms of the alias calculus and how it has been applied to addressing these problems, as part of an integrated program development and verification environment.
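To make the alias question concrete, here is a deliberately tiny may-alias sketch of our own; it handles only straight-line assignments between simple reference variables, whereas the alias calculus treated in the lectures handles far richer object-oriented programs.

```python
# Toy may-alias analysis: track the set of abstract objects each
# variable may reference across assignments "x = new" (allocation)
# and "x = y" (reference copy). Fields, branches, and loops -- all
# handled by a real alias calculus -- are deliberately omitted.

def may_point_to(program):
    """Map each variable to the abstract objects it may reference."""
    points_to = {}
    next_obj = 0
    for lhs, rhs in program:
        if rhs == "new":         # allocation yields a fresh object
            next_obj += 1
            points_to[lhs] = {next_obj}
        else:                    # copy: lhs may reference whatever rhs may
            points_to[lhs] = set(points_to.get(rhs, set()))
    return points_to

def may_alias(points_to, e, f):
    """Can expressions e and f be attached to a single object here?"""
    return bool(points_to.get(e, set()) & points_to.get(f, set()))

prog = [("a", "new"), ("b", "new"), ("c", "a"), ("b", "c"), ("d", "new")]
pt = may_point_to(prog)
print(may_alias(pt, "a", "b"))   # True: b was copied, via c, from a
print(may_alias(pt, "a", "d"))   # False: d references a distinct object
```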
DORON PELED lectured on the topic Automatic Verification and Synthesis of Finite State Systems. With the advancement of computing power, and the increased insight into formal methods, it is only natural to ask whether one could automate the transformation from specification into working code (or at least, a skeleton of code that can be further adjusted to fit one's architecture and some additional constraints). This series of lectures included a fast tutorial on classical model checking and the state of the art in this area, and introduced the basics of software synthesis, together with some theoretical constraints that are responsible for the fact that programming is still a human rather than a computerized activity.
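As a small, self-contained illustration of the classical model-checking setting that opened the tutorial (a toy of our own, not material from the lectures), an invariant can be checked by exhaustive breadth-first exploration of the reachable states, yielding a shortest counterexample path when the property fails:

```python
from collections import deque

def check_invariant(initial, successors, invariant):
    """Explicit-state check of a safety property: return a shortest
    counterexample path, or None if the invariant always holds."""
    parent = {initial: None}
    queue = deque([initial])
    while queue:
        s = queue.popleft()
        if not invariant(s):
            path = []                    # reconstruct the counterexample
            while s is not None:
                path.append(s)
                s = parent[s]
            return list(reversed(path))
        for t in successors(s):
            if t not in parent:          # visit each state once
                parent[t] = s
                queue.append(t)
    return None

# Example: a counter modulo 8 that is required to stay below 6.
print(check_invariant(0, lambda s: [(s + 1) % 8], lambda s: s < 6))
# -> [0, 1, 2, 3, 4, 5, 6]
```

For realistic systems the state space explodes, which hints at the theoretical constraints mentioned above.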
The course given by ALEXANDER PRETSCHNER was about Fault-Based Testing. Intuition and textbooks suggest that good tests are those that unveil faults. If one accepts that correct programs exist, this means that there are no good tests for correct programs. A modified definition of good test cases therefore demands that tests unveil potential faults, and furthermore do so with good cost-effectiveness. In this set of lectures, he studied the theoretical and empirical fault-detection ability of partition-based testing, specifically coverage-based testing, both when used for test-case selection and for test-case assessment. To demonstrate the applicability of this model, he showed how the theory captures fault models including simple division-by-zero or stuck-at-1 faults, but also limit testing, testing strategies for finite state machines, testing strategies for object-oriented software, performance as well as security testing, common concurrency faults, and combinatorial testing. Finally, he discussed the usefulness of fault models for the derivation of check lists and presented a life-cycle model for fault models.
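A minimal mutation-testing sketch (our generic illustration of the fault-based view, not Pretschner's fault-model formalism) shows how a test suite can be judged by whether it distinguishes a program from a mutant carrying a hypothesized fault:

```python
# Fault-based view of test quality: a good suite "kills" mutants that
# embody hypothesized faults, i.e. it distinguishes them from the
# original program on at least one input.

def original(a, b):
    return a + b

def mutant(a, b):
    return a - b                 # hypothesized fault: operator replacement

def kills(suite, p, m):
    """Does some test input in the suite expose the mutant?"""
    return any(p(a, b) != m(a, b) for a, b in suite)

weak_suite = [(0, 0)]            # 0 + 0 == 0 - 0: the fault stays hidden
strong_suite = [(0, 0), (2, 3)]  # 5 != -1: the fault is unveiled

print(kills(weak_suite, original, mutant))    # False
print(kills(strong_suite, original, mutant))  # True
```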
Electronic UTP: Mechanised Unifying Theories of Programming was the title of JIM WOODCOCK's lecture. Hoare & He's Unifying Theories of Programming (UTP) is a long-term research agenda to understand the relationship between different programming theories and pragmatic programming paradigms. The origin of UTP is in the work on predicative programming that was started by Hehner, and it supports three different ways to study such relationships: (i) by computational paradigm; (ii) by level of abstraction; and (iii) by method of presentation. These lectures formed a tutorial introduction to three key theories in UTP and their implementation in the Isabelle/HOL interactive theorem prover. We described mechanised theories of (i) relations, (ii) designs (pre-postcondition pairs), and (iii) reactive processes; the first two in detail, the last in outline.
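To give a flavour of the second of these theories, the standard UTP definition of a design (quoted from the literature rather than from the lecture notes) combines a precondition $P$ and a postcondition $Q$ using the observational variables $ok$ and $ok'$, which record that the program has started and that it has terminated:

\[ P \vdash Q \;\;\widehat{=}\;\; (ok \land P) \Rightarrow (ok' \land Q) \]

read: if the program is started in a state satisfying $P$, then it terminates, and on termination $Q$ holds.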
PAMELA ZAVE gave a lecture on Compositional Models of Network Architecture. In the future, the Internet will need to support an even wider variety of applications, stakeholders, resources, endpoint devices, communication functions, service guarantees, and resource policies than it does today. Extending the capabilities of a network architecture means adding configuration data, functional mechanisms, operating constraints, and the like, packaged as reusable modules. The first goal of this work was to describe all real networks by specializing and composing modules of this kind. Subsequently, she used the model to study enhanced capabilities, discovering the properties and principles that enable them to be synergistic and non-interfering. Ultimately, a rigorous understanding of compositionality can turn into automated tools for network design, analysis, and implementation.
High-level scientific presentations together with working and learning side by side characterize the Summer School series. Such activities are essential for future scientific results in computer science and, consequently, for the development of large-scale reliable and secure software systems: dependable software systems. The Summer School once more provided two weeks of learning, discussion, and development of new ideas, and was a productive and beneficial event at both the professional and the social level.
We thank all lecturers, the staff, and our hosts in Marktoberdorf. Special thanks go to Dr. Katharina Spies and Silke Müller for their great support.
The Marktoberdorf Summer School was arranged as an Advanced Study Institute of the NATO Science for Peace and Security Programme. We thank all authorities involved.
THE EDITORS