The objective of our Summer School 2007 on Formal Logical Methods for System Security and Correctness was to present the state of the art in the field of proof technology in connection with secure and correct software. The lecturers showed that methods of correct-by-construction program and process synthesis allow a high-level programming methodology more amenable to security and reliability analysis and guarantees. By providing the necessary theoretical background and presenting corresponding application-oriented concepts, the school aimed at an in-depth treatment of such methods, covering both theoretical foundations and industrial practice. In detail, the following courses were given:
GILLES BARTHE lectured on Verification Methods for Software Security and Correctness. The objective of the lectures was to present static enforcement mechanisms to ensure reliability and security of mobile code. First, he introduced a type based verifier for ensuring information flow policies and a verification condition generator for Java bytecode programs. He also described how these mechanisms have been certified using the proof assistant Coq. Second, he related these two enforcement mechanisms to their counterparts for Java programs.
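The flavour of type-based information-flow enforcement can be conveyed by a minimal sketch. The following toy checker (an illustration only, not Barthe's Coq-certified verifier, and all names are invented for this example) assigns each variable a security level and rejects an assignment when information at a high level would flow into a low-level variable; the `pc` parameter stands for the level of the program counter, guarding against implicit flows through branches on secrets.

```python
LOW, HIGH = 0, 1  # security levels; HIGH data must never flow to LOW sinks

def expr_level(expr_vars, policy):
    """Level of an expression = join (max) of its variables' levels."""
    return max((policy[v] for v in expr_vars), default=LOW)

def check_assignment(target, expr_vars, policy, pc=LOW):
    """Check 'target := e(expr_vars)': the target's level must dominate
    both the expression's level and the program-counter level pc
    (the latter rules out implicit flows inside high branches)."""
    return policy[target] >= max(expr_level(expr_vars, policy), pc)

policy = {"pin": HIGH, "display": LOW, "tmp": HIGH}
print(check_assignment("tmp", ["pin"], policy))      # high := high, accepted
print(check_assignment("display", ["pin"], policy))  # leak, rejected
```

A real bytecode verifier additionally tracks levels through the operand stack and the control-flow graph, but the acceptance condition is the same lattice inequality shown here.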
ROBERT CONSTABLE's lectures Logical Foundations of Computer Security were concerned with developing correct-by-construction security protocols for distributed systems on communication networks. He used computational type theory to express logically sound cryptographic services and established them by machine generated formal proofs.
In his course Building a Software Model-Checker, JAVIER ESPARZA introduced jMoped, a tool for the analysis of Java programs, and then explained the theory and algorithms behind it. jMoped assumes that variables have a finite range. He started by considering the computational complexity of verifying different classes of programs satisfying this constraint. After choosing a reasonable class of programs, he introduced a model-checking algorithm based on pushdown automata and then addressed the problem of data, presenting an approach based on BDDs and counterexample-guided abstraction refinement with interpolants.
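The finite-range assumption is what makes exhaustive analysis possible: the program becomes a finite transition system whose reachable states can be computed as a fixed point. The following sketch (an explicit-state stand-in for the BDD-based symbolic representation jMoped actually uses; the toy transition relation is invented for this example) shows the core iteration.

```python
# Toy program over finite-range variables x, y in range(4).
# Transition relation: x' = (x + 1) % 4; y' = (y + x) % 4.
def step(state):
    x, y = state
    return ((x + 1) % 4, (y + x) % 4)

def reachable(init):
    """Fixed-point reachability: iterate the successor function
    until no new states appear (guaranteed to terminate because
    the state space is finite)."""
    seen = set(init)
    frontier = set(init)
    while frontier:
        frontier = {step(s) for s in frontier} - seen
        seen |= frontier
    return seen

states = reachable({(0, 0)})  # 8 of the 16 possible states are reachable
```

A BDD-based tool performs the same iteration on Boolean encodings of whole state sets at once, which is what lets it scale beyond toy ranges.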
With Automatic Refinement and Vacuity Detection for Symbolic Trajectory Evaluation, ORNA GRUMBERG presented a powerful model-checking technique called Symbolic Trajectory Evaluation (STE), which is particularly suitable for hardware. STE is applied to a circuit M, described as a graph over nodes (gates and latches). The specification consists of assertions in a restricted temporal language, of the form A⇒C, where the antecedent A expresses constraints on nodes n at different times t, and the consequent C expresses requirements that should hold on such nodes (n,t). Abstraction in STE is derived from the specification by initializing all inputs not appearing in A to the X (“unknown”) value. A refinement amounts to changing the assertion in order to represent node values more accurately. She described symbolic simulation and the specific type of abstraction used in STE, and proposed a technique for automatic refinement of assertions in case the model checking results in X. In this course the notion of hidden vacuity for STE was defined and several methods for detecting it were suggested.
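The role of the X value can be illustrated with a three-valued gate simulation, a minimal sketch of the lattice STE simulates over (the gate functions below are standard three-valued logic, not code from the lectures).

```python
X = "X"  # STE's 'unknown' value: the input could be either 0 or 1

def and3(a, b):
    """Three-valued AND over {0, 1, X}."""
    if a == 0 or b == 0:
        return 0   # a 0 input decides the output even if the other is X
    if a == 1 and b == 1:
        return 1
    return X       # otherwise the output is unknown

def not3(a):
    return X if a == X else 1 - a

# Inputs not constrained by the antecedent A start out as X.
print(and3(X, 0))  # 0: the consequent can still be checked
print(and3(X, 1))  # X: if C demands a definite value here,
                   #    the assertion needs refinement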
JOHN HARRISON lectured on Automated and Interactive Theorem Proving. He covered a range of topics: Boolean satisfiability checking (SAT), several approaches to first-order automated theorem proving, special methods for equations, decision procedures for important special theories, and interactive proof. He also provided suitable references.
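The SAT portion of such a course typically starts from the DPLL procedure: simplify the clause set under the current partial assignment, propagate unit clauses, and branch. A compact sketch (assuming the usual DIMACS convention of representing literals as signed integers):

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL SAT solver. Clauses are lists of nonzero ints
    (positive = variable true, negative = variable false). Returns
    a satisfying assignment dict {var: bool} or None if unsatisfiable."""
    if assignment is None:
        assignment = {}
    simplified = []
    for clause in clauses:
        if any(assignment.get(abs(l)) == (l > 0) for l in clause):
            continue  # clause already satisfied, drop it
        rest = [l for l in clause if abs(l) not in assignment]
        if not rest:
            return None  # clause falsified: conflict
        simplified.append(rest)
    if not simplified:
        return assignment  # every clause satisfied
    for clause in simplified:
        if len(clause) == 1:  # unit propagation: the literal is forced
            l = clause[0]
            return dpll(clauses, {**assignment, abs(l): l > 0})
    v = abs(simplified[0][0])  # branch on the first unassigned variable
    for value in (True, False):
        result = dpll(clauses, {**assignment, v: value})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x2) and (not x2 or x3)
model = dpll([[1, 2], [-1, 2], [-2, 3]])
```

Modern solvers add clause learning, watched literals and restarts on top of this skeleton, but the correctness argument is already visible here.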
MARTIN HOFMANN gave a series of lectures on Correctness of Effect-based Program Transformations, in which he considered a type system capable of tracking reading, writing and allocation in a higher-order language with dynamically allocated references. He gave a denotational semantics to this type system which allowed him to validate a number of effect-dependent program equivalences in the sense of observational equivalence. Along the way, participants learned techniques such as parametrised logical relations, regions and admissible relations, which belong to the toolbox of researchers in principles of programming languages.
Abstract and Concrete Models of Recursion was the course of MARTIN HYLAND. Systems of information flow are fundamental in computing systems generally and in security protocols in particular. One key issue is feedback (or recursion), for which he developed an approach based on the notion of trace. He covered applications to fixed-point theory, automata theory and topics in the theory of processes.
In his course Security Analysis of Network Protocols JOHN MITCHELL provided an introduction to network protocols that have security requirements. He covered a variety of contemporary security protocols and gave students information needed to carry out case studies using automated tools and formal techniques. The first lectures surveyed protocols and their properties, including secrecy, authentication, key establishment, and fairness. The second part covered standard formal models and tools used in security protocol analysis, and described their advantages and limitations.
With his lectures on The Engineering Challenges of Trustworthy Computing GREG MORRISETT talked about a range of language, compiler, and verification techniques that can be used to address safety and security issues in systems software today. Some of the techniques, such as software fault isolation, are aimed at legacy software and provide relatively weak but important guarantees, and come with significant overhead. Other techniques, such as proof-carrying code, offer the potential of fine-grained protection with low overhead, but introduce significant verification challenges.
The focus of TOBIAS NIPKOW's lecture series Verified Decision Procedures for Linear Arithmetic was on decision procedures for linear arithmetic (only +, no *) and their realization in foundational theorem provers. Although arithmetic was used for concreteness, the course was also a general introduction to how to implement arbitrary decision procedures in foundational provers. The course focused on two well-known quantifier elimination algorithms (and hence decision procedures) for linear arithmetic: Fourier-Motzkin elimination, which is complete for the rationals and reals, and Cooper's method, which is complete for the integers.
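The idea behind Fourier-Motzkin elimination is simple enough to sketch: to eliminate x from a system of linear inequalities, pair every lower bound on x with every upper bound and add the scaled constraints so that x cancels. A small illustration (with constraints represented as coefficient dictionaries, a representation chosen for this example):

```python
from fractions import Fraction

def eliminate(constraints, x):
    """Fourier-Motzkin elimination of variable x. Each constraint is a
    pair (coeffs, bound) meaning sum(coeffs[v] * v) <= bound. Sound and
    complete over the rationals/reals (not over the integers)."""
    lowers, uppers, rest = [], [], []
    for coeffs, b in constraints:
        c = coeffs.get(x, 0)
        if c > 0:
            uppers.append((coeffs, b))   # upper bound on x
        elif c < 0:
            lowers.append((coeffs, b))   # lower bound on x
        else:
            rest.append((coeffs, b))     # x does not occur
    for lc, lb in lowers:
        for uc, ub in uppers:
            # Scale each constraint so the coefficients of x are -1 and
            # +1 respectively, then add: x cancels out.
            l_scale = Fraction(1, -lc[x])
            u_scale = Fraction(1, uc[x])
            coeffs = {v: l_scale * lc.get(v, 0) + u_scale * uc.get(v, 0)
                      for v in set(lc) | set(uc) if v != x}
            rest.append((coeffs, l_scale * lb + u_scale * ub))
    return rest

# From  x - y <= 0  and  -x <= -1  (i.e. 1 <= x <= y), eliminating x
# leaves  -y <= -1,  i.e.  y >= 1.
result = eliminate([({"x": 1, "y": -1}, 0), ({"x": -1}, -1)], "x")
```

Repeating this for every quantified variable yields a quantifier-free formula, which is what makes the procedure a decision method; the lectures' concern was carrying out exactly such constructions inside a foundational prover with a machine-checked correctness proof.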
In his series of lectures Proofs with Feasible Computational Content HELMUT SCHWICHTENBERG considered logical propositions concerning data structures. If such a proposition involves (constructive) existential quantifiers in strictly positive positions, then—according to Brouwer, Heyting and Kolmogorov—it can be seen as a computational problem. A (constructive) proof of the proposition then provides a solution to this problem, and one can machine extract (via a realizability interpretation) this solution in the form of a lambda calculus term, which can be seen as a functional program. He concentrated on the question how to control at the proof level the complexity of the extracted programs.
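As an informal illustration of this extraction idea (not actual output of a proof assistant): from an inductive proof that for all a and all b > 0 there exist q and r with a = q*b + r and r < b, the realizability interpretation yields a recursive program whose structure mirrors the proof.

```python
def div_extract(a, b):
    """The program one could extract from an inductive proof of
    'for all a >= 0, b > 0: exists q, r with a = q*b + r and r < b'.
    The base case a < b realizes the witness (0, a); the step case
    applies the induction hypothesis to a - b."""
    if a < b:
        return (0, a)             # witness for the base case
    q, r = div_extract(a - b, b)  # use of the induction hypothesis
    return (q + 1, r)             # witness for the step case

q, r = div_extract(17, 5)  # yields (3, 2), since 17 = 3*5 + 2
```

Controlling the complexity of such extracted programs at the proof level (here, the linear recursion a proof by measure induction would improve to a logarithmic one) was precisely the theme of the lectures.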
STAN WAINER lectured on Proof Systems, Large Functions and Combinatorics. Proof-theoretic bounding functions turn out to have surprisingly deep connections with finitary combinatorics. These four lectures showed how the power of Peano Arithmetic, and various natural fragments of it, is precisely delineated by variants of the Finite Ramsey Theorem.
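The finitary combinatorics in question can at least be sampled computationally. The classical instance R(3,3) = 6 of the Finite Ramsey Theorem says that every 2-coloring of the edges of K_6 contains a monochromatic triangle, while K_5 admits a coloring avoiding one; a brute-force check (feasible only for such tiny cases, which is the point the lectures' growth-rate analysis makes precise) confirms both claims.

```python
from itertools import combinations, product

def mono_triangle_free_coloring_exists(n):
    """Search all 2-colorings of the edges of the complete graph K_n
    for one with no monochromatic triangle. Exponential in the number
    of edges, so only feasible for very small n."""
    edges = list(combinations(range(n), 2))
    triangles = list(combinations(range(n), 3))
    for colors in product((0, 1), repeat=len(edges)):
        color = dict(zip(edges, colors))
        if not any(color[(a, b)] == color[(a, c)] == color[(b, c)]
                   for a, b, c in triangles):
            return True
    return False

print(mono_triangle_free_coloring_exists(5))  # a good coloring of K_5 exists
print(mono_triangle_free_coloring_exists(6))  # none exists for K_6
```

The functions bounding how large n must be in stronger Ramsey-type statements grow so fast that their totality escapes fragments of Peano Arithmetic, which is the delineation the course established.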
The contributions in this volume have emerged from these lectures of the 28th International Summer School at Marktoberdorf, held from July 31 to August 12, 2007. About 90 participants from 30 countries attended, including students, lecturers and staff. The Summer School provided two weeks of learning, discussion and development of new ideas, and was a fruitful event on both the professional and the social level.
We would like to thank all lecturers, staff, and hosts in Marktoberdorf. Special thanks go to Dr. Katharina Spies, Silke Müller, and Sonja Werner for their great and gentle support.
The Marktoberdorf Summer School was arranged as an Advanced Study Institute of the NATO Science for Peace and Security Programme with support from the town and county of Marktoberdorf and the Deutscher Akademischer Austausch Dienst (DAAD). We thank all authorities involved.
THE EDITORS