Publications

[HPS24]
Jochen Hoenicke, Oded Padon, and Sharon Shoham. On the Power of Temporal Prophecy. In PODELSKI 65. Springer, 2024.
We consider a technique for liveness and termination proofs that reduces temporal proofs to verification conditions in uninterpreted first-order logic based on a suitable notion of a fair abstract lasso. It was previously shown that temporal prophecy makes the technique more powerful, essentially empowering it to prove termination of nested loops. In this paper we show that temporal prophecy is even more powerful than previously demonstrated: it can even prove termination of the Ackermann function, which is not primitive recursive, and essentially represents infinitely-many nested loops.
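For readers unfamiliar with it, the Ackermann function mentioned above has a compact recursive definition; the following sketch (plain Python, purely illustrative and not taken from the paper) shows the function whose termination the technique can prove.
```python
import sys

def ackermann(m: int, n: int) -> int:
    """Standard Ackermann function: total, but not primitive recursive."""
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m - 1, n - 1))

if __name__ == "__main__":
    sys.setrecursionlimit(100000)
    # Only small arguments are feasible: the values grow extremely fast.
    print(ackermann(2, 3))  # 9
    print(ackermann(3, 3))  # 61
```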
[HHS23]
Elisabeth Henkel, Jochen Hoenicke, and Tanja Schindler. Choose Your Colour: Tree Interpolation for Quantified Formulas in SMT. In CADE 2023, pages 248–265. Springer, 2023. [doi | preprint with proofs]
We present a generic tree-interpolation algorithm in the SMT context with quantifiers. The algorithm takes a proof of unsatisfiability using resolution and quantifier instantiation and computes interpolants (which may contain quantifiers). Arbitrary SMT theories are supported, as long as each theory itself supports tree interpolation for its lemmas. In particular, we show this for the theory combination of equality with uninterpreted functions and linear arithmetic. The interpolants can be tweaked by virtually assigning each literal in the proof to interpolation partitions (colouring the literals) in arbitrary ways. The algorithm is implemented in SMTInterpol.
[HS22]
Jochen Hoenicke and Tanja Schindler. A Simple Proof Format for SMT. In SMT 2022, pages 54–70. CEUR-WS.org, 2022. [pdf]
We present a simple resolution-based proof format targeted for SMT. The proof format uses a syntax similar to SMT-LIB terms. The resolution rule is its only proof rule with premises; all other rules are axioms proving tautological clauses. The format aims to be solver-independent by only including a minimal complete set of axioms while still being able to express proofs succinctly. Most of its axioms are purely syntactic; only for arithmetic reasoning, some axioms with side conditions are used for succinct reasoning with linear (in)equalities. The format can be extended with solver-specific rules, which can then either be treated as trusted axioms, or, better, replaced by their low-level proof. The format has been implemented in the solver SMTInterpol and the solver produces proofs for all benchmarks it can solve in the combinations of the theories of equality, linear arithmetic, arrays and datatypes. There is also a stand-alone proof checker for this format.
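As background for the proof format, resolution is the one rule that combines premises; the sketch below is a minimal, generic illustration of propositional resolution in Python and does not reproduce the actual SMT-LIB-style syntax of the format.
```python
from typing import FrozenSet, Tuple

# A literal is (atom, polarity); a clause is a set of literals.
Literal = Tuple[str, bool]
Clause = FrozenSet[Literal]

def resolve(c1: Clause, c2: Clause, pivot: str) -> Clause:
    """Resolve c1 (containing the positive pivot) with c2 (containing the
    negative pivot) and return the resolvent clause."""
    pos, neg = (pivot, True), (pivot, False)
    assert pos in c1 and neg in c2, "pivot must occur with opposite polarity"
    return frozenset((c1 - {pos}) | (c2 - {neg}))

# (p or q) and (not p or r) resolve on p to (q or r).
c1 = frozenset({("p", True), ("q", True)})
c2 = frozenset({("p", False), ("r", True)})
print(sorted(resolve(c1, c2, "p")))  # [('q', True), ('r', True)]
```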
[DHHNP21]
Daniel Dietsch, Matthias Heizmann, Jochen Hoenicke, Alexander Nutz, and Andreas Podelski. Separating Map Variables in a Logic-Based Intermediate Verification Language. In NETYS, pages 169–186. Springer, 2021. [doi]
In SMT solver based verification, the program to be verified is often given in an intermediate verification language such as Boogie. We present a program transformation that aims at splitting mathematical arrays (i.e., maps, which are typically used to model arrays and specifically the heap) into different partitions, so that the resulting verification conditions are easier to solve (since fewer case splits are needed when analysing the effect of reads and writes over the same array). Our method plays a similar role to classical preprocessing steps based on alias analysis; the difference is that it works on any (mathematical) map, as opposed to a data structure that is known to represent a chunk of memory managed by some compiler. Having to forfeit the benefits of general assumptions about memory (e.g., allocate-before-use), we need to deal with additional difficulties but obtain a more general technique. In particular, our technique can be applied to arbitrary programs in the intermediate verification language, including programs that are not directly derived from a program in a production-type programming language, like C or Java. We have implemented a prototypical version of the program transformation in order to demonstrate that it can lead to up to exponential reductions in execution time for the Ultimate software verification tool, despite the cost of performing the initial static analysis.
[PHMPSS21]
Oded Padon, Jochen Hoenicke, Kenneth L. McMillan, Andreas Podelski, Mooly Sagiv, and Sharon Shoham. Temporal Prophecy for Proving Temporal Properties of Infinite-State Systems. Formal Methods in System Design, 57(2):246–269, July 2021. [doi]
Various verification techniques for temporal properties transform temporal verification to safety verification. For infinite-state systems, these transformations are inherently imprecise. That is, for some instances, the temporal property holds, but the resulting safety property does not. This paper introduces a mechanism for tackling this imprecision. This mechanism, which we call temporal prophecy, is inspired by prophecy variables. Temporal prophecy refines an infinite-state system using first-order linear temporal logic formulas, via a suitable tableau construction. For a specific liveness-to-safety transformation based on first-order logic, we show that using temporal prophecy strictly increases the precision. Furthermore, temporal prophecy leads to robustness of the proof method, which is manifested by a cut elimination theorem. We integrate our approach into the Ivy deductive verification system, and show that it can handle challenging temporal verification examples.
[HHS21]
Elisabeth Henkel, Jochen Hoenicke, and Tanja Schindler. Proof Tree Preserving Sequence Interpolation of Quantified Formulas in the Theory of Equality. In SMT 2021, pages 3–16. CEUR-WS.org, 2021. [pdf]
Interpolation of SMT formulas is difficult, especially in the presence of quantifiers since quantifier instantiations introduce mixed terms, i.e., terms containing symbols that belong to different partitions of the input formula. Existing interpolation algorithms for quantified formulas require proof modifications or syntactical restrictions on the generated proof trees. We present a non-restrictive, proof tree preserving approach to compute inductive sequences of interpolants for quantified formulas in the theory of equality with uninterpreted functions using a single proof tree.
[HS21]
Jochen Hoenicke and Tanja Schindler. Incremental Search for Conflict and Unit Instances of Quantified Formulas with E-Matching. In VMCAI 2021, pages 534–555. Springer, 2021. [doi]
We present a new method to find conflicting instances of quantified formulas in the context of SMT solving. Our method splits the search for such instances in two parts. In the first part, E-matching is used to find candidate instances of the quantified formulas. In principle, any existing incremental E-matching technique can be used. The incrementality avoids duplicating work for each small change of the E-graph. Together with the candidate instance, E-matching also provides an existing node in the E-graph corresponding to each term in this instance. In the second part, these nodes are used to evaluate the candidate instance, i.e., without creating new terms. The evaluation can be done in constant time per instance. Our method detects conflicting instances and unit-propagating instances (clauses that propagate new literals). This makes our method suitable for a tight integration with the DPLL(T) framework, very much in the style of an additional theory solver.
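To give a flavour of the second phase, the following toy sketch (plain Python; the class and function names are hypothetical and not the paper's data structures) evaluates an instantiated clause of equality literals against a congruence-closure-style E-graph and classifies it as conflicting, unit-propagating, or neither.
```python
from typing import Dict, List, Tuple

class EGraph:
    """Toy E-graph: a union-find over ground terms plus asserted disequalities."""

    def __init__(self) -> None:
        self.parent: Dict[str, str] = {}
        self.disequalities: List[Tuple[str, str]] = []

    def find(self, term: str) -> str:
        self.parent.setdefault(term, term)
        while self.parent[term] != term:
            self.parent[term] = self.parent[self.parent[term]]  # path halving
            term = self.parent[term]
        return term

    def merge(self, s: str, t: str) -> None:
        self.parent[self.find(s)] = self.find(t)

    def assert_disequal(self, s: str, t: str) -> None:
        self.disequalities.append((s, t))

    def eval_equality(self, s: str, t: str) -> str:
        """Return 'true', 'false', or 'unknown' for the equality s = t."""
        if self.find(s) == self.find(t):
            return "true"
        for a, b in self.disequalities:  # linear here; a real solver is faster
            if {self.find(a), self.find(b)} == {self.find(s), self.find(t)}:
                return "false"
        return "unknown"

def classify_instance(egraph: EGraph,
                      clause: List[Tuple[str, str, bool]]) -> str:
    """Classify an instantiated clause of literals (s, t, positive), where
    positive=True means the literal s = t and positive=False means s != t."""
    undetermined = 0
    for s, t, positive in clause:
        value = egraph.eval_equality(s, t)
        if value == "unknown":
            undetermined += 1
        elif (value == "true") == positive:
            return "satisfied"        # some literal already holds
    if undetermined == 0:
        return "conflict"             # every literal evaluates to false
    if undetermined == 1:
        return "unit"                 # propagates the remaining literal
    return "undecided"

# Example: the E-graph knows f(a) = b and b != c.
eg = EGraph()
eg.merge("f(a)", "b")
eg.assert_disequal("b", "c")
# Candidate instance of a quantified clause:  f(a) != b  \/  f(a) = d
print(classify_instance(eg, [("f(a)", "b", False), ("f(a)", "d", True)]))  # unit
```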
[LDWHP19]
Vincent Langenfeld, Daniel Dietsch, Bernd Westphal, Jochen Hoenicke, and Amalinda Post. Scalable Analysis of Real-Time Requirements. In RE 2019, pages 234–244. IEEE, 2019. [doi]
Detecting issues in real-time requirements is usually a trade-off between flexibility and cost: the effort expended depends on how expensive it is to fix a defect introduced by faulty, ambiguous or incomplete requirements. The most rigorous techniques for real-time requirement analysis depend on the formalisation of these requirements. Completely formalised real-time requirements allow the detection of issues that are hard to find through other means, like real-time inconsistency (i.e., “do the requirements lead to deadlocks and starvation of the system?”) or vacuity (i.e., “are some requirements trivially satisfied?”). Current analysis techniques for real-time requirements suffer from scalability issues: larger sets of such requirements are usually intractable. We present a new technique to analyse formalised real-time requirements for various properties. Our technique leverages recent advances in software model checking and automatic theorem proving by converting the analysis problem for real-time requirements to a program analysis task. We also report preliminary results from an ongoing, large scale application of our technique in the automotive domain at Bosch.
[HS19]
Jochen Hoenicke and Tanja Schindler. Solving and Interpolating Constant Arrays Based on Weak Equivalences. In VMCAI 2019, pages 297–317. Springer, 2019. [doi]
We present a new solver and interpolation algorithm for the theory of arrays with constant arrays. It is based on our previous work on weakly equivalent arrays. Constant arrays store the same value at every index, which is useful for model checking of programs with initialised memory. Instead of using a store chain to explicitly initialise the memory, using a constant array can considerably simplify the queries and thus reduce the solving and interpolation time. We show that only a few new rules are required for constant arrays and prove the correctness of the decision procedure and the interpolation procedure. We implemented the algorithm in our interpolating solver SMTInterpol.
[DHHNP19]
Daniel Dietsch, Matthias Heizmann, Jochen Hoenicke, Alexander Nutz, and Andreas Podelski. Ultimate TreeAutomizer (CHC-COMP Tool Description). In HCVS/PERR@ETAPS 2019, volume 296 in EPTCS, pages 42–47. 2019. [doi | pdf]
We present Ultimate TreeAutomizer, a solver for satisfiability of sets of constrained Horn clauses. Constrained Horn clauses (CHC) are a fragment of first order logic with attractive properties in terms of expressiveness and accessibility to algorithmic solving. Ultimate TreeAutomizer is based on the techniques of trace abstraction, tree automata and tree interpolation. This paper serves as a tool description for TreeAutomizer in CHC-COMP 2019.
[HS19a]
Jochen Hoenicke and Tanja Schindler. Interpolation and the Array Property Fragment. In CoRR abs/1904.11381. arXiv, 2019. [pdf]
Interpolation based software model checkers have been successfully employed to automatically prove programs correct. Their power comes from interpolating SMT solvers that check the feasibility of potential counterexamples and compute candidate invariants, otherwise. This approach works well for quantifier-free theories, like equality theory or linear arithmetic. For quantified formulas, there are SMT solvers that can decide expressive fragments of quantified formulas, e. g., EPR, the array property fragment, and the finite almost uninterpreted fragment. However, these solvers do not support interpolation. It is already known that in general EPR does not allow for interpolation. In this paper, we show the same result for the array property fragment.
[PHMPSS18]
Oded Padon, Jochen Hoenicke, Kenneth L. McMillan, Andreas Podelski, Mooly Sagiv, and Sharon Shoham. Temporal Prophecy for Proving Temporal Properties of Infinite-State Systems. In FMCAD 18. IEEE, 2018. [doi | pdf]
Various verification techniques for temporal properties transform temporal verification to safety verification. For infinite-state systems, these transformations are inherently imprecise. That is, for some instances, the temporal property holds, but the resulting safety property does not. This paper introduces a mechanism for tackling this imprecision. This mechanism, which we call temporal prophecy, is inspired by prophecy variables. Temporal prophecy refines an infinite-state system using first-order linear temporal logic formulas, via a suitable tableau construction. For a specific liveness-to-safety transformation based on first-order logic, we show that using temporal prophecy strictly increases the precision. Furthermore, temporal prophecy leads to robustness of the proof method, which is manifested by a cut elimination theorem. We integrate our approach into the Ivy deductive verification system, and show that it can handle challenging temporal verification examples.
[DHHNP18]
Daniel Dietsch, Matthias Heizmann, Jochen Hoenicke, Alexander Nutz, and Andreas Podelski. The Map Equality Domain. In VSTTE 18, pages 291–308. Springer, 2018. [doi]
We present a method that allows us to infer expressive invariants for programs that manipulate arrays and, more generally, data that are modeled using maps (including the program memory which is modeled as a map over integer locations). The invariants can express, for example, that memory cells have changed their contents only at locations that have not been previously allocated by another procedure. The motivation for the new method stems from the fact that, although state-of-the-art SMT solvers are starting to be able to check the validity of more and more complex invariants, there is not much work yet on their automatic inference. We present our method as a static analysis over an abstract domain that we introduce, the map equality domain. The main challenge in the design of the method lies in scalability; given the expressiveness of the invariants, it is a priori not clear that a corresponding static analysis can be made scalable. Preliminary experiments with a prototypical implementation of the method allow us to cautiously conclude that this may indeed be the case.
[HNP18]
Jochen Hoenicke, Alexander Nutz, and Andreas Podelski. A Tree-based Approach to Data Flow Proofs. In VSTTE 18, pages 1–16. Springer, 2018. [doi]
In this paper, we investigate the theoretical foundation for the cost/precision trade-off of data flow graphs for verification. We show that one can use the theory of tree automata in order to characterize the loss of precision inherent in the abstraction of a program by a data flow graph. We also show that one can transfer a result of Oh et al. and characterize the power of the proof system of data flow proofs (through a restriction on the assertion language in Floyd-Hoare proofs).
[HS18]
Jochen Hoenicke and Tanja Schindler. Efficient Interpolation for the Theory of Arrays. In IJCAR 18, pages 549–565. Springer, 2018. [doi] The author's version is available at arXiv.
Existing techniques for Craig interpolation for the quantifier-free fragment of the theory of arrays are inefficient for computing sequence and tree interpolants: the solver needs to run for every partitioning (A,B) of the interpolation problem to avoid creating AB-mixed terms. We present a new approach using Proof Tree Preserving Interpolation and an array solver based on Weak Equivalence on Arrays. We give an interpolation algorithm for the lemmas produced by the array solver. The computed interpolants have worst-case exponential size for extensionality lemmas and worst-case quadratic size otherwise. We show that these bounds are strict in the sense that there are lemmas with no smaller interpolants. We implemented the algorithm and show that the produced interpolants are useful to prove memory safety for C programs.
[DGHHNPSS18]
Daniel Dietsch, Marius Greitschus, Matthias Heizmann, Jochen Hoenicke, Alexander Nutz, Andreas Podelski, Christian Schilling, and Tanja Schindler. Ultimate Taipan with Dynamic Block Encoding – (Competition Contribution). In TACAS (2) 18, pages 452–456. Springer, 2018. [doi]
Ultimate Taipan is a software model checker that uses trace abstraction and abstract interpretation to prove correctness of programs. In contrast to previous versions, Ultimate Taipan now uses dynamic block encoding to obtain the best precision possible when evaluating transition formulas of large block encoded programs.
[HCDGHLNMSSP18]
Matthias Heizmann, Yu-Fang Chen, Daniel Dietsch, Marius Greitschus, Jochen Hoenicke, Yong Li, Alexander Nutz, Betim Musa, Christian Schilling, Tanja Schindler, and Andreas Podelski. Ultimate Automizer and the Search for Perfect Interpolants – (Competition Contribution). In TACAS (2) 18, pages 447–451. Springer, 2018. [doi]
Ultimate Automizer is a software verifier that generalizes proofs for traces to proofs for larger parts of the program. In recent years the portfolio of proof producers that are available to Ultimate has grown continuously. This is not only because more trace analysis algorithms have been implemented in Ultimate but also due to the continuous progress in the SMT community. In this paper we explain how Ultimate Automizer dynamically selects trace analysis algorithms and how the tool decides when proofs for traces are “good” enough for using them in the abstraction refinement.
[PHLPS18]
Oded Padon, Jochen Hoenicke, Giuliano Losa, Andreas Podelski, Mooly Sagiv, and Sharon Shoham. Reducing Liveness to Safety in First-order Logic. In PACMPL 2 (POPL), pages 26:1–26:33. ACM, 2018. [doi]

We develop a new technique for verifying temporal properties of infinite-state (distributed) systems. The main idea is to reduce the temporal verification problem to the problem of verifying the safety of infinite-state systems expressed in first-order logic. This allows us to leverage existing techniques for safety verification to verify temporal properties of interesting distributed protocols, including some that have not been mechanically verified before. We model infinite-state systems using first-order logic, and use first-order temporal logic (FO-LTL) to specify temporal properties. This general formalism allows us to naturally model distributed systems, while supporting both unbounded parallelism (where the system is allowed to dynamically create processes) and an infinite state space per process.

The traditional approach for verifying temporal properties of infinite-state systems employs well-founded relations (e.g. using linear arithmetic ranking functions). In contrast, our approach is based on the idea of fair cycle detection. In finite-state systems, temporal verification can always be reduced to fair cycle detection (a system contains a fair cycle if it revisits a state after satisfying all fairness constraints). However, with both infinitely many states and infinitely many fairness constraints, a straightforward reduction to fair cycle detection is unsound. To regain soundness, we augment the infinite-state transition system with a dynamically computed finite set that exploits the locality of transitions. This set lets us define a form of fair cycle detection that is sound in the presence of both infinitely many states and infinitely many fairness constraints. Our approach allows a new style of temporal verification that does not explicitly involve ranking functions. This fits well with pure first-order verification, which does not explicitly reason about numerical values. In particular, it can be used with effectively propositional first-order logic (EPR), in which case checking verification conditions is decidable. We applied our technique to verify temporal properties of several interesting protocols. To the best of our knowledge, we have obtained the first mechanized liveness proof for both TLB Shootdown and Stoppable Paxos.

[HMP17]
Jochen Hoenicke, Rupak Majumdar, and Andreas Podelski. Thread Modularity at Many Levels: a pearl in compositional verification. In POPL 17, pages 473–485. ACM, 2017. [doi]
A thread-modular proof for the correctness of a concurrent program is based on an inductive and interference-free annotation of each thread. It is well-known that the corresponding proof system is not complete (unless one adds auxiliary variables). We describe a hierarchy of proof systems where each level k corresponds to a generalized notion of thread modularity (level 1 corresponds to the original notion). Each level is strictly more expressive than the previous. Further, each level precisely captures programs that can be proved using uniform Ashcroft invariants with k universal quantifiers. We demonstrate the usefulness of the hierarchy by giving a compositional proof of the Mach shootdown algorithm for TLB consistency. We give a proof at level 2 that shows the algorithm is correct for an arbitrary number of CPUs. However, there is no proof for the algorithm at level 1 that does not involve auxiliary state.
[CH16]
Jürgen Christ and Jochen Hoenicke. Proof Tree Preserving Tree Interpolation. Journal of Automated Reasoning, 57(1):67–95, June 2016. [doi]
Craig interpolation has a wide range of applications in model checking, verification, and state space abstraction. Recent advances use a more general version of interpolation called tree interpolation. In this paper, we present a method to extract tree interpolants from a proof tree generated by an SMT solver without modifying the proof tree. The framework is general with respect to the theories involved. We instantiate the framework to the combination of the theories of uninterpreted functions and linear arithmetic.
[HP15]
Jochen Hoenicke and Andreas Podelski. Fairness for Infinitary Control. In Correct System Design – Symposium in Honor of Ernst-Rüdiger Olderog. Springer, 2015. [doi]
In 1988, Olderog and Apt developed a fair scheduler for a system with finitely many processes based on the concept of explicit scheduling. In 2010, Hoenicke, Olderog, and Podelski extended the fair scheduler from static to dynamic control. In systems with dynamic control, processes can be created dynamically. Thus, the overall number of processes can be infinite, but the number of created processes is finite at each step of an execution of the system. In this paper we extend the fair scheduler to infinitary control. In systems with infinitary control, the number of created processes can be infinite. The fair scheduler for infinitary control is perhaps interesting for its apparent unfairness: instead of treating all processes equally, the scheduler discriminates each process against finitely many other processes. However, it also privileges each process over infinitely many other processes (in fact, all but finitely many).
[FHHKP15]
Azadeh Farzan, Matthias Heizmann, Jochen Hoenicke, Zachary Kincaid, and Andreas Podelski. Automated Program Verification. In LATA 2015. Springer, 2015. [doi]
A new approach to program verification is based on automata. The notion of automaton depends on the verification problem at hand (nested word automata for recursion, Büchi automata for termination, a form of data automata for parametrized programs, etc.). The approach is to first construct an automaton for the candidate proof and then check its validity via automata inclusion. The originality of the approach lies in the construction of an automaton from a correctness proof of a given sequence of statements. A sequence of statements is at the same time a word over a finite alphabet and it is (a very simple case of) a program. Just as we ask whether a word has an accepting run, we can ask whether a sequence of statements has a correctness proof (of a certain form). The automaton accepts exactly the sequences that do.
[CH15]
Jürgen Christ and Jochen Hoenicke. Cutting the Mix. In CAV 2015. Springer, 2015. [doi]
While linear arithmetic has been studied in the context of SMT individually for reals and integers, mixed linear arithmetic allowing comparisons between integer and real variables has not received much attention. For linear integer arithmetic, the cuts from proofs algorithm has proven to have superior performance on many benchmarks. In this paper we extend this algorithm to the mixed case where real and integer variables occur in the same linear constraint. Our algorithm allows for an easy integration into existing SMT solvers. Experimental evaluation of our prototype implementation inside the SMT solver SMTInterpol shows that this algorithm is successful on benchmarks that are hard for all existing solvers.
[CH14]
Jürgen Christ and Jochen Hoenicke. Weakly Equivalent Arrays. In FroCoS 2015. Springer, 2015. [doi] The author's version is available at arXiv.
The (extensional) theory of arrays is widely used to model systems. Hence, efficient decision procedures are needed to model check such systems. In this paper, we present an efficient decision procedure for the theory of arrays. We build upon the notion of weak equivalence. Intuitively, two arrays are weakly equivalent if they only differ at finitely many indices. We formalise this notion and show how to exploit weak equivalences to decide formulas in the quantifier-free fragment of the theory of arrays. We present a novel data structure to represent all weak equivalence classes induced by a formula in linear space (in the number of array terms). Experimental evidence shows that this technique is competitive with other approaches.
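As a rough illustration of the underlying idea (not the paper's actual data structure), the sketch below connects array terms by store edges labelled with the written index; arrays in the same connected component are weakly equivalent, and the collected edge labels bound the indices at which they may differ. All names are hypothetical.
```python
from typing import Dict, List, Optional, Set, Tuple

class WeakEquivalenceGraph:
    """Toy graph of array terms: every term b = store(a, i, v) contributes an
    undirected edge between a and b labelled with the index i."""

    def __init__(self) -> None:
        self.edges: Dict[str, List[Tuple[str, str]]] = {}

    def add_store(self, a: str, b: str, index: str) -> None:
        self.edges.setdefault(a, []).append((b, index))
        self.edges.setdefault(b, []).append((a, index))

    def weak_path_indices(self, a: str, b: str) -> Optional[Set[str]]:
        """Return the indices labelling some path from a to b, or None if the
        two arrays are not weakly equivalent.  Outside the returned indices
        the arrays must store the same values."""
        seen = {a}
        stack: List[Tuple[str, Set[str]]] = [(a, set())]
        while stack:
            node, indices = stack.pop()
            if node == b:
                return indices
            for succ, idx in self.edges.get(node, []):
                if succ not in seen:
                    seen.add(succ)
                    stack.append((succ, indices | {idx}))
        return None

g = WeakEquivalenceGraph()
g.add_store("a", "a1", "i")    # a1 = store(a, i, v1)
g.add_store("a1", "a2", "j")   # a2 = store(a1, j, v2)
print(g.weak_path_indices("a", "a2"))  # {'i', 'j'}: a and a2 agree outside i and j
print(g.weak_path_indices("a", "b"))   # None: not weakly equivalent
```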
[HHP14]
Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Termination Analysis by Learning Terminating Programs. In CAV 2014. Springer, 2014. [doi]
We present a novel approach to termination analysis. In a first step, the analysis uses a program as a black-box which exhibits only a finite set of sample traces. Each sample trace is infinite but can be represented by a finite lasso. The analysis can "learn" a program from a termination proof for the lasso, a program that is terminating by construction. In a second step, the analysis checks that the set of sample traces is representative in a sense that we can make formal. An experimental evaluation indicates that the approach is a potentially useful addition to the portfolio of existing approaches to termination analysis.
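The finite lassos mentioned above admit a very small representation; the sketch below (plain Python, illustrating only the terminology rather than the analysis itself) represents an infinite sample trace by a finite stem followed by a loop that repeats forever.
```python
from dataclasses import dataclass
from itertools import chain, cycle, islice
from typing import Iterator, List

@dataclass
class Lasso:
    """A finite representation of an infinite trace: a finite stem followed
    by a finite loop that repeats forever."""
    stem: List[str]
    loop: List[str]

    def trace(self) -> Iterator[str]:
        return chain(self.stem, cycle(self.loop))

# An infinite sample trace of a simple counting loop, as a lasso.
sample = Lasso(stem=["x := 0"], loop=["assume x < n", "x := x + 1"])
print(list(islice(sample.trace(), 7)))  # first 7 statements of the trace
```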
[HHLP13]
Matthias Heizmann, Jochen Hoenicke, Jan Leike, and Andreas Podelski. Linear Ranking for Linear Lasso Programs. In ATVA 2013, volume 8172 in LNCS, pages 365–380. Springer, 2013. [doi] The author's version is available at arXiv.
[HHP13]
Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Software Model Checking for People Who Love Automata. In CAV 2013, volume 8044 in LNCS, pages 36–52. Springer, 2013. Invited tutorial. [doi]
In this expository paper, we use automata for software model checking in a new way. The starting point is to fix the alphabet: the set of statements of the given program. We show how automata over the alphabet of statements can help to decompose the main problem in software model checking, which is to find the right abstraction of a program for a given correctness property.
[CH13]
Jürgen Christ and Jochen Hoenicke. Extending Proof Tree Preserving Interpolation to Sequences and Trees. In SMT 2013. 2013. [pdf]
We present ongoing work to extend proof tree preserving interpolation to inductive sequences and tree interpolants. We present an algorithm to compute tree interpolants inductively over a resolution proof. Correctness of the algorithm is justified by the concept of partial tree interpolants and the appropriate definition of a projection function for conjunctions of literals onto nodes of the tree. We observe great similarities between the interpolation rules for binary interpolation and those for tree interpolation.
[CHN13]
Jürgen Christ, Jochen Hoenicke, and Alexander Nutz. Proof Tree Preserving Interpolation. In TACAS 13, volume 7795 in LNCS, pages 126–140. Springer, 2013. [doi | technical report | improved version] The author's version is available at arXiv.
Craig interpolation in SMT is difficult because, e. g., theory combination and integer cuts introduce mixed literals, i. e., literals containing local symbols from both input formulae. In this paper, we present a scheme to compute Craig interpolants in the presence of mixed literals. Contrary to existing approaches, this scheme neither limits the inferences done by the SMT solver, nor does it transform the proof tree before extracting interpolants. Our scheme works for the combination of uninterpreted functions and linear arithmetic but is extendable to other theories. The scheme is implemented in the interpolating SMT solver SMTInterpol.
[CHN12]
Jürgen Christ, Jochen Hoenicke, and Alexander Nutz. SMTInterpol: An Interpolating SMT Solver. In Model Checking Software, volume 7385 in LNCS, pages 248–254. Springer, 2012. [doi | pdf]
Craig interpolation is an active research topic and has become a powerful technique in verification. We present SMTInterpol, an interpolating SMT solver for the quantifier-free fragment of the combination of the theory of uninterpreted functions and the theory of linear arithmetic over integers and reals. SMTInterpol is SMT-LIB 2 compliant and available under an open source software license (LGPL v3).
[PH12]
Amalinda Post and Jochen Hoenicke. Formalization and Analysis of Real-Time Requirements: a Feasibility Study at BOSCH. In VSTTE 12, pages 225–240. Springer, 2012. [doi | pdf]
In this paper, we evaluate a tool chain to algorithmically analyze real-time requirements. According to this tool chain, one formalizes the requirements in a natural-language pattern system. The requirements can then be automatically compiled into formulas in a real-time logic. The formulas can be checked automatically for properties whose violation indicates an error in the requirements specification (the properties considered are: consistency, rt-consistency, vacuity). We report on a feasibility study in the context of several automotive projects at Bosch. The results of the study indicate that the effort for the formalization of real-time requirements is acceptable; the analysis algorithms are computationally feasible; the benefit (the detection of specification errors or, respectively, the formal guarantee of their absence) seems significant.
[EHP12]
Evren Ermis, Jochen Hoenicke, and Andreas Podelski. Splitting via Interpolants. In VMCAI 12, pages 186–201. Springer, 2012. [doi | pdf]
A common problem in software model checking is the automatic computation of accurate loop invariants. Loop invariants can be derived from interpolants for every path leading through the corresponding loop header. However, in practice, the consideration of single paths often leads to very path specific interpolants. Inductive invariants can only be derived after several iterations by also taking previous interpolants into account. In this paper, we introduce a software model checking approach that uses the concept of path insensitive interpolation to compute loop invariants. In contrast to current approaches, path insensitive interpolation summarizes several paths through a program location instead of one. As a consequence, the abstraction refinement needs considerably less effort to obtain an adequate interpolant. First experiments show the potential of our approach.
[PHP11b]
Amalinda Post, Jochen Hoenicke, and Andreas Podelski. Vacuous real-time requirements. In RE 11, pages 153–162. IEEE, 2011. [doi | pdf]
We introduce the property of vacuity for requirements. A requirement is vacuous in a set of requirements if it is equivalent to a simpler requirement in the context of the other requirements. For example, the requirement “if A then B” is vacuous together with the requirement “not A”. The existence of a vacuous requirement is likely to indicate an error. We give an algorithm that proves the absence of this kind of error for real-time requirements. A case study in an industrial context demonstrates the practical potential of the algorithm.
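The example from the abstract can be checked mechanically; the sketch below (a plain-Python truth-table check, not the paper's real-time algorithm) confirms that in the context of the requirement “not A”, the requirement “if A then B” is equivalent to the trivially true requirement and is therefore vacuous.
```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    return (not a) or b

# Context: the other requirement "not A".  In every state satisfying the
# context, "if A then B" evaluates to true, i.e. it is equivalent to the
# simpler requirement "true" and hence vacuous.
vacuous = all(
    implies(a, b)
    for a, b in product([False, True], repeat=2)
    if not a  # restrict to states satisfying the context "not A"
)
print(vacuous)  # True
```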
[PHP11]
Amalinda Post, Jochen Hoenicke, and Andreas Podelski. rt-inconsistency: a new property for real-time requirements. In FASE 2011, volume 6603 in LNCS, pages 34–49. Springer, 2011. [doi | pdf]
We introduce rt-inconsistency, a property of real-time requirements. The property reflects that the requirements specify apparently inconsistent timing constraints. We present an algorithm to check rt-inconsistency automatically. The algorithm works via a stepwise reduction to real-time model checking. We implement the algorithm using an existing module for the reduction and the UPPAAL tool for the real-time model checking. As a case study, we apply our prototype implementation to existing real-time requirements for automotive projects at Bosch. The case study demonstrates the relevance of rt-inconsistency for detecting errors in industrial real-time requirements specifications.
[HLPSW10]
Jochen Hoenicke, K. Rustan M. Leino, Andreas Podelski, Martin Schäf, and Thomas Wies. Doomed Program Points. Formal Methods in System Design, 37(2–3):171–199, December 2010. [doi]
Any programming error that can be revealed before compiling a program saves precious time for the programmer. While integrated development environments already do a good job by detecting, e.g., data-flow abnormalities, current static analysis tools suffer from false positives ("noise") or require strong user interaction. We propose to avoid this deficiency by defining a new class of errors. A program fragment is doomed if its execution will inevitably fail, regardless of which state it is started in. We use a formal verification method to identify such errors fully automatically and, most significantly, without producing noise. We report on experiments with a prototype tool.
[HMO10]
Jochen Hoenicke, Roland Meyer, and Ernst-Rüdiger Olderog. Kleene, Rabin, and Scott are available. In CONCUR 2010, volume 6269 in LNCS, pages 462–477. Springer, 2010. [doi | pdf]
We are concerned with the availability of systems, defined as the ratio between time of correct functioning and uptime. We propose to model guaranteed availability in terms of regular availability expressions (rae) and availability automata. We prove that the intersection problem of rae is undecidable. We establish a Kleene theorem that shows the equivalence of the formalisms and states a precise correspondence between flat rae and simple availability automata. For these automata, we provide an extension of the powerset construction for finite automata due to Rabin and Scott. As a consequence, we can state a complementation algorithm. This enables us to solve the synthesis problem and to reduce model checking of availability properties to reachability.
[HOP10]
Jochen Hoenicke, Ernst-Rüdiger Olderog, and Andreas Podelski. Fairness for Dynamic Control. In TACAS 2010, volume 6015 in LNCS, pages 251–265. Springer, 2010. [doi | pdf]
Already in Lamport's bakery algorithm, integers are used for fair schedulers of concurrent processes. In this paper, we present the extension of a fair scheduler from “static control” (the number of processes is fixed) to “dynamic control” (the number of processes changes during execution). We believe that our results shed new light on the concept of fairness in the setting of dynamic control.
[HHP10]
Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Nested Interpolants. In POPL 10, pages 471–482. ACM, 2010. [doi | pdf]
In this paper, we explore the potential of the theory of nested words for partial correctness proofs of recursive programs. Our conceptual contribution is a simple framework that allows us to shine a new light on classical concepts such as Floyd/Hoare proofs and predicate abstraction in the context of recursive programs. Our technical contribution is an interpolant-based software model checking method for recursive programs. The method avoids the costly construction of the abstract transformer by constructing a nested word automaton from an inductive sequence of “nested interpolants” (i.e., interpolants for a nested word which represents an infeasible error trace).
[HLPSW09]
Jochen Hoenicke, K. Rustan M. Leino, Andreas Podelski, Martin Schäf, and Thomas Wies. It's Doomed; we can Prove it. In FM 2009, volume 5850 in LNCS, pages 338–353. Springer, 2009. [doi | pdf]
Programming errors found early are the cheapest. Tools applying to the early stage of code development exist but either they suffer from false positives (“noise”) or they require strong user interaction. We propose to avoid this deficiency by defining a new class of errors. A program fragment is doomed if an execution that reaches it, in whatever state, will inevitably fail. We use a formal verification method to identify such errors fully automatically and, most significantly, without producing noise. We report on preliminary experiments with a prototypical tool.
[HHP09]
Matthias Heizmann, Jochen Hoenicke, and Andreas Podelski. Refinement of Trace Abstraction. In Static Analysis Symposium (SAS 2009), volume 5673 in LNCS, pages 69–85. Springer, 2009. [doi | pdf]
Automatic refinement of abstraction is an active research theme in static analysis. In existing approaches, calls to theorem provers (for the construction of a sequence of increasingly precise state abstractions) represent an obstacle to scalability. We present an approach that aims at the reuse of theorem prover calls. The abstraction is based not on states but on traces. Each refinement extends a tuple of automata with another automaton; this automaton is derived from the interpolant-based analysis of a spurious counterexample.
[MFHR08]
Roland Meyer, Johannes Faber, Jochen Hoenicke, and Andrey Rybalchenko. Model Checking Duration Calculus: a practical approach. Formal Aspects of Computing, 20(4–5):481–505, July 2008. [doi]
Model checking of real-time systems against Duration Calculus (DC) specifications requires the translation of DC formulae into automata-based semantics. The existing algorithms provide a limited DC coverage and do not support compositional verification. We propose a translation algorithm that advances the applicability of model checking tools to realistic applications. Our algorithm significantly extends the subset of DC that can be checked automatically. The central part of the algorithm is the automatic decomposition of DC specifications into sub-properties that can be verified independently. The decomposition is based on a novel distributive law for DC. We implemented the algorithm in a tool chain for the automated verification of systems comprising data, communication, and real-time aspects. We applied the tool chain to verify safety properties in an industrial case study from the European Train Control System (ETCS).
[HM05]
Jochen Hoenicke and Patrick Maier. Model-Checking of Specifications Integrating Processes, Data and Time. In FM 2005, volume 3582 in LNCS, pages 465–480. Springer, 2005. [doi | pdf]
We present a new model-checking technique for CSP-OZ-DC, a combination of CSP, Object-Z and Duration Calculus, that allows reasoning about systems exhibiting communication, data and real-time aspects. As intermediate layer we will use a new kind of timed automata that preserve events and data variables of the specification. These automata have a simple operational semantics that is amenable to verification by a constraint-based abstraction-refinement model checker. By means of a case study, a simple elevator parameterised by the number of floors, we show that this approach admits model-checking parameterised and infinite state real-time systems.
[HO02b]
Jochen Hoenicke and Ernst-Rüdiger Olderog. CSP-OZ-DC: A Combination of Specification Techniques for Processes, Data and Time. Nordic Journal of Computing, 9(4):301–334, 2002. Appeared March 2003. [pdf]
CSP-OZ-DC is a new combination of three well researched formal techniques for the specification of processes, data and time: CSP [Hoa85], Object-Z [Smi00], and Duration Calculus [ZHR91]. This combination is illustrated by specifying the train controller of a case study on radio controlled railway crossings. The technical contribution of the paper is a smooth integration of the underlying semantic models and its use for verifying timing properties of CSP-OZ-DC specifications. This is done by combining the model-checkers FDR [Ros94] for CSP and UPPAAL [BLL97] for timed automata with a new tool f2u that transforms FDR transition systems and certain patterns of Duration Calculus formulae into timed automata. This approach is illustrated by the example of a vending machine.
[HO02a]
Jochen Hoenicke and Ernst-Rüdiger Olderog. Combining Specification Techniques for Processes, Data and Time. In Integrated Formal Methods, volume 2335 in Lecture Notes in Computer Science, pages 245–266. Springer, May 2002. [doi | pdf]
We present a new combination CSP-OZ-DC of three well researched formal techniques for the specification of processes, data and time: CSP [Hoa85], Object-Z [Smi00], and Duration Calculus [ZHR91]. The emphasis is on a smooth integration of the underlying semantic models and its use for verifying properties of CSP-OZ-DC specifications by a combined application of the model-checkers FDR [Ros94] for CSP and UPPAAL [BLL97] for Timed Automata. This approach is applied to part of a case study on radio controlled railway crossings.
[Hoe01]
Jochen Hoenicke. Specification of Radio Based Railway Crossings with the Combination of CSP, OZ, and DC. In FBT 2001, June 2001. [pdf]
We use a combination of three techniques for the specification of processes, data and time: CSP, Object-Z and Duration Calculus. Whereas the combination of CSP and Object-Z is well established by the work of C. Fischer, the integration with Duration Calculus is new. The combination is used to specify parts of a novel case study on radio controlled railway crossings.

Theses

[Hoe06]
Jochen Hoenicke. Combination of Processes, Data, and Time. PhD thesis, University of Oldenburg, July 2006. [pdf]

In this work, we present a combination of the formal methods CSP, Object-Z and Duration Calculus. Each method can describe certain aspects of a system: CSP can describe behavioural aspects, such as sequential and concurrent behaviour and synchronous communication, Object-Z complex data operations, and Duration Calculus real-time requirements. It is challenging to combine these into a unified language, CSP-OZ-DC, that takes advantage of the individual strengths of the underlying formalisms.

The semantics of CSP-OZ-DC needs to cover the basic entities of the constituent languages: events for CSP, operations and data spaces for Object-Z, and time dependent observables for Duration Calculus. We describe a particular behaviour of the system as a trajectory mapping each point in time to a valuation that provides values for all state variables. Instantaneous events are represented by rising and falling edges of Boolean state variables. The trace semantics of the full system is given as the set of all behaviours that are allowed by the CSP, the Object-Z, and the Duration Calculus part.

To facilitate the use of model checkers, we also provide an operational semantics. To this end, we introduce phase event automata, a new class of timed automata. The components of a combined specification can be translated individually into phase event automata that run in parallel to build the full system. The full system permits those behaviours that are permitted by each component automaton. We prove the soundness of the translation by showing that a behaviour is in the trace semantics of the CSP, the Object-Z, or the DC part if and only if it is accepted by the corresponding automaton.

[Hoe99]
Jochen Hoenicke. Graphische Spezifikationssprachen: Der Zusammenhang zwischen Constraint Diagrams und Real-Time Symbolic Timing Diagrams. Master's thesis, University of Oldenburg, 1999. In German. [pdf]
This thesis considers two graphical specification languages, Real-Time Symbolic Timing Diagrams (RTSTD) and Constraint Diagrams (CD), and relates them to each other. The semantic domains of the two underlying formalisms, TPTL and DC, are put into correspondence. Based on this common semantics, a translation of a subclass of CDs into RTSTDs is given.