Helsinki University of Technology, Laboratory for Theoretical Computer Science

Formal Methods Forum

Autumn 1999

This is a seminar on formal methods in computer science and engineering, organized by the Laboratory for Theoretical Computer Science at HUT. We usually meet on Fridays at 2 p.m. in room B353 of the Computer Science building.

For further information, please contact Ilkka Niemelä.


Program

  • 10.9.99 Prof. Jia-Huai You (Department of Computing Science University of Alberta, Canada):
    Nonmonotonic Reasoning as Prioritized Argumentation

    Abstract: Argumentation is a common form of reasoning in everyday life and has found significant applications in fields such as legal reasoning. Nonmonotonic reasoning deals with incomplete information. In this talk, we argue that nonmonotonic reasoning in general can be viewed as a form of prioritized argumentation, where "monotonic" inferences are constrained by a simple notion of priority among the rules of arguments. For this purpose, we formalize a knowledge representation language in which a theory consists of a collection of rules of first-order formulas together with a priority relation among these rules. We study the semantics of the language and how prioritized argumentation relates to other forms of nonmonotonic reasoning.
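    As a toy illustration of priority-constrained inference (a sketch only, not the formalism of the talk; all rule and predicate names are hypothetical), consider propositional rules with priorities, where a lower-priority rule is blocked whenever its conclusion conflicts with a literal already derived by a higher-priority rule:

```python
# Hypothetical sketch: a rule is (priority, premises, conclusion); a literal
# is a string, negation written with a leading "~". Rules are tried in
# priority order, and a rule is blocked if the complement of its conclusion
# has already been derived -- a crude stand-in for prioritized argumentation.

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def derive(facts, rules):
    """Apply rules (highest priority first) until a fixpoint is reached."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for _, premises, conclusion in sorted(rules, key=lambda r: -r[0]):
            if all(p in known for p in premises) \
                    and conclusion not in known \
                    and negate(conclusion) not in known:
                known.add(conclusion)
                changed = True
    return known

# Classic example: birds fly by default, but penguins do not.
rules = [
    (2, ["penguin"], "~flies"),   # higher priority: the more specific rule
    (1, ["bird"], "flies"),       # lower priority: the default
]
print(derive(["penguin", "bird"], rules))  # the default is blocked
```

    For a plain bird the default fires; for a penguin the higher-priority rule derives "~flies" first, so the default can no longer apply.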

  • 17.9.99 Heikki Tauriainen: A Randomized Testbench for Algorithms Translating Linear Temporal Logic Formulae into Büchi Automata

    Abstract: Verification of reactive and concurrent systems can be done with model checking using the automata-theoretic approach. The correctness specification is expressed as a linear temporal property, a requirement which must be satisfied by all the system executions. Another way to state the requirement is as an enumeration of all the legal executions of the system, which can be compactly expressed as an automaton obtained from the temporal property using a transformation algorithm. We present a testbench for comparing the correctness of implementations of different algorithms for translating Linear Temporal Logic formulae into Büchi automata. The testbench generates random input for the translation algorithms and checks whether all the implementations produce the same results when run on the same input.
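    The input-generation side of such a testbench can be sketched as follows (a minimal illustration; the operator set, formula representation, and size bound are assumptions, not the tool's actual interface):

```python
import random

# Hypothetical sketch of a testbench's input generator: build random LTL
# formulae over a fixed set of atomic propositions, with a bounded number
# of nodes. Formulae are nested tuples; atoms are plain strings.

ATOMS = ["p", "q", "r"]
UNARY = ["!", "X", "F", "G"]          # not, next, eventually, always
BINARY = ["&", "|", "U"]              # and, or, until

def random_ltl(size, rng):
    """Return a random LTL formula with exactly `size` operator/atom nodes."""
    if size <= 1:
        return rng.choice(ATOMS)
    if size == 2 or rng.random() < 0.5:
        return (rng.choice(UNARY), random_ltl(size - 1, rng))
    left = rng.randint(1, size - 2)
    return (rng.choice(BINARY),
            random_ltl(left, rng),
            random_ltl(size - 1 - left, rng))

def pretty(f):
    if isinstance(f, str):
        return f
    if len(f) == 2:
        return f[0] + " " + pretty(f[1])
    return "(" + pretty(f[1]) + " " + f[0] + " " + pretty(f[2]) + ")"

rng = random.Random(1999)
print(pretty(random_ltl(7, rng)))
```

    In the actual testbench each generated formula would then be fed to every translator under test and the resulting Büchi automata compared on the same input; that comparison step is omitted here.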

  • 24.9.99 Eero Lassila: Structural Analysis of Some Generative String Rewriting Devices
    Part 1: Definitions and examples

    Abstract: A unifying theoretical framework for such generative string rewriting devices as Chomsky grammars, Lindenmayer systems, and macro processors is presented. Within the framework, properties such as progressiveness, confluence, and stableness of a rewriting device are defined. It is suggested that a particular subclass of the newly formulated rewriting devices may be profitably applied to optimizing machine-level code generation.
    (Please note that the presentation will be in Finnish.)
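    One concrete instance of the generative devices surveyed in the talk is a Lindenmayer system, which rewrites every symbol of a word in parallel at each step. A minimal sketch (the rules below are the textbook "algae" example, not the speaker's):

```python
# A minimal parallel string rewriting device in the Lindenmayer (L-system)
# style. Every symbol of the current word is replaced simultaneously; a
# symbol with no rule is copied unchanged.

RULES = {"A": "AB", "B": "A"}

def rewrite(word, rules):
    """One parallel rewriting step."""
    return "".join(rules.get(symbol, symbol) for symbol in word)

def derive(axiom, rules, steps):
    word = axiom
    for _ in range(steps):
        word = rewrite(word, rules)
    return word

# The word lengths follow the Fibonacci numbers: 1, 2, 3, 5, 8, ...
print([derive("A", RULES, n) for n in range(5)])
# → ['A', 'AB', 'ABA', 'ABAAB', 'ABAABABA']
```

    Chomsky grammars differ from this device in applying one rule at a time rather than rewriting all symbols in parallel, which is one of the distinctions a unifying framework has to capture.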

  • 1.10.99 Eero Lassila: Structural Analysis of Some Generative String Rewriting Devices
    Part 2: Basic properties

  • 8.10.99 Timo Latvala: Coping with Strong Fairness - On-the-fly Emptiness Checking of Streett Automata

    Abstract: The model checking approach to verification has proven to be very successful. Stating the required properties in temporal logic allows the process of verification to be automated. When propositional linear temporal logic (LTL) is used as the specification language, Büchi automata are usually the theoretical construction of choice. Büchi automata, however, cannot cope with strong fairness efficiently. Streett automata can easily model strong fairness constraints, but the known model checking algorithms for them are more complex than those for Büchi automata. To circumvent this problem, we present a method which takes advantage of the simpler model checking complexity of Büchi automata when possible, and uses Streett automata when faced with strong fairness constraints. The method performs the model checking on the fly and also includes counterexample generation. We present memory-efficient algorithms for both the emptiness checking of Streett automata and counterexample generation.

  • 29.10.99 Kimmo Varpaaniemi: Stable Models for Stubborn Sets

    Abstract: The stubborn set method is one of the methods that try to relieve the state space explosion problem that occurs in state space generation. Spending some time in looking for "good" stubborn sets can pay off in the total time spent in generating a reduced state space. This paper shows how the method can exploit tools that solve certain problems of logic programs. The restriction of a definition of stubbornness to a given state can be translated into a variable-free logic program. When a stubborn set satisfying additional constraints is wanted, the additional constraints should be translated, too. It is easy to make the translation in such a way that each acceptable stubborn set of the state is represented by at least one stable model of the program, each stable model of the program represents at least one acceptable stubborn set of the state, and for each pair in the representation relation, the number of certain atoms in the stable model is equal to the number of enabled transitions of the represented stubborn set. So, in order to find a stubborn set which is good w.r.t. the number of enabled transitions, it suffices to find a stable model which is good w.r.t. the number of certain atoms.

    Work in progress: for any place/transition net and for any state of the net, let us say that a subset X of transitions is a persistent set of the state iff X is the set of enabled transitions of some stubborn set of the state. The following problems are NP-complete:

    • Given a place/transition net, a state of the net, and a positive integer n, construct a persistent set of size at most n, or show that no such set exists.
    • Given a place/transition net, a state of the net, and a positive integer n, construct a persistent set of size exactly n, or show that no such set exists.
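    To make the stable-model side concrete, here is a brute-force sketch of what a stable-model solver computes (this is only the Gelfond-Lifschitz definition spelled out, not the paper's translation of stubbornness):

```python
from itertools import chain, combinations

# Brute-force stable models of a ground normal logic program: a candidate
# set M is stable iff M equals the least model of the reduct of the program
# w.r.t. M (Gelfond-Lifschitz). A rule is (head, positive_body, negative_body).

def least_model(definite_rules):
    """Least model of a negation-free program, by naive fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in definite_rules:
            if pos <= model and head not in model:
                model.add(head)
                changed = True
    return model

def stable_models(rules):
    atoms = set(chain.from_iterable([h] + list(p) + list(n) for h, p, n in rules))
    for k in range(len(atoms) + 1):
        for candidate in map(set, combinations(sorted(atoms), k)):
            # Reduct: drop rules whose negative body intersects the candidate,
            # then drop the remaining negative bodies entirely.
            reduct = [(h, set(p)) for h, p, n in rules if not (set(n) & candidate)]
            if least_model(reduct) == candidate:
                yield candidate

# A choice between a and b: "a :- not b."  and  "b :- not a."
program = [("a", [], ["b"]), ("b", [], ["a"])]
print(list(stable_models(program)))  # → [{'a'}, {'b'}]
```

    In the translation described in the abstract, each stable model encodes an acceptable stubborn set, so minimizing the count of certain atoms over the stable models minimizes the number of enabled transitions; real stable-model solvers replace this exponential enumeration.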

  • 5.11.99 Glenn Lewis (Univ. of Tasmania): Behavioural Inheritance and Incremental Analysis

    Abstract: Inheritance means one can begin with an abstract representation of an object that is easy to understand and clutter-free, and incrementally change it into a more concrete representation. In other words, inheritance provides support for abstraction, the most common and effective technique for dealing with complexity. The principle of substitutability has been proposed in various forms to state the expectations that an incrementally changed component should meet if it is to be substituted for the original component. One possibility, known as weak substitutability, relates to the compatibility of method parameter and result types; it does not require behavioural compatibility. Another version of the substitutability principle, referred to as strong substitutability, requires behavioural compatibility between the type and the subtype. There are a number of proposals for substitutability in the context of concurrent object-oriented systems, and it is unclear whether these proposals are overly constrained for practical application. In the context of Coloured Petri Nets, I will present a set of three incremental modifications which lie somewhere between weak and strong substitutability. The major obstacle to the practical application of state-based analysis techniques is the state space explosion problem. Current work is investigating the extent to which the incremental structure of a formal model can be used to help alleviate the state space explosion. I will outline the algorithms for this "incremental analysis" as they stand, and indicate the status of the investigation.

  • 19.11.99 Tomi Janhunen: Classifying Semi-Normal Default Logic on the Basis of its Expressive Power

    Abstract: This talk reports on systematic research which aims to classify non-monotonic logics by their expressive power. The classification is based on translation functions that satisfy three important criteria: polynomiality, faithfulness and modularity (PFM for short). The basic method for classification is to prove that PFM translation functions exist (or do not exist) between certain logics. As a result, non-monotonic logics can be arranged to form a hierarchy. The talk gives an overview of the current expressive power hierarchy (EPH). In addition, semi-normal default logic (SNDL) as well as prerequisite-free and semi-normal (PSNDL) default logic are taken into consideration and their exact positions in EPH are determined.

  • 10.12.99 Fabio Massacci (Dip. Ingegneria dell'Informazione - Univ. di Siena): Logical Cryptanalysis: Encoding and Analysis, from DES to RSA

    Abstract: Cryptographic algorithms play a key role in computer security and the formal analysis of their robustness is of utmost importance. Yet, CAD and automated reasoning tools are seldom used in the analysis of a cipher, and thus one cannot often get the desired formal assurance that the cipher is free from unwanted properties that may weaken its strength.
    In this talk, I'll show that one can feasibly encode the low-level properties of state-of-the-art cryptographic algorithms as SAT problems and then use efficient automated theorem proving (ATP) systems and SAT-solvers for reasoning about them. I call this approach *logical cryptanalysis*.
    In this framework, for instance, finding a model for a formula encoding an algorithm, is equivalent to finding a key with a cryptanalytic attack. Other important properties, such as cipher integrity or algebraic closure, can also be captured.
    Here I present a case study on the U.S. Data Encryption Standard (DES) and on the Rivest-Shamir-Adleman signature scheme (RSA). I'll show how to obtain a manageable encoding of their features and how even the simple process of formalization uncovers deep differences between what has been proved at CRYPTO conferences and the properties one would actually like.
    I have also tested a number of SAT tools (BDDs by Bryant, TABLEAU by Crawford & Auton, SATO by Zhang, look-back CSP by Bayardo & Schrag, WalkSAT by Selman & Kautz, Lagrangian methods by Wah et al.) on the encoding of DES, and I'll discuss the reasons behind their different performance.
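    The flavour of such an encoding can be shown on a toy cipher (nothing like full DES): bitwise XOR with a 4-bit key, so c_i = p_i XOR k_i. Given one known plaintext/ciphertext pair, the key bits become propositional variables and the cipher relation becomes CNF clauses; any satisfying assignment *is* a recovered key. The exhaustive solver below is a hypothetical stand-in for an efficient SAT solver:

```python
from itertools import product

# Logical cryptanalysis in miniature: encode key recovery for a 4-bit XOR
# cipher as CNF. A clause is a list of signed literals (variable, sign);
# with both plaintext and ciphertext known, each key bit is forced by a
# unit clause. All names here are illustrative.

def xor_clauses(var, p_bit, c_bit):
    # Clauses forcing key variable `var` to equal p_bit XOR c_bit.
    return [[(var, (p_bit ^ c_bit) == 1)]]

def encode(plaintext, ciphertext):
    cnf = []
    for i, (p, c) in enumerate(zip(plaintext, ciphertext)):
        cnf += xor_clauses(i, p, c)
    return cnf

def solve(cnf, n_vars):
    """Exhaustive stand-in for a SAT solver: try every assignment."""
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[v] == sign for v, sign in clause) for clause in cnf):
            return list(bits)
    return None

plaintext, ciphertext = [1, 0, 1, 1], [0, 0, 1, 0]
print(solve(encode(plaintext, ciphertext), 4))
# → [True, False, False, True]
```

    For XOR the formula collapses to unit clauses and the "attack" is trivial; the point of the real encoding is that the same recipe applied to the round structure of DES yields a hard but well-defined SAT instance.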


Latest update: 09 January 2005.