Helsinki University of Technology, Laboratory for Theoretical Computer Science

Theoretical Computer Science Forum

Autumn 2007

This is a forum for talks on theoretical computer science organized by the TCS Laboratory at HUT. Unless stated otherwise, the talks are held on Fridays at 2 p.m. in Room B353 in the Computer Science building. Any exceptions to this rule are indicated below.

The talks given at the TCS Forum aim at presenting recent advances in theoretical computer science that can be expected to attract an interested audience also beyond the research groups of the laboratory. The Friday afternoon time slot is occasionally also used for presentations of ongoing work and results from students' projects at what is called the TCS Lounge. The main goal of the TCS Lounge is to improve the transfer of knowledge and ideas between the different research groups of the TCS laboratory.

TCS Forum Ends

The last talk at the TCS Forum will take place on the 25th of January, 2008. After that, the activities of the TCS Forum will be continued by the new Information and Computer Science Forum.

For further information, please contact Kaisa Nyberg.


Program

Forthcoming presentations

  • 25.1.2008 FORUM Long Nguyen, Computing Laboratory, Oxford University, UK:

    Authentication Protocols Based on Human Interaction in Secure Pervasive Computing

    Abstract: A big challenge in pervasive computing is to establish secure communication over the Dolev-Yao network without a PKI. An approach studied by researchers is to build security through human work that creates a low-bandwidth authentication channel (physical contact, human conversation) over which the transmitted information is authentic and cannot be faked. In this talk, I give a brief survey of authentication protocols of this type and then concentrate on our contributions to this area: our proposed group protocols and a new cryptographic primitive, termed a Digest function, which uniformly digests large information into a short string.

    I start with one-way authentic channel schemes, for example MANA I [Gehrmann-Mitchell-Nyberg 2004], demonstrate that it is not optimal in terms of human work, and then present our improved versions. I then move on to analyse two strategies, Indirect-binding [Vaudenay 2005, Pasini-Vaudenay 2006, Cagalj-Capkun-Hubaux 2005, Wong-Stajano 2005-2007] and Direct-binding [Laur-Nyberg 2006, Valkonen-Asokan-Nyberg 2006, Hoepman 2005, Nguyen-Roscoe 2006-2007], used to construct interactive pair-wise and group protocols. Whilst both strategies can optimise the human work relative to the amount of security obtained, we point out that the Indirect-binding approach might be more expensive in computation cost. These protocols are based on the human comparison of a short authenticated string that is either the digest of the authenticated information (Direct-binding) or the exclusive-or of short random nonces (Indirect-binding); a toy sketch contrasting the two ways of forming the short string appears at the end of this entry.

    This is based on joint work with Prof. Bill Roscoe.

    More information about this work, which has appeared in Information and Computation and in Proc. of FCS-ARSPA 2006, is available here.
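
    The protocols themselves involve commitments and Diffie-Hellman-style exchanges; the toy Python sketch below only illustrates the final step common to both families, namely how the short authenticated string (SAS) that the humans compare can be formed. The truncated HMAC-SHA-256 used as the digest, the SAS length, and all names are illustrative assumptions, not the constructions from the papers.

        # Toy sketch: forming the short authenticated string (SAS) that humans
        # compare, in the two styles mentioned above.  NOT the talk's protocols.
        import hashlib
        import hmac
        import secrets

        SAS_BITS = 16  # hypothetical human-comparable length (~5 decimal digits)

        def digest(key: bytes, info: bytes, bits: int = SAS_BITS) -> int:
            """Stand-in for a digest function: truncated HMAC-SHA-256."""
            mac = hmac.new(key, info, hashlib.sha256).digest()
            return int.from_bytes(mac, "big") >> (256 - bits)

        def sas_direct(shared_key: bytes, info_a: bytes, info_b: bytes) -> int:
            """Direct binding: the SAS is a digest of the authenticated data."""
            return digest(shared_key, info_a + info_b)

        def sas_indirect(nonce_a: int, nonce_b: int) -> int:
            """Indirect binding: the SAS is the XOR of short random nonces that
            were bound to the data earlier via commitments (omitted here)."""
            return nonce_a ^ nonce_b

        if __name__ == "__main__":
            info_a, info_b = b"A's public data", b"B's public data"
            key = secrets.token_bytes(16)
            print("direct-binding SAS:  ", sas_direct(key, info_a, info_b))
            r_a, r_b = secrets.randbits(SAS_BITS), secrets.randbits(SAS_BITS)
            print("indirect-binding SAS:", sas_indirect(r_a, r_b))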


Past presentations

  • 14.12.2007 FORUM Andreas Feldmann, RWTH Aachen University, Germany:

    Computing Approximate Equilibria in Network Congestion Games

    Abstract: In recent years, there has been an increased interest in using game theory to understand selfish routing in large networks like the Internet. We consider the special case of network congestion games, in which the resources are the edges of a graph and every player wants to allocate a path between her source and target node. The delay of an edge increases with the number of players allocating it, and every player is interested in allocating a routing path with minimum delay (a toy instance is sketched at the end of this entry). The problem of interest is that of computing epsilon-approximate Nash equilibria in these games. The general problem is known to be PLS-complete for every polynomial-time computable epsilon, but the reductions are based on artificial and steep delay functions with the property that even two players using the same resource already cause a delay significantly larger than the delay for a single player.

    We consider network congestion games with delay functions such as polynomials, exponential functions, and functions from queuing theory. We analyse which approximation guarantees can be achieved for such congestion games by the method of randomised rounding. Our results show that the success of this method hinges on different criteria depending on the class of functions considered. For example, queuing-theoretical functions admit good approximations if the equilibrium load of every resource is bounded away appropriately from its capacity.

    For the paper, see here.
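
    For concreteness, here is a toy Python instance of the model described in the abstract: a three-edge network, two players routing from s to t, linear delay functions, and a test for an epsilon-approximate Nash equilibrium in the multiplicative sense (no player can shrink her delay by more than a factor 1 + epsilon). The instance, the delay functions, and the multiplicative convention are illustrative assumptions, not taken from the paper.

        # Toy network congestion game: players choose s-t paths, an edge's delay
        # grows with the number of players using it, and we test whether a
        # strategy profile is an epsilon-approximate Nash equilibrium.
        from itertools import product

        # Hypothetical instance: linear delay functions d_e(x) for load x.
        delay_fns = {
            ("s", "u"): lambda x: 2 * x,
            ("u", "t"): lambda x: 2 * x,
            ("s", "t"): lambda x: 3 * x + 1,
        }
        strategies = [[("s", "u"), ("u", "t")], [("s", "t")]]  # two s-t paths
        players = [0, 1]                                       # both route s -> t

        def path_delay(path, profile):
            """Delay of `path` given the paths chosen by all players."""
            load = {e: sum(e in p for p in profile) for e in delay_fns}
            return sum(delay_fns[e](load[e]) for e in path)

        def is_eps_equilibrium(profile, eps):
            """True if no player can improve by more than a factor 1 + eps."""
            for i in players:
                current = path_delay(profile[i], profile)
                for alt in strategies:
                    deviated = list(profile)
                    deviated[i] = alt
                    if path_delay(alt, deviated) * (1 + eps) < current:
                        return False
            return True

        if __name__ == "__main__":
            for profile in product(strategies, repeat=len(players)):
                names = ["upper" if len(p) == 2 else "direct" for p in profile]
                print(names, "-> 0.1-approximate equilibrium:",
                      is_eps_equilibrium(list(profile), 0.1))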

  • 9.11.2007 FORUM Jorma Jormakka, Professor, National Defence University, Helsinki, Finland:

    On the existence of polynomial-time algorithms for the Merkle-Hellman knapsack problem

    Abstract: In this talk we present new polynomial-time algorithms for some restricted cases of the knapsack problem used in the Merkle-Hellman knapsack cryptosystem. We will also identify a case of this NP-hard problem for which polynomial-time algorithms may not exist. Indeed, we conjecture that this is the case and give strong arguments to support the conjecture. (As background, a sketch of the classical easy, superincreasing case underlying the cryptosystem follows this entry.)
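
    As background only (this is not one of the restricted cases from the talk): the knapsack, i.e. subset-sum, instance hidden inside Merkle-Hellman is superincreasing and therefore solvable by a linear-time greedy scan, which is exactly the trapdoor the cryptosystem disguises with modular arithmetic. A minimal Python sketch with made-up weights:

        # Background sketch: the easy knapsack instance at the heart of the
        # Merkle-Hellman cryptosystem.  If the weights are superincreasing
        # (each exceeds the sum of all previous ones), subset sum is greedy.

        def solve_superincreasing(weights, target):
            """Return the 0/1 selection achieving the subset sum, or None."""
            selection = [0] * len(weights)
            for i in reversed(range(len(weights))):
                if weights[i] <= target:
                    selection[i] = 1
                    target -= weights[i]
            return selection if target == 0 else None

        if __name__ == "__main__":
            w = [2, 3, 7, 14, 30, 57, 120, 251]       # superincreasing private key
            bits = [1, 0, 1, 1, 0, 0, 1, 1]           # plaintext bits
            c = sum(wi * b for wi, b in zip(w, bits)) # undisguised "ciphertext"
            assert solve_superincreasing(w, c) == bits
            print("recovered bits:", solve_superincreasing(w, c))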

  • 31.8.2007 FORUM Alexander Hartmann, Professor Dr., Institut für Physik, Universität Oldenburg:

    Doing better Theoretical Physics by using modern Computer Science optimization algorithms

    Abstract: Many important problems in Theoretical Physics can be mapped onto problems from Theoretical Computer Science, in particular optimization problems. Using recently developed sophisticated optimization algorithms, such problems can in several cases be studied numerically much better than with traditional physics simulation approaches.

    Here, two examples from statistical physics/solid state physics will be discussed. We show that the behavior of two types of disordered magnets can be understood much better by using mappings to optimization problems. First, so-called random-field Ising systems can be mapped onto networks and studied using max-flow algorithms (a minimal sketch of this mapping follows at the end of this entry). Second, the low-temperature behavior of two-dimensional Ising spin glasses can be investigated using minimum-weight perfect matching algorithms.

    The focus of the talk will be on how the physics problems are mapped onto optimization problems and on algorithms for calculating different quantities of interest. Only a few physics results will be given for illustration.
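
    As a rough illustration of the first mapping (not necessarily the formulation used in the talk): for a random-field Ising model with ferromagnetic couplings J > 0, the ground state corresponds to a minimum s-t cut in an auxiliary network, with each cut bond costing 2J, a positive field h_i paying 2*h_i when spin i ends up on the sink (-1) side, and a negative field paying 2*|h_i| on the source (+1) side. The Python sketch below uses networkx on a small square lattice with arbitrary parameters.

        # Sketch: RFIM ground state via a minimum s-t cut (networkx).
        import random
        import networkx as nx

        def rfim_ground_state(bonds, fields, J=1.0):
            G = nx.DiGraph()
            G.add_node("s")
            G.add_node("t")
            for i, j in bonds:                  # ferromagnetic bond: 2J per cut bond
                G.add_edge(i, j, capacity=2 * J)
                G.add_edge(j, i, capacity=2 * J)
            for i, h in fields.items():         # random field couples to source/sink
                if h > 0:
                    G.add_edge("s", i, capacity=2 * h)
                elif h < 0:
                    G.add_edge(i, "t", capacity=-2 * h)
            _, (source_side, _) = nx.minimum_cut(G, "s", "t")
            return {i: (+1 if i in source_side else -1) for i in fields}

        def energy(spins, bonds, fields, J=1.0):
            """H = -J sum_<ij> s_i s_j - sum_i h_i s_i"""
            return (-J * sum(spins[i] * spins[j] for i, j in bonds)
                    - sum(h * spins[i] for i, h in fields.items()))

        if __name__ == "__main__":
            L = 4                               # small open square lattice
            sites = [(x, y) for x in range(L) for y in range(L)]
            bonds = ([((x, y), (x + 1, y)) for x in range(L - 1) for y in range(L)]
                     + [((x, y), (x, y + 1)) for x in range(L) for y in range(L - 1)])
            random.seed(0)
            fields = {s: random.uniform(-2.0, 2.0) for s in sites}
            spins = rfim_ground_state(bonds, fields)
            print("ground-state energy:", energy(spins, bonds, fields))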

  • 17.8.2007 FORUM Romualdo Pastor-Satorras, D.Sc.(Tech.), Professor, Departament de Física i Enginyeria Nuclear, Universitat Politècnica de Catalunya:

    Epidemic modeling of computer viruses

    Abstract: Computer viruses have a long history, dating from the 1980s and before, and become newly and sadly famous after each new attack, which can eventually cause losses worth millions of dollars in computer equipment and downtime. Their ever-increasing threat has therefore stimulated a growing interest in the scientific community and the economic world, translated in the latter case into the antivirus software business, which moves millions of dollars worldwide every year. Computer virus studies have been carried out for a long time, based mainly on the analogy with biological epidemiology. In particular, most studies have focused on the statistical epidemiology approach, aimed at modeling and forecasting the global incidence and evolution of computer virus epidemics in the computer world. Puzzlingly enough, however, the observed behaviour of computer viruses in the wild exhibits peculiar characteristics that are difficult to explain within the usual epidemic spreading framework. In particular, digital viruses appear to easily reach long-lasting, almost endemic, steady states, corresponding to generally very long lifetimes, irrespective of the viral strain.

    In this talk we will show that the key to understanding these peculiar features is the particular background on which computer virus activity takes place. While classical epidemiological models consider viruses propagating on the vertices of regular lattices or random networks with a rather homogeneous degree distribution, computer viruses dwell in a digital world (the physical Internet, the WWW, the e-mail network) which, due to its scale-free nature, has intrinsically large degree fluctuations. The introduction of this new element yields a theoretical scenario in which all viruses, irrespective of their virulence, have a chance to pervade the whole system, allowing one to rationalize the empirical data from computer virus epidemics (a small illustration of the underlying threshold argument follows this entry).
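
    The threshold argument behind this can be illustrated in a few lines: in heterogeneous mean-field theory the SIS epidemic threshold scales as <k>/<k^2>, so the large degree fluctuations of a scale-free network push it towards zero, whereas a homogeneous random network with the same average degree keeps a finite threshold. The Python sketch below (networkx, arbitrary sizes and parameters) only compares this mean-field estimate on the two kinds of graphs; it is not a full epidemic simulation.

        # Compare the heterogeneous mean-field SIS threshold <k>/<k^2> on a
        # scale-free graph and on a homogeneous random graph of equal mean degree.
        import networkx as nx

        def mean_field_threshold(G):
            degrees = [d for _, d in G.degree()]
            k1 = sum(degrees) / len(degrees)                 # <k>
            k2 = sum(d * d for d in degrees) / len(degrees)  # <k^2>
            return k1 / k2

        if __name__ == "__main__":
            n, m = 10000, 3
            scale_free = nx.barabasi_albert_graph(n, m, seed=0)      # heavy-tailed
            homogeneous = nx.gnp_random_graph(n, 2 * m / n, seed=0)  # Erdos-Renyi
            print("scale-free threshold  ~", round(mean_field_threshold(scale_free), 4))
            print("homogeneous threshold ~", round(mean_field_threshold(homogeneous), 4))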

Latest update: 25 January 2008. Kaisa Nyberg