Archive

Existence and Non-existence of Universal Graphs

A talk by Larry Moss, Indiana University, Department of Mathematics.
January 28, 2013, 4 pm, LH 101


Abstract:
The random graph R is the graph with infinitely many vertices in which every pair of vertices is connected with probability 1/2, independently. This graph R was discovered by R. Rado in 1964. It has many interesting properties, one of which is that it is universal: every countable graph is a subgraph of R. This graph is also connected with finite random graphs, and with model theory.

I’ll be talking about a set of related results going back to the 1990s, having to do with other notions of universality. Perhaps the most quotable one is that there is a countable graph U with the property that every countable graph is an isometric subgraph of U. There are also intriguing open problems in finite graph theory coming from this subject.
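
As a rough illustration of the construction (mine, not part of the talk), the Python sketch below samples the finite analogue G(n, 1/2) by the same rule, joining each pair of vertices independently with probability 1/2, and then tests the extension property that characterizes R: for any disjoint finite sets of vertices U and V there is a vertex adjacent to everything in U and to nothing in V. The parameter choices (n = 100, sets of size at most 1) are arbitrary.

    import itertools
    import random

    def random_graph(n, p=0.5, seed=None):
        """Finite analogue of R: join each pair of vertices independently with probability p."""
        rng = random.Random(seed)
        adj = {v: set() for v in range(n)}
        for u, v in itertools.combinations(range(n), 2):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
        return adj

    def has_extension_property(adj, k):
        """For all disjoint sets U, V with |U|, |V| <= k, some vertex outside U and V
        is adjacent to every vertex of U and to no vertex of V.  The Rado graph R
        satisfies this for every k; G(n, 1/2) satisfies it for small k with high
        probability once n is large enough."""
        vertices = list(adj)
        for i in range(k + 1):
            for j in range(k + 1):
                for U in itertools.combinations(vertices, i):
                    for V in itertools.combinations(vertices, j):
                        if set(U) & set(V):
                            continue
                        if not any(set(U) <= adj[w] and not (set(V) & adj[w])
                                   for w in vertices if w not in U and w not in V):
                            return False
        return True

    if __name__ == "__main__":
        g = random_graph(100, seed=1)
        print(has_extension_property(g, 1))  # True with overwhelming probability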

On detection methods and analysis of malware

A talk by Jean-Yves Marion, University of Lorraine, LORIA, Nancy, France.
October 29, 2012, 4 pm, LH 101


Abstract:
This talk will present different research directions in malware analysis and detection. First, we will give a brief overview of detection techniques and of malware defenses. We will focus mainly on the analysis of cryptographic implementations, which are important for malware analysis because they form an integral part both of the malware payload and of the unpacking code that decrypts this payload. We present a tool that identifies cryptographic functions in obfuscated programs by retrieving loops and their I/O parameters in an implementation-independent fashion and comparing them with those of known cryptographic functions. This work was presented at CCS this year. Then, we will present malware behavioral detection based on model checking, from work presented at ESORICS '12.

Foundational Analyses of Computation

A talk by Yuri Gurevich, Microsoft Research, Redmond
October 22, 2012, 4 pm, LH 101


Abstract:

How can one possibly analyze computation in general? The task seems
daunting if not impossible. There are too many different kinds of
computation, and the notion of general computation seems too amorphous. As
in quicksand, one needs a rescue point, a fulcrum. In computation analysis,
a fulcrum is a particular viewpoint on computation that clarifies and
simplifies things to the point that analysis becomes possible.
We review from that point of view the few foundational analyses of
general computation in the literature: Turing’s analysis of human
computations, Gandy’s analysis of mechanical computations, Kolmogorov’s
analysis of bit-level computation, and our own analysis of computation at
an arbitrary level of abstraction.

Pseudorandom Generators from One-way Functions, Part II

A talk by Hani T. Dawoud, IUB, SoIC
October 8, 2012, 4 pm, LH 101


Abstract:

A pseudorandom generator (PRG) is a fundamental primitive in cryptography and in other areas of computer science. From the perspective of cryptography, the usefulness of PRGs is demonstrated by the fact that they can be used to construct very powerful cryptographic primitives, such as private-key cryptography, bit-commitment schemes, and zero-knowledge proofs for NP. From the perspective of feasible computation, PRGs are used to reduce the number of random bits needed by any probabilistic polynomial-time algorithm, and thus to perform a deterministic simulation of any such algorithm. Intuitively, a PRG is a polynomial-time computable function G that stretches a short random string s into a longer string G(s) that looks random to any efficient distinguisher. In these talks we consider cryptographic pseudorandom generators.

In the first talk, we will introduce the seminal result of Hastad, Impagliazzo, Levin, and Luby (HILL) [SICOMP '99] that the existence of a one-way function implies the existence of a PRG. We will show constructions and security proofs, and we will introduce the key notion in this result, namely pseudoentropy.

In the second talk, we will introduce recent work by Haitner, Reingold, and Vadhan [STOC '10] and by Vadhan and Zheng [STOC '12] on simplifying and improving the efficiency of PRG constructions. We will also introduce the new notion of next-bit pseudoentropy and show how it circumvents the complications in the HILL construction.
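
As a purely illustrative companion to the definition above (my sketch, not part of the talks), the Python fragment below shows the PRG interface, a length-doubling function G from n = 128 seed bits to 2n output bits, and what a distinguisher's advantage measures. The stretching function is a heuristic stand-in built from SHA-256, not the HILL construction from a one-way function, and the sample distinguisher is deliberately trivial, so its estimated advantage is near zero.

    import hashlib
    import os
    import secrets

    SEED_BYTES = 16   # n = 128 seed bits
    OUT_BYTES = 32    # stretch to 2n = 256 output bits

    def G(seed: bytes) -> bytes:
        """Toy length-doubling 'PRG': expands a 16-byte seed to 32 bytes by hashing
        seed || counter.  This is only a heuristic stand-in, not the HILL
        construction from a one-way function."""
        assert len(seed) == SEED_BYTES
        return b"".join(hashlib.sha256(seed + bytes([i])).digest()[:16] for i in range(2))

    def first_bit_distinguisher(x: bytes) -> int:
        # A (useless) distinguisher: output the most significant bit of the input.
        return x[0] >> 7

    def estimate_advantage(D, trials=20000):
        """Estimate |Pr[D(G(U_n)) = 1] - Pr[D(U_2n) = 1]| by sampling."""
        hits_prg = sum(D(G(secrets.token_bytes(SEED_BYTES))) for _ in range(trials))
        hits_uni = sum(D(os.urandom(OUT_BYTES)) for _ in range(trials))
        return abs(hits_prg - hits_uni) / trials

    if __name__ == "__main__":
        print(estimate_advantage(first_bit_distinguisher))  # close to 0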

Pseudorandom Generators from One-way Functions, Part I

A talk by Hani T. Dawoud, IUB, SoIC
October 1, 2012, 4 pm, LH 101


Abstract:

A pseudorandom generator (PRG) is a fundamental primitive in cryptography and in other areas of computer science. From the perspective of cryptography, the usefulness of PRGs is demonstrated by the fact that they can be used to construct very powerful cryptographic primitives, such as private-key cryptography, bit-commitment schemes, and zero-knowledge proofs for NP. From the perspective of feasible computation, PRGs are used to reduce the number of random bits needed by any probabilistic polynomial-time algorithm, and thus to perform a deterministic simulation of any such algorithm. Intuitively, a PRG is a polynomial-time computable function G that stretches a short random string s into a longer string G(s) that looks random to any efficient distinguisher. In these talks we consider cryptographic pseudorandom generators.

In the first talk, we will introduce the seminal result of Hastad, Impagliazzo, Levin, and Luby (HILL) [SICOMP '99] that the existence of a one-way function implies the existence of a PRG. We will show constructions and security proofs, and we will introduce the key notion in this result, namely pseudoentropy.

In the second talk, we will introduce recent work by Haitner, Reingold, and Vadhan [STOC '10] and by Vadhan and Zheng [STOC '12] on simplifying and improving the efficiency of PRG constructions. We will also introduce the new notion of next-bit pseudoentropy and show how it circumvents the complications in the HILL construction.

Turing Symposium

A centenary celebration

Organized by the Theory Group and IU Program in Pure and Applied Logic

May 8, 2012, 8:30-5:30 pm, LH 102

See here for more details and the program.

Optimal Acceleration-Bounded Trajectory Planning in Dynamic Environments Along a Specified Path (AKA: a solution to the acceleration-bounded Frogger problem)

A talk by Jeff Johnson, IUB, SoIC
April 16, 2012, 5:30 pm, LH 101


Abstract:

Vehicles that cross lanes of traffic encounter the problem of navigating around dynamic obstacles under actuation constraints. This paper presents an exact, polynomial-time planner for optimal bounded-acceleration trajectories along a fixed, given path with dynamic obstacles. The planner constructs reachable sets in the path-velocity-time (PVT) space by propagating reachable velocity sets between obstacle tangent points in the path-time (PT) space. The terminal velocities attainable by endpoint-constrained trajectories in the same homotopic class are proven to span a convex interval, so the planner merges contributions from individual homotopic classes to find the exact range of reachable velocities and times at the goal. A reachability analysis proves that the running time is polynomial under reasonable assumptions, and empirical tests demonstrate that the planner scales well in practice and can handle hundreds of dynamic obstacles in a fraction of a second on a standard PC.

An Introduction to the Ideas of Fully Homomorphic Encryption, Part II
A talk by Steven Myers, IUB, SoIC
April 9, 2012, 5:30 pm, LH 101

An Introduction to the Ideas of Fully Homomorphic Encryption, Part II
A talk by Steven Myers, IUB, SoIC
March 26, 2012, 5:30 pm, LH 101
CANCELLED

An Introduction to the Ideas of Fully Homomorphic Encryption
A talk by Steven Myers, IUB, SoIC
March 19, 2012, 5:30 pm, LH 101


Abstract:

Since the advent of public-key cryptography, one of its holy grails has been the development of fully homomorphic encryption. This is a system in which there is a natural ring homomorphism between the ciphertext space and the underlying message space: given ciphertexts c_1, c_2 of messages m_1 and m_2, the decryptions of c_1 + c_2 and c_1 * c_2 correspond to m_1 + m_2 and m_1 * m_2, respectively.
Like the holy grail, fully homomorphic encryption was widely thought to be impossible to attain; most cryptographers felt that a group homomorphism, which allows only one of the two operations mentioned above, would be the best that could be achieved. In 2009, Craig Gentry overturned this orthodox belief and introduced a fully homomorphic encryption scheme based on a reasonable lattice assumption. The construction introduced a beautiful and insightful new technique called bootstrapping, which makes it possible to manage the noise in ciphertexts and is the key contribution on the way to fully homomorphic encryption. We will introduce the concept of fully homomorphic encryption, show constructions of the schemes, introduce the concept of bootstrapping, and, time permitting, hint at the proofs of security.
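
To make the homomorphic property concrete, here is a toy Python sketch in the spirit of the later "somewhat homomorphic encryption over the integers" of van Dijk, Gentry, Halevi, and Vaikuntanathan, not Gentry's lattice-based scheme, and with tiny parameters that are completely insecure. Sums and products of ciphertexts decrypt to XOR and AND of the plaintext bits as long as the accumulated noise stays below p/2; it is exactly this noise growth that bootstrapping is designed to control.

    import random

    def keygen(key_bits=32):
        # Secret key: a random odd integer p.  Real parameters would be far larger.
        return random.getrandbits(key_bits) | 1 | (1 << (key_bits - 1))

    def encrypt(p, m, noise_bits=4, q_bits=16):
        """Encrypt a bit m as c = m + 2*r + p*q with small noise r and random q."""
        r = random.getrandbits(noise_bits)
        q = random.getrandbits(q_bits) + 1
        return m + 2 * r + p * q

    def decrypt(p, c):
        """Recover the bit: reduce mod p into (-p/2, p/2], then take the parity."""
        reduced = c % p
        if reduced > p // 2:
            reduced -= p
        return reduced % 2

    p = keygen()
    c1, c2 = encrypt(p, 1), encrypt(p, 0)

    # Homomorphic evaluation: ciphertext + and * decrypt to plaintext XOR and AND,
    # provided the accumulated noise stays below p/2.
    print(decrypt(p, c1 + c2))  # 1 xor 0 = 1
    print(decrypt(p, c1 * c2))  # 1 and 0 = 0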

Bounds on Exponential Sums and Boolean Circuit Complexity
A talk by Karsten Chipeniuk, IUB, Math
March 5, 2012, 5:30 pm, LH 101


Abstract:

We will present arguments due to J. Bourgain and F. Green, A. Roy, and H. Straubing used to prove bounds for exponential sums which closely approximate the correlation between certain boolean functions. We will also discuss applications to boolean circuit
complexity theory.

On the Foundations for Nash’s Equilibrium
A talk by Robert A. Becker, IUB, Economics
February 27, 2012, 5:30 pm, LH 101


Abstract:

Nash's well-known theorem on the existence of equilibrium points assumes a particular behavioral representation of players' payoff functions: expected utility theory applies. This behavioral foundation is open to challenges, such as the Allais Paradox. Game theorists have been restrained in how far they have widened their existence theory for Nash equilibrium points to encompass newer behavioral foundations.

The purpose of my talk is to discuss some of the issues that arise with alternative behavioral foundations for demonstrating an equilibrium existence theorem. The talk will bring in some of my work with Subir Chakrabarti, as well as draw on papers published by others.
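
As a small aside of my own (not the speaker's), the sketch below spells out why the Allais Paradox is a challenge to expected utility: with the standard Allais lotteries, every utility function gives the same sign to both comparisons, so the commonly observed pattern of choices cannot be rationalized by any expected utility maximizer.

    import random

    # Standard Allais lotteries over outcomes (in millions): 0, 1, 5.
    # Choice 1:  A1 = $1M for sure          vs  B1 = 10% $5M, 89% $1M, 1% $0
    # Choice 2:  A2 = 11% $1M, 89% $0       vs  B2 = 10% $5M, 90% $0
    A1 = {1: 1.00}
    B1 = {5: 0.10, 1: 0.89, 0: 0.01}
    A2 = {1: 0.11, 0: 0.89}
    B2 = {5: 0.10, 0: 0.90}

    def expected_utility(lottery, u):
        return sum(p * u[x] for x, p in lottery.items())

    # Sample many increasing utility functions and compare the two choices.
    for _ in range(10000):
        vals = sorted(random.random() for _ in range(3))
        u = {0: vals[0], 1: vals[1], 5: vals[2]}
        prefers_A1 = expected_utility(A1, u) > expected_utility(B1, u)
        prefers_A2 = expected_utility(A2, u) > expected_utility(B2, u)
        # Under expected utility, EU(A1) - EU(B1) = EU(A2) - EU(B2) = 0.11 u(1) - 0.10 u(5) - 0.01 u(0),
        # so the common "A1 and B2" pattern is never rationalized.
        assert prefers_A1 == prefers_A2
    print("no expected-utility function exhibits the Allais pattern")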

Automatic Sequences and Zip-Specifications, Part II
A talk by Larry Moss, IUB, Math
February 20, 2012, 5:30 pm, LH 101

Automatic Sequences and Zip-Specifications
A talk by Larry Moss, IUB, Math
January 30, 2012, 5:30 pm, LH 101


Abstract:

We consider infinite sequences of symbols, also known as streams, and the decidability question for equality of streams defined in a restricted format. This restricted format consists of prefixing a symbol at the head of a stream, of the stream function 'zip', and of recursion variables. Here 'zip' interleaves the elements of two streams in alternating order, starting with the first stream. For example, the Thue-Morse sequence (0, 1, 1, 0, 1, 0, 0, 1, ...) is obtained by the 'zip-specification' {M = 0 : X, X = 1 : zip(X, Y), Y = 0 : zip(Y, X)}. Our analysis of such systems employs both term rewriting and coalgebraic techniques. We establish decidability for these zip-specifications, employing bisimilarity of observation graphs based on a suitably chosen cobasis.

The importance of zip-specifications resides in their intimate connection with automatic sequences. We establish a new and simple characterization of automatic sequences. Thus we obtain for the binary zip that a stream is 2-automatic iff its observation graph using the cobasis (hd, even, odd) is finite. The generalization to zip-k specifications and their relation to k-automaticity is straightforward. In fact, zip-specifications can be perceived as a term rewriting syntax for automatic sequences.

Our study of zip-specifications is placed in an even wider perspective by employing the observation graphs in a dynamic logic setting, leading to an alternative characterization of automatic sequences. We further obtain a natural extension of the class of automatic sequences, given by 'zip-mix' specifications that use zips of different arities in one specification. We also show that equivalence is undecidable for a simple extension of the zip-mix format with projections like even and odd. However, it remains open whether zip-mix specifications themselves have a decidable equivalence problem.
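
As a quick sanity check of the example above (my illustration in Python, not the talk's term rewriting or coalgebraic machinery), the zip-specification can be evaluated by reading zip(A, B) as the stream whose element at position 2k is A(k) and at position 2k+1 is B(k); the resulting stream M agrees with the usual bit-counting definition of the Thue-Morse sequence.

    from functools import lru_cache

    # The zip-specification {M = 0 : X, X = 1 : zip(X, Y), Y = 0 : zip(Y, X)},
    # read as stream equations: zip(A, B) has A(k) at index 2k and B(k) at index 2k+1.

    @lru_cache(maxsize=None)
    def X(n):
        if n == 0:
            return 1
        i = n - 1                      # index into zip(X, Y)
        return X(i // 2) if i % 2 == 0 else Y(i // 2)

    @lru_cache(maxsize=None)
    def Y(n):
        if n == 0:
            return 0
        i = n - 1                      # index into zip(Y, X)
        return Y(i // 2) if i % 2 == 0 else X(i // 2)

    def M(n):
        return 0 if n == 0 else X(n - 1)

    def thue_morse(n):
        # Standard characterization: parity of the number of 1 bits of n.
        return bin(n).count("1") % 2

    assert [M(n) for n in range(64)] == [thue_morse(n) for n in range(64)]
    print([M(n) for n in range(16)])   # 0, 1, 1, 0, 1, 0, 0, 1, ...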

The work reported in this talk originated with a question that I have been asking at IU and other places over the years, whether the equality relation for zip-specifications is indeed decidable. The question was taken up in earnest by my collaborators on this project, Clemens Grabmayer (Utrecht), Joerg Endrullis, Dimitri Hendriks, and Jan Willem Klop (all of the Free University, Amsterdam).

I’ll explain many of the notions from coalgebra slowly, and so this will take two sessions.

Deterministic Annealing
A talk by Geoffrey Fox, IUB, SoIC
January 23, 2012, 5:30 pm, LH 101


Abstract:

We discuss the general theory behind deterministic annealing and illustrate it with applications to mixture models (including GTM and PLSA), clustering, and dimension reduction. We cover cases where the analyzed space has a metric and cases where it does not. We discuss the many open issues and possible further work on methods that appear to outperform the standard approaches but are not used in practice.
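
For concreteness, here is a minimal Python/NumPy sketch of deterministic annealing applied to clustering, roughly in the spirit of Rose's formulation cited below: assignments are softened by a temperature T that is lowered gradually, so the solution anneals from a single effective cluster toward k clusters. This is my own simplified sketch, not the speaker's implementations, and it omits the phase-transition analysis as well as the mixture-model and dimension-reduction variants.

    import numpy as np

    def deterministic_annealing_clustering(X, k, T0=10.0, Tmin=1e-3, cooling=0.9, iters=25):
        """Minimal sketch of deterministic annealing for clustering: soft assignments
        p(j|x) proportional to exp(-||x - c_j||^2 / T), with T lowered gradually."""
        rng = np.random.default_rng(0)
        centers = X.mean(axis=0) + 1e-3 * rng.standard_normal((k, X.shape[1]))
        T = T0
        while T > Tmin:
            for _ in range(iters):
                d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # squared distances
                logits = -d2 / T
                logits -= logits.max(axis=1, keepdims=True)                    # numerical stability
                p = np.exp(logits)
                p /= p.sum(axis=1, keepdims=True)                              # soft assignments p(j|x)
                centers = (p.T @ X) / p.sum(axis=0)[:, None]                    # weighted centroid update
            T *= cooling
        return centers, p

    # Example: three Gaussian blobs in the plane.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc, 0.3, size=(100, 2)) for loc in ([0, 0], [3, 0], [0, 3])])
    centers, _ = deterministic_annealing_clustering(X, k=3)
    print(np.round(centers, 2))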

References

  1. Ken Rose, "Deterministic Annealing for Clustering, Compression, Classification, Regression, and Related Optimization Problems", Proceedings of the IEEE, 86:2210–2239, 1998.

  2. This paper references earlier work, including Rose's 1990 Caltech Electrical Engineering PhD thesis and papers such as Rose, K., Gurewitz, E., and Fox, G. C., "Statistical mechanics and phase transitions in clustering", Physical Review Letters, 65(8):945-948, August 1990.

Later important work

  1. T. Hofmann and J. M. Buhmann, "Pairwise data clustering by deterministic annealing", IEEE Transactions on Pattern Analysis and Machine Intelligence, 19:1-13, 1997.
  2. Hansjörg Klock and Joachim M. Buhmann, "Data visualization by multidimensional scaling: a deterministic annealing approach", Pattern Recognition, Volume 33, Issue 4, April 2000, Pages 651-669.
  3. R. Frühwirth and W. Waltenberger, "Redescending M-estimators and Deterministic Annealing, with Applications to Robust Regression and Tail Index Estimation", Austrian Journal of Statistics, 37(3&4):301-317, 2008. http://www.stat.tugraz.at/AJS/ausg083+4/08306Fruehwirth.pdf

Recent algorithm work by Seung-Hee Bae and Jong Youl Choi (Indiana CS PhDs)

  1. http://grids.ucs.indiana.edu/ptliupages/publications/CetraroWriteupJune11-09.pdf
  2. http://grids.ucs.indiana.edu/ptliupages/publications/hpdc2010_submission_57.pdf
  3. http://grids.ucs.indiana.edu/ptliupages/publications/pdac24g-fox.pdf

Introduction to additive combinatorics, Part II
A talk by Nets H. Katz, IUB, Math
November 28, 2011, 5:30 pm, LH 101


Abstract:

We discuss some basics about the structure of sets which, while not
additively closed, are close to being so in an appropriate sense.
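
As a toy illustration of what "close to additively closed" can mean (my example, not the speaker's), the doubling ratio |A + A| / |A| computed below is small for highly structured sets such as arithmetic progressions and large for random sets; much of additive combinatorics concerns the structure of sets whose doubling is small.

    import random

    def doubling_ratio(A):
        """|A + A| / |A|: small values indicate A is close to additively closed."""
        A = set(A)
        sumset = {a + b for a in A for b in A}
        return len(sumset) / len(A)

    ap = range(0, 1000, 7)                      # arithmetic progression: very structured
    rand = random.sample(range(10**6), 143)     # random set of the same size

    print(doubling_ratio(ap))    # about 2: |A + A| = 2|A| - 1 for an arithmetic progression
    print(doubling_ratio(rand))  # large: the sumset of a sparse random set is nearly as big as possible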

Introduction to additive combinatorics
A talk by Nets H. Katz, IUB, Math
November 14, 2011, 5:30 pm, LH 101


Abstract:

We discuss some basics about the structure of sets which, while not
additively closed, are close to being so in an appropriate sense.

Reasoning about conditional independence
A talk by Dirk Van Gucht, IUB
November 7, 2011, 5:30 pm, LH 101


Abstract:

Conditional independence is a crucial notion in the development of probabilistic systems, which are successfully employed in a variety of computing areas. Numerous real-world systems can be modeled by a probability distribution over a set of random variables. Unfortunately, reasoning over the full joint probability distribution is intractable for all but the smallest cases. It is the very notion of conditional independence that facilitates the decomposition (factorization) of joint probability distributions into smaller parts. This in turn provides leverage to do more efficient probabilistic inference and learning.

In this talk, we will discuss the Conditional Independence Implication Problem, which is the problem of deciding whether a conditional independence statement follows from a given set of conditional independence statements. (For example, such a set of statements could be derived from probabilistic graphical models.) Surprisingly, it is not yet known whether this problem is decidable; it has been open since 1980. We are therefore left with developing heuristics that are sufficient to detect valid and/or invalid instances of this implication problem.

A remarkably useful tool in this regard is the study of the CI implication problem relative to the class of supermodular functions. This class of functions contains the class of probability distributions. Supermodular functions, which are the discrete analogue of convex functions, occur in many domains, including combinatorial optimization, databases, game theory, and reasoning about uncertainty. The theory of supermodular functions is rich, and we will see how it can be used to reason about probabilistic conditional independence.
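
To fix ideas (this sketch is mine and only illustrates what a single conditional independence statement asserts, not the implication problem itself): X is conditionally independent of Y given Z in a joint distribution p exactly when p(x, y, z) * p(z) = p(x, z) * p(y, z) for all values, which the Python function below checks directly on a toy distribution.

    from itertools import product

    def conditionally_independent(p, X, Y, Z, tol=1e-9):
        """Check X independent of Y given Z in the joint distribution p.
        p maps full outcome tuples to probabilities; X, Y, Z are disjoint tuples of
        variable positions.  Tests p(x, y, z) * p(z) == p(x, z) * p(y, z) everywhere."""
        def margin(positions):
            out = {}
            for outcome, prob in p.items():
                key = tuple(outcome[i] for i in positions)
                out[key] = out.get(key, 0.0) + prob
            return out

        p_xyz, p_xz, p_yz, p_z = margin(X + Y + Z), margin(X + Z), margin(Y + Z), margin(Z)
        x_vals = {k[:len(X)] for k in p_xz}
        y_vals = {k[:len(Y)] for k in p_yz}
        for x, y, z in product(x_vals, y_vals, p_z):
            lhs = p_xyz.get(x + y + z, 0.0) * p_z[z]
            rhs = p_xz.get(x + z, 0.0) * p_yz.get(y + z, 0.0)
            if abs(lhs - rhs) > tol:
                return False
        return True

    # Example: Z is a fair coin and X, Y are independent noisy readings of Z,
    # so X and Y are dependent marginally but independent given Z.
    p = {(x, y, z): 0.5 * (0.9 if x == z else 0.1) * (0.9 if y == z else 0.1)
         for x in (0, 1) for y in (0, 1) for z in (0, 1)}

    print(conditionally_independent(p, X=(0,), Y=(1,), Z=(2,)))  # True
    print(conditionally_independent(p, X=(0,), Y=(1,), Z=()))    # False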

Mechanizing Metatheorems in Twelf
A talk by Zachary Sparks, IUB
October 24, 2011, 5:30 pm, LH 101


Abstract:

According to the Twelf wiki (http://twelf.org), “Twelf is a language used to specify, implement, and prove properties of deductive systems such as programming languages and logics.” In practice, Twelf is a logic programming language that is able to prove theorems about the deductive systems formalized inside of it. While not technically as powerful as languages such as Coq, it is sufficiently powerful to prove metatheorems about deductive systems (for example, the proof of safety of Standard ML has been formalized in Twelf) in a clean and intuitive way. We will go over the basics of encoding a deductive system in Twelf, and (time permitting) formalize a proof of type preservation of the simply-typed lambda calculus.

The purpose of this talk is to provide an overview of Twelf, demonstrate some simple but characteristic examples of how Twelf is used, and hopefully convince others to try Twelf for their metatheorem-mechanization needs. The goal of this talk is not to provide a deep understanding of the inner workings of Twelf, nor is it to give any complicated demonstrations of
how Twelf is used. There are examples of these things on the wiki, as well as a few other resources developed by people involved with the Twelf project; hopefully this talk will provide enough background to make the wiki’s examples more approachable.

Background knowledge of functional programming, logic programming, and type theory is helpful, but not assumed. Ideally, nothing more than a mathematical background and an open mind should be required. There will be several examples drawn from functional programming, though, and some basics of logic programming (such as mode checking) may be glossed over in the interest of time.

An Introduction to Information Theory: Static Information
A talk by Alex Gates, IUB
October 17, 2011, 5:30 pm, LH 101


Abstract:

Since its formalization by Claude Shannon, Information Theory has become a powerful tool for the analysis of a broad range of systems, with applications to neurobiology, cryptography, data compression, genetics, machine learning, and many other areas. Here we present an introduction to the basic quantities of information: entropy, joint entropy, mutual entropy, conditional entropy, and partial entropy. These measures will provide a
foundation for the concepts of maximal entropy distributions and Kolmogorov Complexity. Finally, a brief comparison to some concepts in statistical physics will be made.

Throughout this introduction we will assume a familiarity with both discrete and continuous probability theory at an undergraduate level. Examples will be drawn from many areas and will cater to all backgrounds.
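
As a small worked example of my own (with an arbitrary toy distribution), the basic quantities can be computed directly from a joint distribution: the conditional entropy follows from the chain rule H(X,Y) = H(X) + H(Y|X), and the mutual information from I(X;Y) = H(X) + H(Y) - H(X,Y).

    from math import log2

    # Toy joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}.
    p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

    def entropy(dist):
        """Shannon entropy of a distribution given as {outcome: probability}."""
        return -sum(p * log2(p) for p in dist.values() if p > 0)

    def marginal(dist, axis):
        out = {}
        for outcome, p in dist.items():
            out[outcome[axis]] = out.get(outcome[axis], 0.0) + p
        return out

    p_x = marginal(p_xy, 0)
    p_y = marginal(p_xy, 1)

    H_X  = entropy(p_x)
    H_Y  = entropy(p_y)
    H_XY = entropy(p_xy)

    H_Y_given_X = H_XY - H_X        # chain rule: H(X,Y) = H(X) + H(Y|X)
    I_XY = H_X + H_Y - H_XY         # mutual information I(X;Y)

    print(round(H_X, 4), round(H_Y_given_X, 4), round(I_XY, 4))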

References

[1] J. Jacod and P. Protter, Probability Essentials. Springer, 2nd ed., 2004.

[2] T. M. Cover and J. A. Thomas, Elements of Information Theory 2nd Edition. Wiley Series in Telecommunications and Signal Processing, Wiley-Interscience, 2nd ed., 2006.

[3] J. P. Sethna, Statistical mechanics: entropy, order parameters, and complexity. Oxford University Press, 2006.

On the decidability of implicational ticket entailment
A talk by Katalin Bimbo, University of Alberta
October 10, 2011, 5:30 pm, LH 101


Abstract:

The logic of ticket entailment was formulated by A. R. Anderson in 1960. The implicational part of this logic (T->) is especially interesting, because its axioms — according to W. Ackermann — are the axioms for the pure theory of strong implication. Various fragments of relevance logics have been shown either decidable or undecidable by now. The last major remaining open decidability problem in the area was the decidability of T->.

In this talk, I will present some sequent calculi for T-> and closely related relevance logics. In particular, it is possible to obtain a sequent calculus for R->, the logic of relevant implication, by adding restricted structural rules to a sequent calculus for T-> with t. It is well known that R-> is decidable (see Kripke 1959). Then I will show how the decidability of R-> and the somewhat unusual sequent calculus can be put to work together to obtain a proof of the decidability of T->.

Parts of this talk are based on the following two papers:

KB and J. M. Dunn, New consecution calculi for R->t, (18 pp., submitted for publication)

KB and J. M. Dunn, On the decidability of implicational ticket entailment, (in preparation)

The research, some results of which are presented in this talk, is supported by my Standard Research Grant (#410-2010-0207) awarded by the Social Sciences and Humanities Research Council (SSHRC) of Canada.

An Introduction to Interactive Proofs, Arguments and Zero-Knowledge
A talk by Steve Myers, IUB
October 3, 2011, 5:30 pm, LH 101


Abstract:

I will introduce the concept of interactive proofs and their computational analogue, arguments. I will then introduce the notion of zero-knowledge in both its information-theoretic and computational frameworks. Time permitting, I will introduce the notions of non-interactive zero-knowledge, concurrent zero-knowledge, and non-malleable zero-knowledge.

Knowledge and Uncertainty
A talk by Rohit Parikh, Brooklyn College of CUNY and CUNY Graduate Center
September 26, 2011, 5:30 pm, LH 101


Abstract:

Sometimes we have to choose between two actions, but each has a number of possible consequences, depending on circumstances which we do not fully know. Even if we have preferences between individual pairs of consequences, we might not know how to choose among sets. Expected utility is a common way to resolve this issue, but cardinal utilities and subjective probabilities, which we need to define the expected utility, may not always be available.

Such a problem arises when we consider the case where the result of an election may be a set of individuals (perhaps a slate) and even if we have clear preferences among individual candidates we still need to convert them into preferences between slates. The problem also arises in a game theoretic situation where a player does not know what moves have been made and is choosing under uncertainty. So an extension from individual choices to choices between sets is needed.
What are reasonable conditions that such an extension should satisfy? Cagil Tasdemir, Andreas Witzel, and I extend work by several people, considering agents with different temperaments who may extend their preferences to sets in different ways.

This brings us to our second issue. Does knowledge always improve our chances, or can it make them worse? Leaving out our own irrationality as a possible explanation (we might not want to know where we hid that pack of cigarettes), Kamien, Tauman, and Zamir give examples where a player in a two-person game can be worse off by knowing something. We apply this work to our own considerations in a situation where someone has some knowledge and can furnish it to or withhold it from players actively participating in a game. We provide a model for a game-theoretic situation which includes such a player, give examples, and prove one or two results.

A similar problem arises when a candidate is campaigning. Everything she says will affect the views of the voters towards her. Given that she has not expressed views on all topics of interest, voters will somehow average over her possible positions, some optimistically, some pessimistically, and some using an expected utility. Given how the voters are, what should she say to improve her chances? We express the preferences of the voters as truth assignments over a propositional language expressing significant issues, and the candidate's expressed views as a theory over these issues. These results are from joint work with Walter Dean.

Aspects of category theory
A talk by Esfan Haghverdi, IUB
September 19, 2011, 5:30 pm, LH 101


Abstract:

I will talk about some basic notions of category theory, starting from the absolute beginnings. Thus no prior exposure to category theory is required, although it will, of course, enhance understanding of the material. Time permitting, I will talk about more recent notions such as the categorical trace.