Friday, September 22, 2006 - 2:00pm

Wojciech Szpankowski

Purdue University

Location

University of Pennsylvania

Skirkanich Hall, Berger Aud.

Skirkanich Hall is located at 210 S. 33rd St.

Analytic information theory aims to study problems of information theory using analytic algorithmics and combinatorics. Following Hadamard and Knuth's precept, we tackle these problems by complex analysis methods such as generating functions, the Mellin transform, Fourier series, the saddle point method, analytic Poissonization and de-Poissonization, and singularity analysis. This pursuit lies at the crossroads of computer science and information theory.

In this talk, we concentrate on one facet of information theory, namely the redundancy rate problem of source coding (better known as data compression). The redundancy rate problem for a class of sources is the determination of how far the actual code length exceeds the optimal (ideal) code length. We examine the worst-case (individual) minimax redundancy for memoryless, Markov, and renewal sources. Precise analysis of redundancy for such sources leads us to tree generating functions (arising in counting labeled rooted trees), integer partitions, enumeration of Eulerian paths in multigraphs, and counting binary trees with a given path.

In the last part of this talk we reflect on *information* in its generality, and muse on some problems at the interface of computer science and information theory. In conclusion, we describe a few pertinent challenges that bridge Shannon information with Boltzmann's entropy, Maxwell's demon, Landauer's principle, Bennett's reversible computations, and also the timeliness and value of information.
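The central quantity of the talk, the worst-case (individual) minimax redundancy, admits a standard formulation; the notation below is a conventional sketch and is not taken from the abstract itself. For a class of sources $\mathcal{S}$ and block length $n$,

\[
R_n^*(\mathcal{S}) \;=\; \min_{C_n}\; \max_{P \in \mathcal{S}}\; \max_{x_1^n} \Bigl[\, L(C_n, x_1^n) \;+\; \log P(x_1^n) \,\Bigr],
\]

where $L(C_n, x_1^n)$ is the length in bits that code $C_n$ assigns to the sequence $x_1^n$, and $-\log P(x_1^n)$ is the ideal (self-information) code length under source $P$. The redundancy thus measures, for the best code against the worst source and worst sequence, the excess of the actual code length over the ideal one.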