Wojciech Szpankowski, Purdue University, W. Lafayette, U.S.A.

Information Theory by Analytic Methods: The Precise Minimax Redundancy

Recent years have seen a resurgence of interest in redundancy rates of lossless and lossy coding. The redundancy rate problem for a class of sources consists in determining by how much the actual code length exceeds (on average or in the worst case) the optimal code length. In the minimax scenario one finds the best code for the worst source (again, either on average or in the worst case). The redundancy rate problem is typical of a situation where second-order asymptotics play a crucial role, since the leading term of the optimal code length is known to be the entropy of the source. This makes it an ideal candidate for "analytic information theory", which applies analytic tools (i.e., complex analysis) to problems of information theory.

In this talk we focus on the precise maximal minimax redundancy: we show how to turn Shtarkov's inequality of 1987 for the minimax redundancy into a precise equality. This leads to a decomposition of the minimax redundancy into a nondecreasing term that depends only on the underlying class of sources and a (possibly oscillating) term that depends on the optimal code. Surprisingly, it turns out that the code minimizing the maximal redundancy (for known sources) is the so-called generalized Shannon code. This should be contrasted with the Huffman code, which minimizes the average redundancy. We illustrate our findings for memoryless sources; in deriving them we use methods of Fourier analysis and the theory of Gleichverteilung mod 1 (i.e., equidistribution of sequences modulo 1). This is joint work with M. Drmota, TU Wien.
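To make the central quantity concrete: Shtarkov's bound relates the maximal minimax redundancy R*_n of a class S of sources to the sum D_n = sum over all x^n of sup_{P in S} P(x^n), with log2(D_n) <= R*_n <= log2(D_n) + 1. The sketch below is not from the talk; it is a minimal numerical illustration, with a hypothetical helper shtarkov_sum, for the class of binary memoryless (Bernoulli) sources, where the supremum is attained at the maximum-likelihood estimate p = k/n. It compares log2(D_n) against its known leading asymptotic term (1/2)log2(pi*n/2).

    from math import comb, log2, pi

    def shtarkov_sum(n: int) -> float:
        """Shtarkov sum D_n for binary memoryless sources (hypothetical helper)."""
        total = 0.0
        for k in range(n + 1):
            # Maximum-likelihood probability of a sequence with exactly k ones;
            # the boundary terms k = 0 and k = n equal 1.
            ml = (k / n) ** k * ((n - k) / n) ** (n - k) if 0 < k < n else 1.0
            total += comb(n, k) * ml  # direct summation; use log-space for very large n
        return total

    for n in (10, 100, 1000):
        exact = log2(shtarkov_sum(n))
        leading = 0.5 * log2(pi * n / 2)  # leading asymptotic term
        print(f"n={n:5d}  log2(D_n) = {exact:.4f}   (1/2)log2(pi*n/2) = {leading:.4f}")

As n grows the two values approach each other, illustrating that only second-order terms, the kind pinned down by the precise analysis in the talk, separate the exact quantity from its leading asymptotics.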

