Information Theory by Analytic Methods: Redundancy Rate Problem

Recent years have seen a resurgence of interest in redundancy rates of lossless coding. The redundancy rate problem for a class of sources consists in determining by how much the actual code length exceeds the optimal (ideal) code length. In the minimax scenario one finds the maximal redundancy over all sources within a certain class, while in the average scenario one computes the average redundancy over all possible source sequences. The redundancy rate problem is typical of a situation where second-order asymptotics play a crucial role, since the leading term of the optimal code length is known to be the entropy of the source. This problem is an ideal candidate for "analytic information theory", which applies analytic tools (i.e., complex analysis) to information theory. In this talk, we survey our recent results on the redundancy rate problem obtained by analytic methods. In particular, we present our findings on the redundancy of Shannon-Fano codes, Huffman codes, and Lempel-Ziv codes, and on the minimax redundancy for memoryless sources and renewal processes. We shall briefly discuss some of the analytic tools used to establish these results, such as Fourier analysis, generating functions, singularity analysis, the Mellin transform, the saddle point method, and analytic depoissonization. To illustrate these techniques in some depth, we will present one result in detail, namely the minimax redundancy for renewal processes (joint work with P. Flajolet).
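As a concrete illustration of the average redundancy notion discussed above (this sketch is not part of the talk): for a memoryless source with symbol probabilities p_i, the average redundancy of a prefix code with codeword lengths l_i is the average code length sum p_i*l_i minus the source entropy. The snippet below builds Huffman codeword lengths for a small hypothetical distribution and computes this gap.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Return Huffman codeword lengths for the given symbol probabilities."""
    # Heap entries: (merged probability, tie-breaker, member symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside
        # either subtree gains one bit of codeword depth.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

# A hypothetical dyadic distribution, chosen so the redundancy is exactly 0.
probs = [0.5, 0.25, 0.125, 0.125]
L = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, L))           # average code length
entropy = -sum(p * log2(p) for p in probs)               # source entropy
redundancy = avg_len - entropy                           # average redundancy
```

For dyadic probabilities the Huffman lengths equal -log2(p_i) and the redundancy vanishes; for general distributions it is strictly positive, and characterizing its precise (often oscillatory) behavior is exactly the kind of second-order question the analytic methods in the talk address.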