May 23, 2008
JOINT HIPERCOM PROJECT. ATTENTION... IT IS A FRIDAY!!!
14h00: What is information? Wojciech Szpankowski, Department of Computer Science, Purdue University, USA.
Information permeates every corner of our lives and shapes our universe.
Understanding and harnessing information holds the potential for significant
advances. The breadth and depth of the underlying concepts of the science of
information transcend the traditional disciplinary boundaries of scientific and
commercial endeavors. Information manifests itself in various forms:
business information is measured in dollars; chemical information is
contained in the shapes of molecules; biological information, stored and
processed in our cells, prolongs life.
So what is information? In this talk we first attempt to identify the most
important features of information and define it in the broadest possible
sense. We subsequently turn to the notion and theory of information
introduced by Claude Shannon in 1948 that served as the backbone for digital
communication. We go on to bridge Shannon information with Boltzmann's
entropy, Maxwell's demon, Landauer's principle, and Bennett's reversible
computations. We point out, however, that while Shannon created a
successful and beautiful theory of information for communication, widespread
application of information theory to economics, biology, the life sciences,
and complex networks still seems to await us. We shall discuss some
examples that have recently cropped up in biology, chemistry, computer science, and
quantum physics. We conclude with a list of challenges for future research.
We hope to put forward some educated questions, rather than answers, about the
issues and tools that lie before researchers interested in information.
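As background for the quantities the abstract bridges, the standard textbook relations can be sketched roughly as follows (an illustrative summary, not taken from the talk itself):

    % Shannon entropy of a discrete source X, in bits per symbol
    H(X) = -\sum_{x} p(x) \log_2 p(x)

    % Boltzmann entropy of a macrostate with W microstates (k_B the Boltzmann constant)
    S = k_B \ln W

    % Landauer's principle: erasing one bit at temperature T dissipates at least
    E \ge k_B T \ln 2

The bridge the abstract alludes to is that Shannon's H and Boltzmann's S share the same logarithmic form, and Landauer's bound ties the erasure of information to a minimum physical energy cost, which underlies the resolution of Maxwell's demon.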
Contact Information: Virginie Collette