Information theory provides a satisfactory framework for understanding stationary information sources. Originally created to analyze communication between electronic devices, it has found numerous applications in almost all branches of science.
Applying this theory requires the existence of a fixed language, independent of the information being shared. This makes the theory unsuitable for addressing fundamental questions in evolutionary biology, contemporary music cognition, and many other disciplines. To the best of our knowledge, there exists no theory that can account for evolving information sources and hence explain the dynamics of information.
There exists a deep link between information theory, which deals with stationary information sources, and equilibrium statistical mechanics. Reasoning by analogy, we believe that non-equilibrium statistical mechanics holds the seeds of a theory that could explain the dynamics of information. The absence of the latter may be related to the lack of a clear and general theory of non-equilibrium phenomena.
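The link in question can be made concrete at the level of the entropy functionals (a standard correspondence, sketched here for orientation rather than taken from the talk itself): the Shannon entropy of a source emitting symbols with probabilities $p_i$ and the Gibbs entropy of an equilibrium ensemble over microstates with probabilities $p_i$ share the same form,

\[
H = -\sum_i p_i \log_2 p_i
\qquad \longleftrightarrow \qquad
S = -k_B \sum_i p_i \ln p_i ,
\]

differing only in the logarithm base and the factor of Boltzmann's constant $k_B$. Maximizing either functional under linear constraints yields, respectively, the maximum-entropy source distribution and the canonical (Boltzmann) distribution, which is one precise sense in which stationary information sources mirror equilibrium ensembles.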
After reviewing the fundamental concepts of information theory, the talk will present the limitations of the existing theory and explore the relationship between information dynamics and statistical physics.