Report on the workshop

Statistical mechanics and information theory have been linked ever since the latter subject emerged out of the engineering demands of telecommunications. Claude Shannon's original paper on "A mathematical theory of communication", [71], makes clear the analogy between information entropy and the "H" of Boltzmann's H-theorem. Earlier work of Szilard in attempting to exorcise Maxwell's demon gave a glimpse of the significance of information to physics, [7]. Since that time, the two fields have developed separately for the most part, although both have interacted fruitfully with the field of statistical inference and both have influenced the development of other fields such as neural networks and learning theory. In recent years, this apartheid has been breached by a number of tantalising observations: for instance, of connections between spin-glass models and error-correcting codes, [73].
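
For orientation, a standard textbook comparison (in notation chosen here, not taken from the report): Shannon's entropy of a discrete source and the H-function of Boltzmann's H-theorem differ essentially by a sign and a choice of measure,
\[
H_{\mathrm{Shannon}}(p) \;=\; -\sum_{i} p_i \log p_i ,
\qquad
H_{\mathrm{Boltzmann}}(f) \;=\; \int f(v)\,\log f(v)\,\mathrm{d}v ,
\]
so that, up to Boltzmann's constant, the thermodynamic entropy corresponds to \(-H_{\mathrm{Boltzmann}}\), and the H-theorem's statement \(\mathrm{d}H/\mathrm{d}t \le 0\) mirrors the growth of entropy.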

To explore these developments, a workshop on "Statistical Mechanics and Information Theory" was held at Hewlett-Packard's Basic Research Institute in the Mathematical Sciences (BRIMS) in Bristol, England. The workshop was organised by Jeremy Gunawardena of BRIMS. The purpose was to bring together physicists, information theorists, statisticians and scientists in several application areas, to try to demarcate the common ground and to identify the common problems of the two fields. The style of the workshop was deliberately kept informal, with time set aside for discussions and facilities provided to enable participants to work together.

Related conferences which have been held recently include the series on "Maximum Entropy and Bayesian Methods" (see, for instance, [41]), the NATO Advanced Study Institute "From Statistical Physics to Statistical Inference and Back" [34], and the IEEE Workshop on "Information Theory and Statistics".

The workshop revealed that the phrase "information" conveyed very different messages to different people. To some, it meant the rigorous study of communication sources and channels represented by the kind of articles that appear in the IEEE Transactions on Information Theory. To others, particularly the physicists influenced by Jaynes' work, [67], it meant a methodological approach to statistical mechanics based on the maximum entropy principle. To others, it was a less rigorous but, nevertheless, stimulating circle of ideas with broad applications to biology and social science. To others, particularly the mathematicians or mathematical physicists, it represented a source of powerful mathematical theorems linking rigorous results in statistical mechanics with probability theory and large deviations. To others still, it was the starting point for a "theory of information", with broad applications outside science. The abstracts of the talks, which are collected together in the next section, and the workshop bibliography, which appears after that, give some idea of the range of material that was discussed at the workshop.
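
As a reminder of the maximum entropy viewpoint mentioned above (a standard sketch of Jaynes' prescription, not material from the workshop itself): one selects the distribution that maximises the Shannon entropy subject to whatever macroscopic constraints are known, for example
\[
\max_{p}\; -\sum_i p_i \log p_i
\quad\text{subject to}\quad
\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle ,
\]
whose solution is the Gibbs distribution
\[
p_i \;=\; \frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}} ,
\]
with the Lagrange multiplier \(\beta\) fixed by the energy constraint.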

Whether the workshop succeeded in "demarcating the common ground" is a moot point. The mathematical insights, particularly the lectures of John Lewis, Anders Martin-Löf and Sergio Verdu, certainly confirmed the existence of a common ground between information theory, probability theory and statistical mechanics, which in many respects is still not properly explored. But to single out this aspect reflects the peculiar prejudices of the organiser. Perhaps the best that can be said is that the workshop revealed to many of the participants the amazing fruitfulness of the information concept and brought home vividly the importance of developing an encompassing "theory of information", upon which biologists, physicists, communication
