
Continuum entropies for fluids

David Montgomery, Dartmouth College David.C.Montgomery@dartvax.dartmouth.edu

This paper reviews efforts to define useful entropies for fluids and magnetofluids at the level of their continuous macroscopic fields, such as the fluid velocity or magnetic field, rather than at the molecular or kinetic-theory level. Several years ago, a mean-field approximation was used to calculate a "most probable state" for an assembly of a large number of parallel interacting ideal line vortices. The result was a nonlinear partial differential equation for the "most probable" stream function, the so-called sinh-Poisson equation, which was then explored in a variety of mathematical contexts. Somewhat unexpectedly, this sinh-Poisson relation turned out much more recently to have quantitative predictive power for the long-time evolution of continuous, two-dimensional, Navier-Stokes turbulence at high Reynolds numbers [54]. Our recent effort has been to define an entropy for non-ideal continuous fluids and magnetofluids that makes no reference to microscopic discrete structures or particles of any kind [55], and then to test its utility in numerical solutions of fluid and magnetofluid equations. This talk will review such efforts and suggest additional possible applications. The work is thought to be a direct but tardy extension of the ideas of Boltzmann and Gibbs from "point-particle" statistical mechanics. See also W.H. Matthaeus et al., Physica D 51, 531 (1991), and references therein.
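For orientation, a sketch of the mean-field argument in one common convention (signs and normalizations vary across the literature): each sign of vorticity distributes as a Boltzmann factor in the mean stream function \(\psi\), so that

\[
\omega \;\propto\; e^{-\beta\psi} - e^{+\beta\psi} \;=\; -2\sinh(\beta\psi),
\]

and with the convention \(\omega = \nabla^2\psi\) this yields the sinh-Poisson equation

\[
\nabla^2\psi + \lambda^2 \sinh(\beta\psi) = 0,
\]

where \(\beta\) is an inverse temperature (negative for the coherent, clustered states of interest) and \(\lambda^2\) is an amplitude fixed by the vortex density.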

Statistical mechanics and information theoretic tools for the study of formal neural networks

Jean-Pierre Nadal, ENS Paris nadal@physique.ens.fr

Abstract not available, but see [14, 34, 57, 58, 59].

Shannon Information and Algorithmic and Stochastic Complexities

Jorma Rissanen, IBM Almaden rissanen@almaden.ibm.com

This is an introduction to the formal measures of information or, synonymously, description complexity, introduced over the past 70 or so years, beginning with Hartley. Although all of them are intimately related to ideas of coding theory, I introduce the fundamental Shannon information without appeal to coding. I also discuss the central role stochastic complexity plays in modeling problems, and in the problem of inductive inference in general. For references, see [65, 66].
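As a minimal illustration of the quantities the talk contrasts (my own sketch, not taken from the abstract), the following Python computes the Shannon information of a single outcome and a crude two-part description length for a Bernoulli sequence; the two-part code is only a simple stand-in for stochastic complexity, whose refined forms replace the parameter-cost term with more careful model costs:

import math

def shannon_information(p):
    # Shannon information (surprisal) of an outcome with probability p, in bits.
    return -math.log2(p)

def two_part_code_length(x):
    # Two-part description length for a binary sequence x:
    # bits to state the maximum-likelihood Bernoulli parameter to
    # precision 1/sqrt(n), plus bits to encode x under that parameter.
    n = len(x)
    k = sum(x)
    theta = k / n
    param_bits = 0.5 * math.log2(n)      # ~ (1/2) log2 n bits for the parameter
    data_bits = 0.0
    for p, m in ((theta, k), (1 - theta, n - k)):
        if m:                            # treat 0 * log 0 as 0
            data_bits += -m * math.log2(p)
    return param_bits + data_bits

# Example: a 20-bit sequence with 5 ones
x = [1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0]
print(shannon_information(0.25))   # 2.0 bits for an outcome of probability 1/4
print(two_part_code_length(x))     # total description length in bits (~18.4)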

Information and the Fly's Eye

Dan Ruderman, University of Cambridge dlrl002@cus.cam.ac.uk

The design of compound eyes can be seen as a set of tradeoffs. The most basic of these was discussed by Feynman in his Lectures: smaller facets allow for finer sampling of the world, but suffer greater blurring from diffraction.
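A back-of-the-envelope version of Feynman's argument, as a Python sketch (my illustration; the numerical values are rough assumptions, not from the abstract): the sampling blur of a facet of diameter d on an eye of radius R is about d/R radians, the diffraction blur is about lambda/d, and minimizing their sum gives an optimal facet size d = sqrt(lambda * R):

import math

# Feynman's compound-eye tradeoff (Lectures on Physics, Vol. I):
# sampling blur ~ d/R, diffraction blur ~ lambda/d.
# Minimizing d/R + lambda/d over d gives d_opt = sqrt(lambda * R).

wavelength = 4e-7   # ~400 nm light (assumed value)
eye_radius = 3e-3   # ~3 mm, a rough bee-eye radius (assumed value)

d_opt = math.sqrt(wavelength * eye_radius)
print(f"optimal facet diameter ~ {d_opt * 1e6:.0f} micrometers")
# ~35 micrometers, close to the facet size actually observed in bees.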
