# Non-equilibrium biology

John Collier, University of Newcastle, Australia pljdc@linga.newcastle.edu.au

Abstract not available, but see [20, 21, 27, 28, 78].

# Chaos and detection

Andy Fraser, Portland State University andy@sysc.pdx.edu

The low likelihood of linear models fit to chaotic signals and the ubiquity of strange attractors in nature suggest that nonlinear modeling techniques can improve performance for some detection problems. We review likelihood ratio detectors and the limitations on the performance of linear models implied by the broad Fourier power spectra of chaotic signals. We observe that the KS entropy of a chaotic system establishes an upper limit on the expected log likelihood that any model can attain. We apply variants of the hidden Markov models used in speech research to a synthetic detection problem, and we obtain performance that surpasses the theoretical limits for linear models. KS entropy estimates suggest that still better performance is possible. For references, see [31, 32, 33].
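The likelihood ratio detectors reviewed above can be illustrated with a minimal sketch. The Gaussian variance-discrimination setup, the particular variances, and the function names below are illustrative assumptions, not details from the abstract: the detector computes the log likelihood of the data under each hypothesis and compares the difference to a threshold.

```python
import numpy as np

def log_likelihood_gaussian(x, var):
    """Log likelihood of i.i.d. zero-mean Gaussian samples with variance var."""
    return -0.5 * np.sum(x**2 / var + np.log(2 * np.pi * var))

def lr_detect(x, var0=1.0, var1=2.0, threshold=0.0):
    """Likelihood-ratio test: decide H1 when log Lambda exceeds the threshold."""
    log_lambda = log_likelihood_gaussian(x, var1) - log_likelihood_gaussian(x, var0)
    return bool(log_lambda > threshold)

# Illustrative data: H0 is unit-variance noise, H1 has variance 2.
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 1000)            # drawn under H0
signal = rng.normal(0.0, np.sqrt(2.0), 1000)  # drawn under H1
```

With 1000 samples the expected log likelihood ratio is far from zero under either hypothesis, so the test separates the two cases reliably; replacing the Gaussian likelihoods with those of a nonlinear (e.g. hidden Markov) model is the step the abstract is concerned with.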

# Practical Entropy Computation in Complex Systems

Neil Gershenfeld, MIT Media Lab neilg@media.mit.edu

I will discuss the interplay between entropy production in complex systems and the estimation of entropy from observed signals. Entropy measurement in lag spaces provides a very general way to determine a system's degrees of freedom, geometrical complexity, predictability, and stochasticity, but it is notoriously difficult to do reliably. I will consider the role of regularized density estimation and sorting on adaptive trees in determining entropy from measurements, and then look at applications in nonlinear instrumentation and the optimization of information processing systems. For references, see [33, 79].
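Entropy measurement in a lag space can be sketched as follows. The delay embedding, the fixed histogram bin count, and the naive plug-in estimator here are illustrative assumptions, standing in for the regularized density estimation and adaptive-tree sorting the abstract actually refers to:

```python
import numpy as np

def lag_vectors(x, dim, tau=1):
    """Embed a scalar time series in a lag (delay) space of dimension dim."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def entropy_estimate(points, bins=8):
    """Plug-in Shannon entropy (in bits) from a coarse histogram of the points."""
    hist, _ = np.histogramdd(points, bins=bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]  # 0 log 0 = 0 by convention
    return float(-np.sum(p * np.log2(p)))

# White noise fills the lag space almost uniformly, while a deterministic
# signal traces a thin curve, so its histogram entropy is much lower.
rng = np.random.default_rng(1)
noise = rng.uniform(0.0, 1.0, 5000)
sine = 0.5 + 0.5 * np.sin(0.1 * np.arange(5000))
```

Comparing `entropy_estimate(lag_vectors(noise, 2))` with the same quantity for the sine shows the gap that makes lag-space entropy a probe of stochasticity versus determinism; the "notoriously difficult" part is doing this reliably in higher dimensions, where naive histograms run out of data.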


# Origin and growth of order in the expanding universe

David Layzer, Harvard layzer@da.harvard.edu

I define order as potential statistical entropy: the amount by which the entropy of a statistical description falls short of its maximum value subject to appropriate constraints. I argue that, as a consequence of a strong version of the postulate of spatial uniformity and isotropy, the universe contains an irreducible quantity of specific statistical entropy; and I postulate that in the initial state the specific statistical entropy was equal to its maximum value: the universe expanded from an initial state of zero order. Chemical order, which shows itself most conspic-