
Here we note that pattern-man is used as a model and described in terms of qualities or features. John Henry was originally an Anglican, and his quotation took me back to the elementary school I attended as a child in India. St. Mark's School, run by two Welsh sisters, imparted a love of the English language through tales from Shakespeare, English history, and translations from Celtic, Greek and Norse mythology. The pupils, Christians, Muslims, Hindus like myself, and others, attended Bible class, and on many Sundays we were expected to attend services at the Church of England next door to the school. No doubt there we heard about divine grace and sin and repentance. The quotation also triggered a memory of part of a rock-a-billy song (that's rock plus hillbilly):

"If you want to get to Heaven You've got to raise a little Hell."1

Obviously, natural virtue corresponds to a local minimum, not the desired global state, and we know that to get out of a local minimum you need to raise the temperature every now and then. Clearly, a simulated annealing approach to salvation!
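Since the joke turns on how simulated annealing works, here is a minimal sketch of the idea in Python; the objective function, neighbor move, and cooling schedule below are illustrative assumptions of mine rather than anything from the text. The point is simply that occasionally accepting uphill moves while the temperature is still high is what lets the search climb out of a local minimum.

# Minimal simulated annealing sketch (illustrative assumptions throughout).
import math
import random

def anneal(cost, neighbor, x0, t0=10.0, t_min=1e-3, alpha=0.9, steps_per_t=200):
    """Minimize `cost` by simulated annealing.

    cost        -- maps a state to a real-valued cost
    neighbor    -- returns a randomly perturbed copy of a state
    x0          -- initial state
    t0, t_min   -- starting and stopping temperatures
    alpha       -- geometric cooling factor, 0 < alpha < 1
    steps_per_t -- proposals attempted at each temperature
    """
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            y = neighbor(x)
            fy = cost(y)
            # Downhill moves are always accepted; uphill moves are accepted
            # with probability exp(-(fy - fx) / t), which is what lets the
            # search escape a local minimum while t is still high.
            if fy <= fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha  # cool down
    return best, fbest

if __name__ == "__main__":
    # A wiggly one-dimensional objective with many local minima.
    cost = lambda x: x * x + 10.0 * math.sin(5.0 * x)
    neighbor = lambda x: x + random.gauss(0.0, 0.5)
    x_best, f_best = anneal(cost, neighbor, x0=8.0)
    print(f"best x = {x_best:.3f}, cost = {f_best:.3f}")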

Some sketches from the current pattern recognition scene

Although use of simulated annealing in pattern recognition was a later development, many ideas and methodologies which I encountered when I entered the field of pattern recognition have come back, fortunately in a more robust form, with the resurgence of artificial neural networks, dynamical systems and other "complex" systems. Some in the field have greeted this resurgence with dismay. To them we might well say what Guidillo said to Rocco in Ignazio Silone's novel A Handful of Blackberries:

"I see you are tired son, and disappointed. You have the sadness of one who has traveled far and ends up finding himself where he began. Didn't they teach you at school that the world is round?"

Unlike the critics who are dismayed, I am not sad but happy at this resurgence. Neural Nets, dynamical systems, search, optimization, mean-field annealing, graduated non-convexity optimization, Genetic Algorithms: I like these approaches because not only are they proving useful and interesting, they also use concepts and language I learned years ago, and so I don't have to learn many new tricks to continue playing in this game. To present an overview of "this game" now is a lot harder than it was when I wrote my article "Patterns in Pattern Recognition, 1968-1974" (1974). Here I will just mention a few items that my colleagues and I have found useful or that have struck my fancy.

Artificial neural networks

In 1961-62, my group at GD/E had filed two patents on a Perceptron type network and implemented it in a computer system called Adaptive Pattern Encoder, or APE. As I have written elsewhere, those were the days of catchy names and audacious claims (1972). The lack of a well-defined training algorithm for multilayer networks and the emerging understanding of the relationships between perceptron type training algorithms and statistical classification methods had by 1962, when I joined Philco-Ford, shifted my attention to training procedures derived from parametric and non-parametric statistics (Kanal (1962) and my foreword in Sethi and Jain (1992)). By 1968 hybrid linguistic-statistical models seemed to me to be more appropriate for many pattern recognition problems than purely statistical or purely structural/linguistic models. (Part of our work on this topic was presented in Kanal (1970), Kanal and Chandrasekaran (1972).) Thanks to an invitation from Azriel Rosenfeld, in 1970 I became a Professor of Computer Science at the University of Maryland. The research projects I started there were on interactive pattern analysis and statistical classification using decision trees (1972 and 1977), on generative and descriptive models for characterizing error patterns in communication channels with memory (1971 and 1978), on linguistic models for graphical formula translation (Underwood and Kanal (1973)) and on hybrid linguistic-statistical models for waveform and time-series analysis (Stockman et al. (1973) and (1974)). At Maryland I started teaching courses in Artificial Intelligence (AI) and became interested in applications of search and problem reduction methods of AI to problems in pattern recognition (Kulkarni and Kanal (1976) and (1978), Stockman et al. (1976), Kanal (1979), Stockman and Kanal (1983)). This also led to my interest in understanding


¹ Ozark Mountain Daredevils, “If You Want to Get to Heaven”, A&M Records, 1973.

