

To Breazeal, the answer is unequivocally no. She writes, "Kismet is not conscious, so it does not have feelings. . . ." That Kismet is not conscious (at least not yet) is [Breazeal's] philosophical position. Rodney Brooks agrees. "It is all very well," he writes,

for a robot to simulate having emotions . . . it is fairly easy to accept that [roboticists] have included models of emotions . . . some of today's robots and toys appear to have emotions. However, I would think most people would say that our robots do not really have emotions. (Italics in the original.)

Brooks's viewpoint is echoed by other researchers, such as Rosalind Picard of the MIT Media Lab. Picard is a pioneer in affective computing; as she defines it, "computing that relates to, arises from, or deliberately influences emotions." She believes that "Computers do not have feelings in the way that people do . . . computers simply aren't built the way we are to have that kind of an experience."

Coming from robotics and computer experts, these comments about the lack of internal emotional states suggest that intrapersonal intelligence does not exist among artificial beings. But there are reasons to think that at least a low-level intrapersonal component is achievable, and that such an advance would represent an opening wedge for self-awareness, as we shall see later. Whether real or not, however, some familiarity with emotions can be more than a frill for artificial beings. If a being can smile at you, and recognize your own smile, your interaction is likely to go more smoothly than without these human attributes. This is also true for human-computer interactions: Picard notes that the appearance of emotions could, for instance, enhance computerized tutoring.

For artificial beings that move in the world, something akin to emotion can even be a survival factor. In his book Robot: Mere Machine to Transcendent Mind, Hans Moravec argues that advanced robots operating in the real world would need to deal with contingencies, and could do so through internal functions that parallel what real emotions do. These functions would take the form of "watchdog programs," constantly operating within the robot to keep an eye out for trouble. If such a program senses danger ahead,
