Whereas anthropomorphic robots have bodies that look and physically act like the human body, anthropopathic robots are able to emote. The robots discussed in this section not only perceive and respond to human emotion, but are themselves possessed of an intrinsic emotional system that permeates their control architecture. For these humanoids, emotional state is not merely an outward expression, but can be used to influence the actions and behavior of the robot.
The robot Kismet is capable of using emotional modeling to guide interaction with humans. Researchers at MIT have studied children and adults from different cultures to see how effectively Kismet can engage them in social interaction. Kismet responds not only to speech, but also to a variety of multi-modal body language, including body posture, the distance of the human from the robot, the movements of the human, and the tone, volume and prosody of their speech. One of the underlying premises of the Kismet project is that emotion is necessary to guide productive learning and communication in general.
Various emotional states produced by Kismet, a robot developed at the MIT AI laboratory, in response to interaction with humans.
A project in Palo Alto, Calif., has created intuitive, human-friendly computer interfaces that can move through space, use a rich suite of senses, demonstrate a variety of physical actions, and communicate with affect instead of syntax. The goal is to produce lifelike, realistic motion that adheres to basic rules of biological entities:
- Overlapping Action: Actions performed in the real world (such as reaching and grabbing) should overlap.
- Follow Through: Actions rarely stop abruptly once the goal has been reached.
- Anticipation: Animals prepare before entering into an action.
- Arcs: Body movements in the real world usually follow curved paths.
- Ambient Motion: Biological bodies are never completely still.
- Ease-In/Ease-Out: Motions begin gradually, accelerate, and then decelerate to a halt.
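The ease-in/ease-out rule above can be sketched in code. The snippet below uses a standard "smoothstep" easing curve; this is one common realization of the principle, not necessarily the implementation the Palo Alto group used:

```python
def ease_in_out(t: float) -> float:
    """Smoothstep easing: velocity is zero at both endpoints, so motion
    starts gradually, speeds up in the middle, then slows to a halt."""
    t = max(0.0, min(1.0, t))          # clamp normalized time to [0, 1]
    return t * t * (3.0 - 2.0 * t)

def interpolate(start: float, goal: float, t: float) -> float:
    """Position along a motion from start to goal at normalized time t."""
    return start + (goal - start) * ease_in_out(t)

# Sample a reach from joint angle 0.0 to 1.0 radians in eleven steps;
# the spacing of the samples is tight at both ends (slow) and wide in
# the middle (fast).
trajectory = [interpolate(0.0, 1.0, i / 10) for i in range(11)]
```

The same easing function can drive any of the degrees of freedom described below; follow-through and anticipation can be layered on by extending or offsetting the time window.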
To enact these behavioral rules, the authors designed small, mobile robots with six degrees of freedom on the face (2 eyebrows, 2 eyelids, upper lip and lower lip) and six degrees of freedom for the body (head yaw and pitch, neck, wheels and back). Using these degrees of freedom, the robots can exhibit compelling emotional responses computed using a periodic function generator that incorporates random variation as smoothly varying noise and factors in the robot's current emotional state. The result is a group of robots that are inherently social. The robots perform tasks based on their emotional "mood." For instance, if the robot is sad, it may perform actions slowly, whereas if it is angry, it may proceed violently.
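The source does not give the generator's exact form. As a minimal sketch, assume the emotional state is summarized by a single scalar "arousal" value (low for sad, high for angry) that modulates both the tempo and the amplitude of a periodic signal, with a bounded random walk supplying the smoothly varying noise:

```python
import math
import random

def smooth_noise(prev: float, strength: float = 0.1) -> float:
    """Smoothly varying noise: a bounded random walk, so successive
    samples stay close together and ambient motion never jumps."""
    return max(-1.0, min(1.0, prev + random.uniform(-strength, strength)))

def motion_sample(t: float, arousal: float, base_freq: float = 1.0,
                  noise: float = 0.0) -> float:
    """One sample of a periodic motion signal. Arousal (0..1) scales
    both tempo and amplitude: a sad robot moves slowly and faintly,
    an angry one quickly and broadly."""
    freq = base_freq * (0.5 + arousal)
    amplitude = 0.3 + 0.7 * arousal
    return amplitude * math.sin(2.0 * math.pi * freq * t) + 0.1 * noise

# One second of ambient motion (50 samples) for a calm robot:
noise_state = 0.0
samples = []
for i in range(50):
    noise_state = smooth_noise(noise_state)
    samples.append(motion_sample(i / 50.0, arousal=0.2, noise=noise_state))
```

Each degree of freedom (an eyebrow, the head pitch, and so on) would run its own instance of such a generator, which is why the robots are never completely still even when idle.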
By constructing an emotion-exhibiting robot head, a humanoid project at Waseda University is taking steps toward human-friendly robots that can emote. The project investigates the confluence of physical and psychological dimensions of intelligence by connecting four modes of sensation to differential equations that continuously compute the emotional state of the robot. The latest version of the robot is called WE-3RIII (Waseda Eye-No. 3 Refined version III), and its four sensory modalities are vision, hearing, cutaneous sensation and smell. To express emotion, the robot is equipped with eyebrows, eyelids, lips, a jaw and facial skin that can change color to convey states such as anger or embarrassment. Its cutaneous sensation allows it to perceive when it is being pushed, stroked or hit. The robot can also sense warmth near its face and can perceive strong smells such as alcohol, smoke and ammonia.
The Waseda Eye-No.3 Refined Version III humanoid robotic head. Humanoid Project, Waseda University.
Using 24 degrees of freedom, the robot expresses variations of seven emotional states: normal, happiness, surprise, anger, disgust, fear and sadness. Emotional state is based on the solutions of differential equations defined within a three-dimensional coordinate space. For example, as the robot perceives a push, stroke or hit from a human, it recognizes the action and maps it to an emotional space composed of axes for pleasantness, certainty and activation (sleep to arousal). As one might expect, the robot finds repeated abuse unpleasant, sleeps when it receives little stimulation, and uses recognition of objects to assign certainty. The goal is not merely to give the appearance of emotion. Rather, emotion is tied intrinsically to the way the robot performs tasks. The robot's interactions with the environment affect its emotional state and, vice versa, its emotional state affects the way it acts. Although derived from purely mathematical equations, the robot compels strong emotional responses from the human viewer.
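The Waseda equations themselves are not given here, but the mapping from stimuli to a point moving through the three-dimensional emotional space can be illustrated with a simple linear model: the state decays toward a neutral origin while stimuli push it away. Everything below (the decay rate, the stimulus vector, Euler integration) is an assumed stand-in, not the project's actual formulation:

```python
# 3-D emotional space: (pleasantness, activation, certainty).
DECAY = 0.5  # assumed relaxation rate back toward the neutral origin

def step(state, stimulus, dt=0.1):
    """One Euler step of d(state)/dt = -DECAY * state + stimulus."""
    return tuple(s + dt * (-DECAY * s + u) for s, u in zip(state, stimulus))

# Hypothetical stimulus vector for a hit: unpleasant, arousing, and
# (because the action is recognized) fairly certain.
HIT = (-1.0, 2.0, 0.5)

state = (0.0, 0.0, 0.0)            # start at neutral
for _ in range(30):                # three simulated seconds of repeated hits
    state = step(state, HIT)
# The state converges toward the fixed point stimulus / DECAY:
# pleasantness falls and activation rises, so repeated abuse
# leaves the robot unpleasant and aroused.
```

With no stimulus, the same equation drives all three coordinates back toward zero, which is one way to capture the observation that the robot "sleeps" (activation drifts low) when it receives little stimulation.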
- David Bruemmer