Abstract This paper focuses on the role of emotion and expressive behavior in regulating social interaction between humans and expressive anthropomorphic robots, whether in communicative or teaching scenarios. We present the scientific basis underlying our humanoid robot’s emotion models and expressive behavior, and then show how these scientific viewpoints have been adapted to the current implementation. Our robot is also able to recognize affective intent through tone of voice, the implementation of which is inspired by the scientific findings of the developmental psycholinguistics community. We first evaluate the robot’s expressive displays in isolation. Next, we evaluate the robot’s overall emotive behavior (i.e. the coordination of the affective recognition system, the emotion and motivation systems, and the expression system) as it socially engages naïve human subjects face-to-face.
© 2003 Elsevier Science Ltd. All rights reserved.
Keywords: Human–robot interaction; Emotion; Expression; Sociable humanoid robots
1. Introduction
Sociable humanoid robots pose a dramatic and intriguing shift in the way one
thinks about control of autonomous robots. Traditionally, autonomous robots are
designed to operate as independently and remotely as possible from humans, often
performing tasks in hazardous and hostile environments (such as sweeping
minefields, inspecting oil wells, or exploring other planets). Other applications such
as delivering hospital meals, mowing lawns, or vacuuming floors bring autonomous
robots into environments shared with people, but human–robot interaction in these
tasks is still minimal.
However, a new range of application domains (domestic, entertainment, health
care, etc.) is driving the development of robots that can interact and cooperate with
people as partners, rather than as tools. In the field of human–computer interaction
(HCI), research by Reeves and Nass (1996) has shown that humans (whether
computer experts, lay people, or computer critics) generally treat computers as they
might treat other people. From their numerous studies, they argue that a social
interface may be a truly universal interface (Reeves and Nass, 1996). Humanoid
robots (and animated software agents) are arguably well suited to this. Sharing a
similar morphology, they can communicate in a manner that supports the natural
communication modalities of humans. Examples include facial expression, body
posture, gesture, gaze direction, and voice. It is not surprising that studies such as
these have strongly influenced work in designing technologies that communicate
with and cooperate with people as collaborators.
The paper is organized as follows: first, we review a number of related engineering
efforts in building computer-animated and robotic systems that interact with people
in a social manner. Next, we introduce our expressive humanoid robot, Kismet, and
highlight our own efforts in building sociable humanoid robots that engage people
through expressive social cues (including emotive responses). Section 4 presents
those key principles from the theory of emotion and its expression that have inspired
the design of our robot’s emotion and expression systems. Sections 5 and 6 focus on
the computational models of emotion and motivation that have been implemented
on Kismet. The next section presents how Kismet’s expressive responses are
generated algorithmically. Section 8 evaluates the readability of Kismet’s facial
expressions, and Section 9 evaluates the robot’s ability to engage people socially
through its expressive responses. We conclude the paper with a discussion and
summary of results.
2. Embodied systems that interact with humans
There are a number of systems from different fields of research that are designed