Pittsburgh (Nourbakhsh et al., 1999). In the entertainment market, there are a
growing number of synthetic pets, one of the best known being Sony’s robot dog
Aibo. Much of the research in the humanoid robotics community has focused on
traditional challenges of robot locomotion (e.g., Honda’s P3 bipedal walker (Hirai,
1998)) and upper-torso control for object manipulation tasks (e.g., ATR’s humanoid, DB). A few humanoid projects have explored the social dimension, such as Cog at
the MIT AI Lab (Brooks et al., 1999).
2.3. Expressive face robots
There are several projects that focus on the development of expressive robot faces,
ranging in appearance from being graphically animated (Bruce et al., 2001), to
resembling a mechanical cartoon (Takanobu et al., 1999; Scheef et al., 2000), to
pursuing a more organic appearance (Hara, 1998; Hara and Kobayashi, 1996). For
instance, researchers at the Science University of Tokyo have developed some of the
most human-like robotic faces (typically resembling a Japanese woman), which incorporate
hair, teeth, silicone skin, and a large number of control points (Hara, 1998) that map
to the facial action units of the human face (Ekman and Friesen, 1982). Using a
camera mounted in the left eyeball, the robot can recognize and produce a predefined
set of emotive facial expressions (corresponding to anger, fear, disgust, happiness,
sorrow, and surprise). A number of simpler expressive faces have been developed at
Waseda University, one of which can adjust its amount of eye-opening and neck
posture in response to light intensity (Takanobu et al., 1999). The robot, Feelix, by
Canamero and Fredslund (2001) is a Lego-based face robot used to explore tactile
and affective interactions with people. It is increasingly common to integrate
expressive faces with mobile robots that engage people in an educational or
entertainment setting, such as museum tour guide robots (Nourbakhsh et al., 1999;
Burgard et al., 1998).
As expressive faces are incorporated into service or entertainment robots, there is
a growing interest in understanding how humans react to and interact with them.
For instance, Kiesler and Goetz (2002) explored techniques for characterizing
people’s mental models of robots and how this is influenced by varying the robot’s
appearance and dialog to make it appear either more playful and extraverted or
more caring and serious. Bruce et al. (2001) investigated people’s willingness to
engage a robot in a short interaction (i.e., taking a poll) based on the presence or
absence of an expressive face and the ability to indicate attention.
3. Kismet and the sociable machines project
The ability for people to naturally communicate with such machines is important.
However, for suitably complex environments and tasks, the ability for people to
intuitively teach these robots will also be important. Ideally, the robot could engage
in various forms of social learning (imitation, emulation, tutelage, etc.), so that one
could teach the robot just as one would teach another person. Learning by
demonstration to acquire physical skills such as pole balancing (Atkeson and Schaal,
1997a,b; Schaal, 1997), learning by imitation to acquire a proto-language (Billard,
2002), and learning to imitate in order to produce a sequence of gestures (Demiris
and Hayes, 2002; Mataric, 2000) have been explored on physical humanoid robots
and physics-based animated humanoids. Although current work in imitation-based learning with humanoid robots has predominantly focused on articulated motor
coordination, social and emotional aspects can play a profound role in building
robots that can communicate with and learn from people.
The Sociable Machines Project develops an expressive anthropomorphic robot called Kismet.