Assistive robots are anticipated to be among the most important service applications of future robotic systems. In this paper, we present the development of a unique noncontact socially assistive robot with a humanlike demeanor, intended for use in hospital wards and nursing/veteran homes in order to study its role in and impact on patient well-being, its ability to address patients' needs, and its overall effect on the quality of patient care. The robot is an embodied entity that engages in hands-off, noncontact social interaction with a patient during convalescence, rehabilitation, or end-of-life care. It has been designed as a platform incorporating the three design parameters of embodiment, emotion, and nonverbal communication to encourage natural human-robot interaction. Herein, we present the overall mechanical design of the socially assistive robot, focusing mainly on the actuation system of the face, head, and upper body. In particular, we propose a unique muscle actuation mechanism for the robotic face that allows the display of rich facial expressions during socially assistive interaction scenarios. The novelty of the actuation system lies in its exploitation of the dependencies among facial muscle activities to minimize the number of individual actuators required to control the robotic face.
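The actuator-minimization idea can be illustrated with a minimal sketch: when the activities of several facial muscles are correlated (e.g., symmetric brow raising), one actuator can drive them all through fixed coupling gains. The muscle names, coupling values, and the linear coupling model below are assumptions for illustration only, not the paper's actual mechanism or parameters.

```python
import numpy as np

# Hypothetical example: 3 actuators drive 6 coupled facial "muscles".
# Each column of C holds the fixed coupling gains from one actuator to
# every muscle it influences (all values are illustrative assumptions).
MUSCLES = ["frontalis_L", "frontalis_R", "corrugator",
           "zygomaticus_L", "zygomaticus_R", "depressor"]
C = np.array([
    [1.0, 0.0, 0.0],   # frontalis_L   <- actuator 0 (brow raise)
    [1.0, 0.0, 0.0],   # frontalis_R   <- same actuator (symmetric brows)
    [0.0, 1.0, 0.0],   # corrugator    <- actuator 1 (frown)
    [0.0, 0.0, 0.9],   # zygomaticus_L <- actuator 2 (smile)
    [0.0, 0.0, 0.9],   # zygomaticus_R <- same actuator
    [0.0, 0.4, 0.0],   # depressor co-activates weakly with the frown
])

def muscle_displacements(actuator_cmds):
    """Map a 3-element actuator command vector to 6 muscle displacements."""
    return C @ np.asarray(actuator_cmds, dtype=float)

# A single "smile" command moves both zygomaticus muscles at once,
# so two coupled muscles need only one actuator.
d = muscle_displacements([0.0, 0.0, 1.0])
```

Under this model the face needs only as many actuators as there are independent muscle groups, rather than one actuator per muscle.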
