Research Papers

The Design of an Expressive Humanlike Socially Assistive Robot

Author and Article Information
Brian Allison, Emmeline Kao

Autonomous Systems Laboratory, Department of Mechanical Engineering, State University of New York (SUNY) at Stony Brook, Stony Brook, NY 11794-2300

Goldie Nejat1

Autonomous Systems Laboratory, Department of Mechanical Engineering, State University of New York (SUNY) at Stony Brook, Stony Brook, NY 11794-2300; goldie.nejat@stonybrook.edu

1Corresponding author.

J. Mechanisms Robotics 1(1), 011001 (Jul 29, 2008) (8 pages) doi:10.1115/1.2959097 History: Received March 24, 2008; Revised June 21, 2008; Published July 29, 2008

It is anticipated that assistive robots will be one of the most important service applications of future robotic systems. In this paper, the development of a unique noncontact socially assistive robot with a humanlike demeanor is presented for use in hospital wards and nursing/veteran homes, in order to study its role and impact on the well-being of patients, its ability to address patients' needs, and its overall effect on the quality of patient care. The robot will be an embodied entity that participates in hands-off, noncontact social interaction with a patient during the convalescence, rehabilitation, or end-of-life care stage. The robot has been designed as a platform incorporating the three design parameters of embodiment, emotion, and nonverbal communication to encourage natural human-robot interactions. Herein, we present the overall mechanical design of the socially assistive robot, focusing mainly on the development of the actuation system of the face, head, and upper body. In particular, we propose a unique muscle actuation mechanism for the robotic face that allows the display of rich facial expressions during socially assistive interaction scenarios. The novelty of the actuation system lies in its use of the dependencies among facial muscle activities to minimize the number of individual actuators required to control the robotic face.
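The abstract's central idea, exploiting dependencies among facial muscle activities so that a few actuators can drive many facial control nodes, can be sketched as a fixed coupling matrix mapping a short actuator command vector onto per-node displacements. The matrix values, node names, and actuator assignments below are purely illustrative assumptions, not values from the paper:

```python
# Hypothetical sketch: a few actuators drive many facial control nodes
# through a fixed coupling matrix encoding muscle-group dependencies.
# All numbers and labels here are illustrative, not from the paper.

# Each row is one control node; each column is one actuator.
# Entry (i, j) = displacement of node i per unit travel of actuator j.
COUPLING = [
    # brow   eyelid  mouth-corner   (assumed actuator assignment)
    [1.0,    0.0,    0.0],  # inner brow node
    [0.7,    0.3,    0.0],  # outer brow node (coupled to brow + eyelid)
    [0.0,    1.0,    0.0],  # upper eyelid node
    [0.0,    0.0,    1.0],  # mouth corner node
    [0.0,    0.0,    0.6],  # lower lip node (follows the mouth corner)
]

def node_displacements(actuator_cmds):
    """Map a short actuator command vector to displacements of all nodes."""
    return [sum(c * a for c, a in zip(row, actuator_cmds)) for row in COUPLING]

# A "happy"-like pose: raise the mouth corners, brows and eyelids neutral.
# Three actuator commands set five node displacements.
happy = node_displacements([0.0, 0.0, 1.0])
```

The benefit is the column count: the robot needs only as many actuators as there are independent command channels, while the row count (control nodes, e.g., the CANDIDE-3 nodes identified in Fig. 4) can be much larger.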

Copyright © 2008 by American Society of Mechanical Engineers


Figures

Figure 1: Socially assistive robot Brian

Figure 3: Muscle group locations and direction of motion

Figure 4: CANDIDE-3 wireframe model with identified control nodes: (a) neutral, (b) happy, (c) surprise, (d) angry, (e) disgust, (f) sad, and (g) fear

Figure 5: Six basic expressions: (a) neutral, (b) happy, (c) surprise, (d) angry, (e) disgust, (f) sad, and (g) fear

Figure 6: Robot muscle structure and corresponding control nodes

Figure 7: Actuation mechanism: (a) lever mechanism, (b) top view of the track system, and (c) bottom view of the track system

Figure 8: Lever arms of the actuation mechanism

Figure 9: Muscle actuation

Figure 10: Robot emotional state: (a) neutral, (b) happy, (c) surprise, (d) angry, (e) disgust, (f) sad, and (g) fear
