Title

Maneuvering Robot Companions via Human Facial Expressions in Human-Robot Collaborative Tasks

School Name

South Carolina Governor's School for Science & Mathematics

Grade Level

12th Grade

Presentation Topic

Computer Science

Presentation Type

Mentored

Abstract

To maneuver robot companions in human-robot collaboration, most previous research has focused on external equipment, such as teach pendants or joysticks, to control robots. However, these approaches are not always intuitive and effective for human partners, particularly for people with disabilities. In this study, human facial expressions are employed to teach a collaborative robot to act according to changes in the human's emotions. A Radial Basis Function (RBF) network is used alongside the Local Binary Patterns (LBP) approach to extract human facial expressions and train the robot in human-robot collaboration contexts. Five distinct facial expressions (happy, sad, angry, surprised, and neutral) were designed, and 1,000 sets of samples were collected for each during training. Based on the learned knowledge, the robot can recognize different kinds of human facial expressions in real time and reply with a collaborative action for the human. Over a series of tests, we also found that the tested networks perform better when both training and post-testing use a single, solid-color background. In addition, objects such as thick glasses or headphones can hinder the robot's ability to recognize facial expressions.
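The LBP-plus-RBF pipeline described above can be sketched roughly as follows. This is an illustrative minimal sketch, not the authors' implementation: the function names `lbp_histogram` and `rbf_features` are assumptions, and a complete system would additionally train output weights for the RBF network on the collected expression samples.

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbor Local Binary Patterns for a grayscale image.

    Each interior pixel is compared with its 8 neighbors; a neighbor
    greater than or equal to the center sets one bit, giving an
    8-bit LBP code per pixel.
    """
    g = np.asarray(gray, dtype=np.int32)
    center = g[1:-1, 1:-1]
    # Neighbor offsets, clockwise from top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = g[1 + dy:g.shape[0] - 1 + dy,
                     1 + dx:g.shape[1] - 1 + dx]
        codes |= (neighbor >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(gray):
    """Normalized 256-bin histogram of LBP codes: the feature vector."""
    hist = np.bincount(lbp_image(gray).ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def rbf_features(x, centers, gamma=1.0):
    """Gaussian RBF activations of feature vector x against learned centers.

    In a full classifier these activations would feed a trained linear
    output layer mapping to the five expression classes.
    """
    d2 = np.sum((centers - x) ** 2, axis=1)
    return np.exp(-gamma * d2)
```

For a given face image, `lbp_histogram` produces the texture descriptor, and `rbf_features` measures its similarity to prototype centers (e.g., one or more per expression class); the class with the strongest weighted response would drive the robot's collaborative action.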

Location

Furman Hall 109

Start Date

3-28-2020 11:15 AM

Presentation Format

Oral Only

Group Project

No
