Teaching Robot Companions to Assist Humans via Natural Language and Gestures
School Name
South Carolina Governor's School for Science & Mathematics
Grade Level
12th Grade
Presentation Topic
Engineering
Presentation Type
Mentored
Oral Presentation Award
2nd Place
Abstract
Manufacturing workplaces, especially automotive assembly, often require the transport of heavy parts, and today many of those lifts are performed by pre-programmed machines or by people. In a more dynamic setting, however, a robot may need to go off script, for example stopping on command while a part is added, removed, or adjusted. Pre-programmed machines lack this flexibility, and some parts are simply too heavy for people. To this end, we develop a teaching-learning framework in which the robot learns from multi-modal human demonstrations to assist its human partner in collaborative tasks. With our approach, humans can teach robots to carry heavy parts via natural language and gestures, much as teachers teach students. The Myo Armband is employed to acquire human gestures and to parametrize human driving modes for training the robot. Natural language processing and structured dialogue, combined with the wearable sensing, allow the human to communicate with the robot and produce a functioning, dynamically moving system. The robot then learns each driving mode using the Random Forests (RF) algorithm. The proposed approach is implemented on a smart companion robot that assists the human in carrying a heavy part around an assembly line. Testing results suggest that human labor can be reduced through natural language and wearable sensing, and that the robot can effectively collaborate with the human to accomplish the shared task in collaborative manufacturing contexts.
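To make the learning step concrete, the following is a minimal sketch (not the authors' code) of how windowed Myo Armband EMG signals might be summarized into simple features and classified into driving modes with a Random Forest. The feature choices, window size, and mode labels here are illustrative assumptions, and the data is synthetic.

```python
# Illustrative sketch: Random Forest classification of "driving modes"
# from windowed 8-channel EMG, in the spirit of the abstract.
# Feature names, window sizes, and mode labels are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

RNG = np.random.default_rng(0)

def emg_window_features(window):
    """Summarize one EMG window (samples x 8 channels) with simple
    time-domain features: mean absolute value and waveform length per channel."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

# Synthetic stand-in for demonstration data: three hypothetical driving modes,
# 200 windows each, each window 50 samples x 8 EMG channels.
modes = ["follow", "hold", "side_by_side"]
X, y = [], []
for label, mode in enumerate(modes):
    for _ in range(200):
        window = RNG.normal(loc=label, scale=1.0, size=(50, 8))
        X.append(emg_window_features(window))
        y.append(label)
X, y = np.array(X), np.array(y)

# Train on a split of the demonstrations and check held-out accuracy.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In a deployed system the predicted mode would be mapped to a robot driving behavior, with natural language commands (e.g., a spoken "stop") overriding the gesture-driven mode.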
Recommended Citation
Bertram, Michael, "Teaching Robot Companions to Assist Humans via Natural Language and Gestures" (2019). South Carolina Junior Academy of Science. 165.
https://scholarexchange.furman.edu/scjas/2019/all/165
Location
Founders Hall 250 B
Start Date
3-30-2019 11:30 AM
Presentation Format
Oral Only
Group Project
No