표정 인지 및 모션 생성 모델을 활용하여 사람과 상호작용하는 로봇 시스템

Translated title of the contribution: A Robot System that Interacts with Human using Facial Expression Recognition and Motion Generation Model

Research output: Contribution to journal › Article › peer-review

Abstract

When people empathize, nonverbal cues such as facial expressions and gestures carry more weight than language. In human-robot interaction (HRI), it is therefore important for robots to recognize and understand human facial expressions, because this supports emotional communication and reduces the psychological distance between robots and humans. In this paper, we introduce a robot system that uses facial expression recognition and motion generation models to interact with humans. It recognizes human emotions using POSTER++, which is based on the Vision Transformer (ViT), and generates motions using FLAME, a diffusion-based motion generation model. In our scenario, we first recognize the facial expression with POSTER++ and generate an appropriate reaction text using the text generation model Gemini AI. We then feed the generated reaction text into FLAME to obtain 3D motion data, from which arm joint angles are computed using Forward Kinematics (FK) and Inverse Kinematics (IK). Focusing on scenarios involving human interaction, we use only the head and arm motors of the humanoid robot HUMIC. Furthermore, we simulate scenarios to verify that our system can recognize facial expressions and perform various reaction motions.
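
The abstract describes the FK/IK step only at a high level. As a rough illustration of how joint angles might be recovered from generated 3D motion data, the sketch below implements analytic inverse kinematics for a simplified planar two-link arm, with a forward-kinematics check. The link lengths, the 2D simplification, and the function names are assumptions for illustration only, not HUMIC's actual kinematic chain or the paper's formulation.

# A minimal sketch, assuming a planar two-link arm with hypothetical link lengths.
# It illustrates converting a wrist target taken from generated motion data into
# shoulder/elbow joint angles (IK), then verifying the result with FK.
import math

L1, L2 = 0.30, 0.25  # assumed upper-arm and forearm lengths in meters

def inverse_kinematics(x, y):
    """Return (shoulder, elbow) angles in radians that place the wrist at (x, y)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    c2 = max(-1.0, min(1.0, c2))  # clamp for numerical safety near the workspace boundary
    elbow = math.atan2(math.sqrt(1.0 - c2 * c2), c2)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

def forward_kinematics(shoulder, elbow):
    """Return the wrist position (x, y) for the given joint angles."""
    x = L1 * math.cos(shoulder) + L2 * math.cos(shoulder + elbow)
    y = L1 * math.sin(shoulder) + L2 * math.sin(shoulder + elbow)
    return x, y

if __name__ == "__main__":
    target = (0.35, 0.20)  # e.g., a wrist keypoint from the 3D motion data
    angles = inverse_kinematics(*target)
    print("joint angles (rad):", angles)
    print("FK check:", forward_kinematics(*angles))  # should reproduce the target

The sketch picks the elbow-down branch of the two possible IK solutions; a real humanoid arm such as HUMIC's has more degrees of freedom and joint limits that the paper would handle in its own FK/IK formulation.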
Original language: Korean
Pages (from-to): 335-346
Number of pages: 12
Journal: 로봇학회 논문지 (The Journal of Korea Robotics Society)
Volume: 19
Issue number: 4
DOIs
State: Published - 2024
