Designing Facial Expressions of an Educational Assistant Robot by Contextual Methods - Focusing on KT Corp.'s Kibot II -

Translated title of the contribution: Designing Facial Expressions of an Educational Assistant Robot by Contextual Methods

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Robots are increasingly able to assist in human lives, and interest in human-robot interaction has grown over the last several years. However, while much attention has been devoted by engineers in robotics and computer science to identifying and classifying humans' motions and intentions, few studies have examined how a robot's behavior, motion, appearance, and facial expressions affect human perception. In particular, although facial expressions are an effective means of expressing a robot's intentions, there are few studies on designing facial expressions for robots or on evaluating the related processes and results. Hence, we explored how the facial expressions of educational assistant robots should be designed to enhance human-robot interaction.
Methods: In this study, we focused on an educational assistant robot and designed its facial expressions through a case study that included interviews, video analyses, and a contextual approach in which children envisioned possible interactions.
Results and Conclusion: The results showed that, among the six basic human emotions, an educational assistant robot should predominantly express emotions related to 'happiness' and 'sadness'. We suggested that each emotion be segmented into three intensity groups, and we designed matching facial expressions after verifying the intensity levels with participants.
Original language: Korean
Pages (from-to): 409-435
Number of pages: 27
Journal: 디자인학연구
Volume: 26
Issue number: 2
DOIs
State: Published - May 2013

