Abstract
Facial expression is widely considered the most intuitive and effective nonverbal channel for conveying emotion. However, people with autism often have limited access to this rich communication channel because of difficulty reading facial expressions. To help them become aware of others' emotions, we developed a CNN-based facial expression recognition system using Microsoft HoloLens and explored three different modes for displaying a conversation partner's facial expressions, varying in their level of explicitness. Subjective feedback from a preliminary study with six pilot participants suggests that each mode is worth investigating for serving people with diverse needs and preferences who wish to receive augmented visual hints about others' emotions.
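The paper itself does not include implementation details beyond "CNN-based." As a rough, hedged illustration of what the forward pass of such a facial expression classifier involves, the sketch below runs a single convolution, ReLU, max-pooling, and softmax over a placeholder face crop. The 48×48 input size, the seven-class emotion set, and the random untrained weights are all assumptions for illustration, not the authors' actual architecture.

```python
import numpy as np

# Hypothetical 7-class emotion set commonly used in facial expression
# recognition datasets (an assumption; the paper does not list its classes).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def conv2d(img, kernel):
    """Valid-mode 2D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't divide evenly."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def predict(face, conv_kernel, dense_w, dense_b):
    """One conv layer -> ReLU -> 2x2 max pool -> dense softmax."""
    feat = np.maximum(conv2d(face, conv_kernel), 0.0)  # ReLU
    pooled = max_pool(feat).ravel()
    return softmax(pooled @ dense_w + dense_b)

rng = np.random.default_rng(0)
face = rng.random((48, 48))             # stand-in for a cropped face image
kernel = rng.standard_normal((3, 3)) * 0.1
pooled_dim = ((48 - 3 + 1) // 2) ** 2   # conv gives 46x46, pooling gives 23x23
w = rng.standard_normal((pooled_dim, len(EMOTIONS))) * 0.01
b = np.zeros(len(EMOTIONS))

probs = predict(face, kernel, w, b)
print(EMOTIONS[int(np.argmax(probs))])  # predicted label (meaningless: untrained)
```

In a deployed system the predicted label would then drive whichever of the three display modes is active on the headset; the real model would of course be trained and likely much deeper.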
Original language | English |
---|---|
Title of host publication | Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 435-437 |
Number of pages | 3 |
ISBN (Electronic) | 9781728147659 |
DOIs | |
State | Published - Oct 2019 |
Event | 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019 - Beijing, China Duration: 14 Oct 2019 → 18 Oct 2019 |
Publication series
Name | Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019 |
---|---|
Conference
Conference | 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019 |
---|---|
Country/Territory | China |
City | Beijing |
Period | 14/10/19 → 18/10/19 |
Bibliographical note
Publisher Copyright: © 2019 IEEE.
Keywords
- Autism
- Emotion recognition
- Facial expressions
- Mixed reality