Abstract
Detecting a user's frustration on a smartphone could have a significant impact on applications such as intelligent tutoring systems or app-testing systems. Our goal is to provide a multimodal frustration detection system using data retrieved exclusively from a smartphone: motion sensor readings, touch gestures, and face videos recorded with the smartphone's front camera.
Original language | English |
---|---|
Title of host publication | CHI 2015 - Extended Abstracts Publication of the 33rd Annual CHI Conference on Human Factors in Computing Systems |
Subtitle of host publication | Crossings |
Publisher | Association for Computing Machinery |
Pages | 1307-1312 |
Number of pages | 6 |
ISBN (Electronic) | 9781450331463 |
DOIs | |
State | Published - 18 Apr 2015 |
Event | 33rd Annual CHI Conference on Human Factors in Computing Systems, CHI EA 2015 - Seoul, Korea, Republic of |
Duration | 18 Apr 2015 → 23 Apr 2015 |
Publication series
Name | Conference on Human Factors in Computing Systems - Proceedings |
---|---|
Volume | 18 |
Conference
Conference | 33rd Annual CHI Conference on Human Factors in Computing Systems, CHI EA 2015 |
---|---|
Country/Territory | Korea, Republic of |
City | Seoul |
Period | 18/04/15 → 23/04/15 |
Bibliographical note
Funding Information: This work was supported by DARPA grant FA8750-13-2-0279.
Keywords
- Frustration detection
- Multimodal system
- Multitask game