Multimodal frustration detection on smartphones

Esther Vasiete, Tom Yeh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Detecting a user's frustration on a smartphone could have a significant impact on applications such as intelligent tutoring systems or app-testing systems. Our goal is to provide a multimodal frustration detection system using data retrieved exclusively from a smartphone: motion sensor readings, touch gestures, and face videos recorded with the smartphone's front camera.
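The extended abstract does not publish implementation details, so the following Python sketch is purely illustrative: the hand-crafted features, the early (feature-level) fusion strategy, and the SVM classifier are assumptions for the sake of example, not the authors' published method.

# Hypothetical sketch: early fusion of the three smartphone modalities
# named in the abstract (motion, touch, front-camera face video).
# All feature choices and the classifier are illustrative assumptions.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def motion_features(accel):
    """Summary statistics over an accelerometer window (n_samples, 3)."""
    mag = np.linalg.norm(accel, axis=1)
    return np.array([mag.mean(), mag.std(), mag.max()])

def touch_features(gestures):
    """Aggregates over per-gesture tuples (duration_s, pressure, speed)."""
    g = np.asarray(gestures)
    return np.array([g[:, 0].mean(), g[:, 1].mean(), g[:, 2].std()])

def face_features(frame_scores):
    """Mean of per-frame facial-expression scores, assumed precomputed
    by some face-analysis step on front-camera frames (placeholder)."""
    return np.asarray(frame_scores).mean(axis=0)

def fuse(accel, gestures, frame_scores):
    """Concatenate per-modality features into one vector (early fusion)."""
    return np.concatenate([
        motion_features(accel),
        touch_features(gestures),
        face_features(frame_scores),
    ])

# Train a binary frustrated / not-frustrated classifier on fused vectors.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
# clf.fit(X, y)  # X: fused feature vectors per window, y: 0/1 labels

A late-fusion variant (one classifier per modality, combined by score averaging) would be an equally plausible reading of "multimodal" here; the sketch above picks early fusion only for brevity.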

Original language: English
Title of host publication: CHI 2015 - Extended Abstracts Publication of the 33rd Annual CHI Conference on Human Factors in Computing Systems
Subtitle of host publication: Crossings
Publisher: Association for Computing Machinery
Pages: 1307-1312
Number of pages: 6
ISBN (Electronic): 9781450331463
DOIs
State: Published - 18 Apr 2015
Event: 33rd Annual CHI Conference on Human Factors in Computing Systems, CHI EA 2015 - Seoul, Korea, Republic of
Duration: 18 Apr 2015 → 23 Apr 2015

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings
Volume: 18

Conference

Conference: 33rd Annual CHI Conference on Human Factors in Computing Systems, CHI EA 2015
Country/Territory: Korea, Republic of
City: Seoul
Period: 18/04/15 → 23/04/15

Bibliographical note

Funding Information:
This work was supported by DARPA grant FA8750-13-2-0279.

Keywords

  • Frustration detection
  • Multimodal system
  • Multitask game
