
Multimodal semi-automated affect detection from conversational cues, gross body language, and facial features

S. D’Mello, A. Graesser

2010 · DOI: 10.1007/s11257-010-9074-4
User Modeling and User-Adapted Interaction · 330 Citations

TLDR

A multimodal affect detector is developed and evaluated that combines conversational cues, gross body language, and facial features via linear discriminant analyses to discriminate between naturally occurring experiences of boredom, engagement/flow, confusion, frustration, delight, and a neutral state.
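The detection approach described above, fusing features from several channels and classifying them with a linear discriminant, can be illustrated with a minimal sketch. This is not the authors' implementation: the channel names, feature dimensions, and synthetic data are assumptions, and for clarity it reduces the six-way problem to a two-class Fisher discriminant (e.g. boredom vs. engagement/flow).

```python
import numpy as np

rng = np.random.default_rng(42)

def make_class(mean_shift, n=200):
    # Hypothetical feature-level fusion: each sample concatenates three
    # channels -- dialogue (4 dims), posture (3 dims), face (5 dims) --
    # into one 12-dim vector. Dimensions and data are illustrative only.
    dialogue = rng.normal(mean_shift, 1.0, (n, 4))
    posture = rng.normal(mean_shift, 1.0, (n, 3))
    face = rng.normal(mean_shift, 1.0, (n, 5))
    return np.hstack([dialogue, posture, face])

X0 = make_class(0.0)  # stand-in for one affective state, e.g. boredom
X1 = make_class(1.0)  # stand-in for another, e.g. engagement/flow

# Two-class Fisher linear discriminant: w = Sw^{-1} (mu1 - mu0),
# with decision threshold at the midpoint of the projected class means.
mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)
threshold = w @ (mu0 + mu1) / 2.0

def predict(X):
    # Project fused features onto the discriminant axis and threshold.
    return (X @ w > threshold).astype(int)

acc = np.mean(np.r_[predict(X0) == 0, predict(X1) == 1])
print(f"training accuracy: {acc:.2f}")
```

A multi-class version, as in the paper, would instead use full linear discriminant analysis over all six affect categories; the fusion step (concatenating per-channel features before classification) is the same.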