
Early vs. Late Multimodal Fusion for Recognizing Confusion in Collaborative Tasks



Abstract:

There has been a rapid transformation in the medium of learning and communication due to the pandemic. Many people have adopted online video platforms to learn and work from anywhere in the world. Emotion detection is vital for understanding how well instructions are communicated through online interactions and for building cognitive systems that can identify human behavior. Confusion is a key emotion that can impact online learning and can be used to verify whether students using an online platform understand the material being taught. Our research expands on previous work on confusion detection, focusing on data fusion techniques. We explore the impact of early fusion (feature-level) versus late fusion (decision-level) on modeling confusion identification during a collaborative block-building task. Experiments with different classifiers show that late fusion performs better with larger time windows. This fusion approach can also aid in model interpretability.
Date of Conference: 10-13 September 2023
Date Added to IEEE Xplore: 16 January 2024
Conference Location: Cambridge, MA, USA

I. Introduction

Humans experience a wide range of emotions during communication, and in a collaborative context the types and intensity of those emotions can change rapidly. The pandemic has brought a substantial change in the medium of teaching, with online learning gaining popularity in many educational institutions. Affective computing specializes in developing systems that can identify and reproduce human emotions. Among the variety of emotions users may experience during online interactions, we focus on confusion. For the purposes of this study, we define confusion as a state of mind in which an individual is uncertain about the information communicated to them. Compared to happiness, sadness, or anger, confusion is a more subtle affective state and poses a detection challenge [1]. This study seeks to identify this ambiguous emotion using multimodal data.
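To illustrate the two fusion strategies compared in this work, the minimal sketch below (not the authors' code) contrasts early (feature-level) fusion, which concatenates per-window features from each modality before training a single classifier, with late (decision-level) fusion, which trains one classifier per modality and combines their predicted probabilities. The modality names, feature dimensions, random data, and the logistic-regression classifier are illustrative assumptions, not details taken from the paper.

    # Sketch only: toy data stands in for real per-window multimodal features.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n_windows = 200
    facial_feats = rng.normal(size=(n_windows, 16))   # hypothetical facial features per time window
    audio_feats = rng.normal(size=(n_windows, 8))     # hypothetical audio features per time window
    labels = rng.integers(0, 2, size=n_windows)       # 1 = confused, 0 = not confused

    # Early (feature-level) fusion: concatenate modalities, train one classifier.
    early_X = np.hstack([facial_feats, audio_feats])
    early_clf = LogisticRegression(max_iter=1000).fit(early_X, labels)
    early_probs = early_clf.predict_proba(early_X)[:, 1]

    # Late (decision-level) fusion: train one classifier per modality,
    # then combine their predicted probabilities (here, a simple average).
    facial_clf = LogisticRegression(max_iter=1000).fit(facial_feats, labels)
    audio_clf = LogisticRegression(max_iter=1000).fit(audio_feats, labels)
    late_probs = (facial_clf.predict_proba(facial_feats)[:, 1] +
                  audio_clf.predict_proba(audio_feats)[:, 1]) / 2
    late_preds = (late_probs >= 0.5).astype(int)

Because late fusion keeps a separate model per modality, each modality's contribution to the final decision remains visible, which is one way such an approach can aid interpretability.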
