
Analyzing the Effect of Diverse Gaze and Head Direction on Facial Expression Recognition With Photo-Reflective Sensors Embedded in a Head-Mounted Display


Abstract:

Photo-reflective sensors embedded in a Head-Mounted Display (HMD) have been used as one technique for recognizing the facial expressions of HMD users. In this paper, we investigate how gaze and face directions affect facial expression recognition with such embedded photo-reflective sensors. First, we collected a dataset of five facial expressions (Neutral, Happy, Angry, Sad, Surprised) performed while looking in diverse directions by moving 1) the eyes and 2) the head. Using this dataset, we analyzed the effect of gaze and face directions by constructing facial expression classifiers under five conditions and evaluating the classification accuracy of each. The results revealed that a single classifier trained on the data for all gaze points achieved the highest classification performance. We then investigated which facial parts were affected by gaze and face direction. The results showed that gaze direction affected the upper facial parts, while face direction affected the lower facial parts. In addition, after removing the bias of facial expression reproducibility, we investigated the pure effect of gaze and face directions under three conditions. The results showed that, for gaze direction, building a classifier for each direction significantly improved classification accuracy, whereas for face direction the differences between classifier conditions were slight. Our experimental results imply that multiple classifiers corresponding to multiple gaze and face directions improve facial expression recognition accuracy, but that collecting data on the vertical movement of gaze and face is a practical solution for improving recognition accuracy.
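
As a rough illustration of the classifier comparison described above, the sketch below trains one classifier on data pooled over all gaze directions and separate classifiers for each direction, then compares cross-validated accuracy. It is not the authors' implementation: the feature vectors are synthetic stand-ins for the photo-reflective sensor readings, and the sensor count, gaze-direction labels, and SVM model are assumptions.

```python
# Minimal sketch (synthetic data, assumed model) of comparing a single pooled
# classifier against per-gaze-direction classifiers for expression recognition.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

N_SENSORS = 16  # number of photo-reflective sensors in the HMD (assumed)
EXPRESSIONS = ["Neutral", "Happy", "Angry", "Sad", "Surprised"]
GAZE_DIRECTIONS = ["center", "up", "down", "left", "right"]  # assumed labels

# Synthetic dataset: each sample is one sensor snapshot with an expression
# label and the gaze direction under which it was recorded.
n_per_cell = 40
X, y, gaze = [], [], []
for g_idx, g in enumerate(GAZE_DIRECTIONS):
    for e_idx, e in enumerate(EXPRESSIONS):
        base = rng.normal(loc=e_idx, scale=0.3, size=(n_per_cell, N_SENSORS))
        base += 0.2 * g_idx  # gaze-dependent offset in the sensor readings
        X.append(base)
        y += [e] * n_per_cell
        gaze += [g] * n_per_cell
X = np.vstack(X)
y = np.array(y)
gaze = np.array(gaze)

def make_clf():
    # Standardize sensor values, then classify with an RBF-kernel SVM.
    return make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Condition 1: one classifier trained on data from all gaze directions.
pooled_acc = cross_val_score(make_clf(), X, y, cv=5).mean()
print(f"pooled classifier accuracy: {pooled_acc:.3f}")

# Condition 2: a separate classifier for each gaze direction.
for g in GAZE_DIRECTIONS:
    mask = gaze == g
    acc = cross_val_score(make_clf(), X[mask], y[mask], cv=5).mean()
    print(f"per-direction classifier ({g}): {acc:.3f}")
```

With real sensor data, the same structure extends to face-direction conditions by grouping samples on a head-direction label instead of the gaze label.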
Published in: IEEE Transactions on Visualization and Computer Graphics ( Volume: 29, Issue: 10, 01 October 2023)
Page(s): 4124 - 4139
Date of Publication: 02 June 2022

PubMed ID: 35653450

1 Introduction

Facial expressions play an important role in communication in both physical and virtual environments. The emergence of low-cost consumer Head-Mounted Displays (HMDs) has made remote communication in Virtual Reality (VR) common. Furthermore, the COVID-19 pandemic has promoted VR applications that allow us to have immersive experiences without meeting in the physical environment. In a virtual environment, VR applications typically use avatars to represent the user. Transferring not only verbal but also non-verbal information, such as body gestures and facial expressions, between users and avatars enriches interactions. Such cases do not always require photo-realistic facial expressions; even with non-photo-realistic avatars, we can have social interaction. VR social services such as VRChat (https://hello.vrchat.com/) provide functions that switch avatar facial expressions, indicating that discrete facial expressions also suffice for social interaction. Despite the wide use of camera-based approaches for facial expression recognition, an immersive HMD occludes most of the face, so cameras cannot capture facial expressions in ordinary configurations.
