
Deep Learning-Based Facial Emotion Recognition for Driver Healthcare


Abstract:

This study proposes deep learning-based facial emotion recognition (FER) for driver healthcare. The FER system monitors the emotional state of the driver's face to identify driver negligence and provide immediate assistance for safety. This work uses a transfer learning-based framework for FER that will help in developing an in-vehicle driver assistance system. It applies transfer learning to SqueezeNet 1.1 to classify different facial expressions. Data preprocessing techniques such as image resizing and data augmentation have been employed to improve performance. The experimental study uses static facial expressions from several publicly available benchmark databases, namely CK+, KDEF, FER2013, and KMU-FED, to evaluate the model's performance. In the performance comparison, the proposed model surpassed state-of-the-art methods only on the KMU-FED database, with a maximum accuracy of 95.83%, while achieving comparable performance on the remaining benchmark databases.
Date of Conference: 24-27 May 2022
Date Added to IEEE Xplore: 04 July 2022
Conference Location: Mumbai, India

I. Introduction

Driver behavior monitoring is vital in diagnosing a driver's health condition. Real-time health monitoring while driving will help establish an intelligent transport system (ITS) for safe driving. The advancement of information and communication technology (ICT) creates scope for implementing driver assistance systems based on the human-computer interface (HCI) [1]. Driver healthcare is an essential and most crucial factor of ITS and plays a vital role in establishing safe driving in smart cities. With the increase in the number of motor vehicles, road accidents are also increasing. Most road accidents occur due to driver error, such as distraction caused by mobile phone use, aggressive driving, or impairment due to alcohol and drug consumption [2]–[5]. Driver status can be monitored using driving patterns obtained through steering wheel movements, physiological data captured through body sensors, and in-vehicle image data captured through a vehicle dashboard camera [6]. Facial expression recognition (FER), eye-closure analysis, head pose estimation, etc., help identify driving anomalies. Facial expressions are based on six basic emotions: happiness, surprise, anger, sadness, fear, and disgust [7], [8]. These basic emotions involve non-verbal communicative signals that do not occur very frequently in everyday personal interactions, and truly spontaneous instances are extremely difficult to record [9]. Nevertheless, these six basic emotions deliver a powerful message to the surroundings for healthcare and safety [10].

