Deep Learning-Based Facial Emotion Recognition for Driver Healthcare | IEEE Conference Publication | IEEE Xplore



Abstract:

This study proposes deep learning-based facial emotion recognition (FER) for driver healthcare. The FER system monitors the emotional state of the driver's face to identify driver negligence and provide immediate assistance for safety. This work uses a transfer learning-based framework for FER that will help in developing an in-vehicle driver assistance system. It applies transfer learning to SqueezeNet 1.1 to classify different facial expressions. Data preprocessing techniques such as image resizing and data augmentation have been employed to improve performance. The experimental study uses static facial expressions from several publicly available benchmark databases, namely CK+, KDEF, FER2013, and KMU-FED, to evaluate the model's performance. The comparison showed superiority over state-of-the-art methods only on the KMU-FED database, with a maximum accuracy of 95.83%, and comparable performance on the remaining benchmarks.
Date of Conference: 24-27 May 2022
Date Added to IEEE Xplore: 04 July 2022
Conference Location: Mumbai, India

I. Introduction

Driver behavior monitoring is vital in diagnosing a driver's health condition. Real-time driving monitoring for healthcare will help establish an intelligent transport system (ITS) for safe driving. The advancement of information and communication technology (ICT) creates scope for implementing driver assistance systems based on the human-computer interface (HCI) [1]. Driver healthcare is an essential and most crucial factor of ITS and plays a vital role in setting up safe driving in smart cities. With the increase in the number of motor vehicles, road accidents are also increasing. Most road accidents occur due to driver faults such as distraction from mobile phone use, aggressive driving, and impairment due to alcohol and drug consumption [2]–[5]. Driver status monitoring can be established using driving patterns obtained through steering wheel movements, physiological data captured through body sensors, and in-vehicle image data captured through a vehicle dashboard camera [6]. Facial expression recognition (FER), eye-closure analysis, head pose estimation, etc., will help identify driving anomalies. Facial expressions are based on six basic emotions: happiness, surprise, anger, sadness, fear, and disgust [7], [8]. These basic emotions involve non-verbal communicative signals that do not occur very frequently in regular personal interactions, so truly spontaneous instances are extremely difficult to record [9]. However, these six basic emotions deliver a powerful message to the surroundings for healthcare and safety [10].
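To make the anomaly-identification idea concrete, the snippet below sketches how an in-vehicle assistant might map classifier scores onto the six basic emotions and decide when to trigger assistance. The label order, the set of risk-related emotions, and the confidence threshold are hypothetical illustrations, not details from the paper.

```python
import math

EMOTIONS = ["happiness", "surprise", "anger", "sadness", "fear", "disgust"]
ALERT_EMOTIONS = {"anger", "sadness", "fear"}  # assumed risk-related states


def softmax(logits):
    """Convert raw classifier scores into probabilities."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [v / total for v in exps]


def assess_driver(logits, threshold=0.6):
    """Return the predicted emotion and whether to trigger assistance."""
    probs = softmax(logits)
    best = max(range(len(EMOTIONS)), key=probs.__getitem__)
    emotion = EMOTIONS[best]
    alert = emotion in ALERT_EMOTIONS and probs[best] >= threshold
    return emotion, alert


# Example: a frame scored strongly as "anger" triggers an alert.
print(assess_driver([0.1, 0.2, 4.0, 0.3, 0.1, 0.2]))  # ('anger', True)
```

In a deployed system the logits would come from the FER network applied to each dashboard-camera frame, with the alert feeding the driver assistance logic.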

References
[1] C. Bisogni, A. Castiglione, S. Hossain, F. Narducci and S. Umer, "Impact of deep learning approaches on facial expression recognition in healthcare industries", IEEE Trans. Indust. Informatics, 2022.
[2] H. Gjerde, P. T. Normann, A. S. Christophersen, S. O. Samuelsen and J. Mørland, "Alcohol, psychoactive drugs and fatal road traffic accidents in Norway: a case-control study", Accident Anal. Prevention, vol. 43, no. 3, pp. 1197-1203, 2011.
[3] A. Das, H. Gjerde, S. S. Gopalan and P. T. Normann, "Alcohol, drugs and road traffic crashes in India: a systematic review", Traffic Injury Preven., vol. 13, no. 6, pp. 544-553, 2012.
[4] Lacey et al., 2007 National Roadside Survey of Alcohol and Drug Use by Drivers: Drug Results, 2009.
[5] J. G. Ramaekers, G. Berghaus, M. van Laar and O. H. Drummer, "Dose related risk of motor vehicle crashes after cannabis use", Drug and Alcohol Dependence, vol. 73, no. 2, pp. 109-119, 2004.
[6] C. Marina Martinez, M. Heucke, F. Wang, B. Gao and D. Cao, "Driving style recognition for intelligent vehicle control and advanced driver assistance: A survey", IEEE Trans. Intell. Transp. Syst., vol. 19, no. 3, pp. 666-676, 2018.
[7] C. Darwin, The Expression of the Emotions in Man and Animals, John Murray, 1872.
[8] P. Ekman, "An argument for basic emotions", Cognition & Emotion, vol. 6, no. 3-4, pp. 169-200, 1992.
[9] M. Valstar, M. Pantic et al., "Induced disgust, happiness and surprise: an addition to the MMI facial expression database", Proc. 3rd Int. Workshop EMOTION (satellite of LREC): Corpora Res. Emotion Affect, p. 65, 2010.
[10] D. Keltner and P. Ekman, Emotion: An Overview, 2000.
[11] J. Shao and Y. Qian, "Three convolutional neural network models for facial expression recognition in the wild", Neurocomputing, vol. 355, pp. 82-92, 2019.
[12] Z. Fei, E. Yang, D. D.-U. Li, S. Butler, W. Ijomah, X. Li, et al., "Deep convolution network based emotion analysis towards mental health care", Neurocomputing, vol. 388, pp. 212-227, 2020.
[13] D. Lundqvist, A. Flykt and A. Öhman, "The Karolinska Directed Emotional Faces (KDEF)", CD ROM, Depart. Clinical Neuroscience, Psychology Sec., Karolinska Inst., vol. 91, no. 630, pp. 2-2, 1998.
[14] P. Lucey, J. F. Cohn, T. Kanade, J. Saragih, Z. Ambadar and I. Matthews, "The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression", Proc. IEEE Comput. Society Conf. Comput. Vis. Pattern Recogn. Works., pp. 94-101, 2010.
[15] Facial Expression Recognition 2013 Dataset (FER2013).
[16] M. Jeong and B. C. Ko, "Driver's facial expression recognition in real-time for safe driving", Sensors, vol. 18, no. 12, p. 4270, 2018.
[17] M. Jeong, J. Nam and B. C. Ko, "Lightweight multilayer random forests for monitoring driver emotional status", IEEE Access, vol. 8, pp. 60344-60354, 2020.
[18] Y. Zhou and B. E. Shi, "Action unit selective feature maps in deep networks for facial expression recognition", Proc. IEEE Int. Joint Conf. Neural Netw. (IJCNN), pp. 2031-2038, 2017.
[19] A. Sajjanhar, Z. Wu and Q. Wen, "Deep learning models for facial expression recognition", Proc. IEEE Digital Imag. Comput. Techniques Appl. (DICTA), pp. 1-6, 2018.
[20] A. Krishnadas and S. Nithin, "A comparative study of machine learning and deep learning algorithms for recognizing facial emotions", Proc. IEEE 2nd Int. Conf. Electron. Sustainable Commun. Syst. (ICESC), pp. 1506-1512, 2021.
[21] A. Aggarwal, S. Garg, R. Madaan and R. Kumar, "Comparison of different machine learning and deep learning emotion detection models", in Intell. Comput. Commun. Syst., Springer, pp. 401-408, 2021.
[22] A. Leone, A. Caroppo, A. Manni and P. Siciliano, "Vision-based road rage detection framework in automotive safety applications", Sensors, vol. 21, no. 9, p. 2942, 2021.
[23] Z. Fei, E. Yang, L. Yu, X. Li, H. Zhou and W. Zhou, "A novel deep neural network-based emotion analysis system for automatic detection of mild cognitive impairment in the elderly", Neurocomputing, vol. 468, pp. 306-316, 2022.
[24] H. Ma, T. Celik and H.-C. Li, "Lightweight attention convolutional neural network through network slimming for robust facial expression recognition", Signal Imag. Video Process., pp. 1-9, 2021.
[25] F. N. Iandola, S. Han, M. W. Moskewicz, K. Ashraf, W. J. Dally and K. Keutzer, "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model size", arXiv preprint, 2016.