I. Introduction
Emotions are essential to being human: they influence our decisions and actions and therefore play a central role in communication and emotional intelligence. The ability to accurately understand, use, and manage emotions is critical to successful interaction. Affective computing aims to endow machines with emotional intelligence in order to enhance natural human-machine interaction (HMI) [1]. In human-robot interaction (HRI), robots must possess human-like capabilities to monitor, interpret, and express emotion. In complex interaction scenarios such as assistive, educational, and social robotics, a robot's ability to recognize emotion has a significant impact on the resulting social interactions. Several studies have examined the modalities (facial expressions, body movements, sounds, electrical sensor signals, etc.) that can convey emotional information from humans to robots, as well as how robots perceive emotional states [2]. Robots that can infer and interpret human emotions will communicate more effectively with humans. Recent studies have therefore focused on developing algorithms that classify emotional states from inputs such as facial expressions, body language, voice, and physiological cues. We report recent advances in emotion recognition (ER), particularly in the context of HRI. ER is challenging, especially when performed in a real HRI setting, which can differ significantly from the controlled environments in which most recognition experiments are usually conducted.