I. Introduction
Identification of universal facial expressions [1] is currently a wide area of research. The Facial Action Coding System (FACS) [2] is a comprehensive, anatomy-based system for describing all visually discernible facial movements. It breaks facial expressions down into individual components of muscle movement, called Action Units (AUs), which provide different cues about facial expressions and act as indicators for identifying abnormalities. Many attempts have been made to improve the detection of AUs in facial features. Research [3] addresses the difficulties that existing AU detection mechanisms face in learning facial features and provides a comprehensive study of the AUs controlling head and face rotation in adults. Research [4] focuses on the effect of differing domains, such as gender diversity, participant diversity, and video resolution, on detecting AU behaviors, and identifies the considerations to be taken when applying AU classifiers from one domain to another. Although many studies have focused on the detection of AUs in facial features, less work has been carried out on identifying the mapping of AUs in relation to emotional stimulation.