Abstract:
Studying eye movement has proven useful for detecting and understanding human emotional states. This paper investigates the eye movement features of pupil size, time to first fixation, first fixation duration, fixation duration, and fixation count under emotional stimulation with film clips. Thirty-seven subjects' pupil responses were measured while they watched pleasant and unpleasant emotional clips. The results showed that fixation duration and fixation count differed significantly between the pleasant and unpleasant clips. These results suggest that measures of eye fixation may be a potentially useful computer input for detecting positive and negative emotional states.
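The core analysis the abstract describes, comparing a fixation feature between the pleasant and unpleasant clip conditions, can be sketched as an independent-samples t-test. The sketch below uses only the Python standard library; the subject values are invented placeholders for illustration, not the paper's measurements.

```python
import math
import statistics

def independent_t(a, b):
    """Pooled-variance two-sample t statistic for two groups of scores."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    se = math.sqrt(pooled_var * (1 / na + 1 / nb))
    return (statistics.mean(a) - statistics.mean(b)) / se

# Hypothetical mean fixation durations (ms), one value per subject
pleasant = [212.0, 198.5, 240.1, 225.3, 210.8, 232.6]
unpleasant = [251.4, 263.0, 244.7, 270.2, 258.9, 249.5]

t = independent_t(pleasant, unpleasant)
print(f"t = {t:.2f}")  # compare |t| against the critical value for df = 10
```

A large |t| relative to the critical value for n1 + n2 - 2 degrees of freedom would indicate a significant difference between the two clip conditions, which is the kind of result the abstract reports for fixation duration and fixation count.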
Published in: 2013 Science and Information Conference
Date of Conference: 07-09 October 2013
Date Added to IEEE Xplore: 14 November 2013
Electronic ISBN: 978-0-9893193-0-0
Conference Location: London, UK