Volume 19, Issue 3
  • ISSN 1572-0373
  • E-ISSN: 1572-0381

Abstract

Frustration in traffic is one of the causes of aggressive driving. Knowing whether a driver is frustrated could be used by future advanced driver assistance systems to counteract this source of crashes. One way to achieve this is to automatically recognize drivers' facial expressions. However, little is known about the facial expressions of frustrated drivers. Here, we report the results of a driving simulator study investigating the facial muscle activity that accompanies frustration. Twenty-eight participants were videotaped during frustrated and non-frustrated driving situations, and their facial muscle activity was manually coded according to the Facial Action Coding System. During frustrated driving, participants showed significantly more facial muscle activity in the mouth region. Thus, recording facial muscle behavior potentially offers traffic researchers and assistance-system developers a way to recognize frustration while driving.
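As a rough illustration of the kind of analysis such FACS coding enables (not the authors' actual procedure), the sketch below compares per-participant counts of mouth-region Action Unit activations between frustrated and baseline drives using a hand-rolled paired t statistic. All counts, variable names, and the choice of test are invented for illustration.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical per-participant counts of mouth-region Action Unit
# activations (e.g. AU17, AU24) in frustrated vs. baseline drives.
# These numbers are made up; the study's real data are not reproduced here.
frustrated = [12, 9, 15, 11, 14, 10, 13, 16]
baseline   = [7, 8, 10, 6, 9, 8, 7, 11]

def paired_t(a, b):
    """Paired t statistic: mean within-participant difference
    divided by its standard error."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n))

t = paired_t(frustrated, baseline)
print(f"mean difference = {mean(x - y for x, y in zip(frustrated, baseline)):.2f}, t = {t:.2f}")
```

A positive t with a sufficiently large magnitude (compared against a t distribution with n − 1 degrees of freedom) would indicate more mouth-region activity under frustration, which is the direction of the effect the abstract reports.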

DOI: 10.1075/is.17005.ihm
2019-03-13
2019-03-20
  • Article Type: Research Article
Keyword(s): driving simulator, Facial Action Coding System, facial expressions, frustration
