Volume 17, Issue 3
ISSN 1572-0373 • E-ISSN 1572-0381

Abstract

Robots should be able to represent emotional states in order to interact with people as social agents. In many cases, robots cannot have bio-inspired bodies, for instance because the task to be performed requires a special shape, as with home cleaners, package carriers, and many others. In these cases, emotional states have to be expressed through movements of the body. In this paper, we present a set of case studies aimed at identifying specific values to convey emotions through changes in linear and angular velocity, which might be applied to different non-anthropomorphic bodies. This work builds on some of the most widely considered theories of emotion expression and on emotion coding for people. We show that people can recognize some emotional expressions better than others, and we propose directions for expressing emotions through bio-neutral movement alone.
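The abstract describes conveying emotions on a non-anthropomorphic platform by varying only linear and angular velocity. As a minimal sketch of how such a mapping could be encoded, the Python fragment below pairs emotion labels with velocity targets; the specific numbers, emotion set, and names here are hypothetical illustrations, not the values identified in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MotionProfile:
    """Velocity targets for a differential-drive base."""
    linear_velocity: float   # m/s
    angular_velocity: float  # rad/s

# Hypothetical mapping: all values are illustrative placeholders,
# not the experimentally identified values from the paper.
EMOTION_PROFILES = {
    "anger":   MotionProfile(0.8, 2.0),  # fast, with sharp turns
    "fear":    MotionProfile(0.3, 1.5),  # slow advance, jittery rotation
    "sadness": MotionProfile(0.1, 0.2),  # slow and smooth
    "joy":     MotionProfile(0.6, 1.0),  # brisk and flowing
}

def velocity_command(emotion: str) -> tuple[float, float]:
    """Return the (linear, angular) velocity pair for an emotion label."""
    profile = EMOTION_PROFILES[emotion]
    return (profile.linear_velocity, profile.angular_velocity)
```

A motion controller could feed these pairs into its velocity loop, modulating an otherwise task-driven trajectory so the emotional cue rides on top of bio-neutral movement.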

doi: 10.1075/is.17.3.06ang
Published online: 2017-03-16