Volume 20, Issue 3
  • ISSN: 1572-0373
  • E-ISSN: 1572-0381



Endowing robots with artificial emotions makes human-robot interaction more personalised, more natural, and therefore more likeable. For humanoid robots with constrained facial expressiveness, the literature concentrates on expressing emotions through other non-verbal interaction channels. In multi-modal communication, however, it is important to understand the effect of combining such non-verbal cues, whereas most prior work has addressed only the role of single channels in human recognition performance. Here, we analyse the effect of combining different animations that express either the same emotion or different ones. Results show that when an emotion is successfully expressed through a single channel, combining that channel with other animations, which may have lower recognition rates, appears to make the expression less communicative.
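The finding can be illustrated with a toy calculation: if one channel already achieves a high recognition rate on its own, pairing it with weaker channels need not improve, and may lower, the combined rate. The sketch below uses invented numbers purely for illustration; the channel names and rates are assumptions, not the study's data.

```python
# Hypothetical illustration of the abstract's finding: combining a
# well-recognised channel with lower-recognition ones can yield a
# combined animation that communicates the emotion less clearly.
# All values below are invented for illustration only.

single_channel = {           # hypothetical per-channel recognition rates
    "body_posture": 0.85,    # emotion recognised well on its own
    "eye_colour": 0.40,
    "sound": 0.55,
}

combined = {                 # hypothetical rates for combined animations
    ("body_posture", "eye_colour"): 0.62,
    ("body_posture", "sound"): 0.70,
}

best_single = max(single_channel.values())
for channels, rate in combined.items():
    if rate < best_single:
        print(f"{' + '.join(channels)}: {rate:.2f} "
              f"(below best single channel, {best_single:.2f})")
```

Under these assumed numbers, every combined animation scores below the strongest single channel, mirroring the pattern the study reports.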






  • Article Type: Research Article
Keyword(s): coherent and incoherent composition; emotion recognition; non-verbal cues

