Volume 24, Issue 1
  • ISSN: 1572-0373
  • E-ISSN: 1572-0381

Abstract

Human–Human and Human–Robot Interaction are known to be influenced by a variety of modalities and parameters. Nevertheless, it remains a challenge to anticipate how a given mobile robot’s navigation and appearance will affect how humans perceive it. Drawing a parallel with vocal prosody, we introduce the notion of movement prosody, which encompasses the spatio-temporal and appearance dimensions involved in a person’s perceptual experience of interacting with a mobile robot. We design a novel robot motion corpus, encompassing variables related to the kinematics, gaze, and appearance of the robot, which we hypothesize are involved in movement prosody. Initial results from three perception experiments suggest that these variables significantly influence participants’ perceptions of the robot’s socio-affects and physical attributes.
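
A minimal sketch of how such a corpus design, crossing kinematic, gaze, and appearance variables, might be enumerated as a factorial set of experimental conditions; the variable names and levels below are illustrative assumptions, not the article's actual coding scheme:

from dataclasses import dataclass
from itertools import product

# Hypothetical encoding of one corpus condition. The field names and
# level labels are assumptions for illustration only.
@dataclass(frozen=True)
class MotionCondition:
    velocity_profile: str   # kinematics, e.g. "linear" vs. "smoothed"
    gaze: str               # e.g. "toward_person" vs. "averted"
    appearance: str         # e.g. "bare_platform" vs. "dressed"

def build_corpus() -> list[MotionCondition]:
    """Enumerate the full factorial design over the three variable families."""
    velocities = ["linear", "smoothed"]
    gazes = ["toward_person", "averted"]
    appearances = ["bare_platform", "dressed"]
    return [MotionCondition(v, g, a)
            for v, g, a in product(velocities, gazes, appearances)]

if __name__ == "__main__":
    # Eight conditions in this assumed 2 x 2 x 2 design, each of which
    # would be recorded as one stimulus for the perception experiments.
    for condition in build_corpus():
        print(condition)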

  • Article Type: Research Article
  • Keywords: corpus; HRI; prosody of movement