Volume 26, Issue 3
ISSN 1572-0373 · E-ISSN 1572-0381
Abstract

Social robotics is a multidisciplinary field focused on designing and implementing robots capable of interacting with humans in social environments. Group conversations, however, challenge robots to interpret social signals and participate effectively. This study evaluates control policies for moderating multi-party conversation dynamics with a humanoid robot. The system employs a cloud-based framework that computes each speaker's dominance as a weighted combination of speaking time and word count, while the Louvain algorithm identifies subgroups among participants. The control policies aim to minimize dominance disparities and subgroup formation, fostering balanced participation and group cohesion. A study with 300 middle-school students compared these policies to a baseline in which the robot did not address individuals directly. The results showed that the proposed policies reduced dominance gaps and subgroup formation, promoting more balanced interactions. These findings highlight the potential applicability of the approach across education, healthcare, and entertainment.
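The two mechanisms the abstract names — a dominance score combining speaking time and word count, and Louvain community detection over the participant interaction graph — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weighting parameter `alpha`, the normalization of the two signals, and the example edge weights (interaction frequency between participants) are all assumptions introduced here.

```python
import networkx as nx

def dominance(speaking_share, word_share, alpha=0.5):
    """Hypothetical dominance score: a weighted combination of a
    participant's normalized speaking-time share and word-count share.
    alpha is an assumed weighting parameter, not taken from the paper."""
    return alpha * speaking_share + (1 - alpha) * word_share

# Illustrative normalized shares for four participants.
times = {"A": 0.40, "B": 0.30, "C": 0.20, "D": 0.10}
words = {"A": 0.45, "B": 0.25, "C": 0.20, "D": 0.10}
scores = {p: dominance(times[p], words[p]) for p in times}

# The dominance gap a moderation policy might try to minimize.
gap = max(scores.values()) - min(scores.values())

# Subgroup detection: participants as nodes, edge weights as assumed
# interaction frequency (who talks after whom); Louvain partitions
# the graph into densely connected subgroups.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 5), ("A", "C", 1), ("B", "C", 1),
    ("C", "D", 4), ("B", "D", 1),
])
subgroups = nx.community.louvain_communities(G, seed=42)
```

A policy in this spirit could then, for example, direct the robot's next question to the participant with the lowest score, or to a member of a subgroup the current speaker does not belong to.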

DOI: 10.1075/is.25013.gra
2026-04-02
2026-04-20