Volume 24, Issue 3
ISSN 1572-0373 · E-ISSN 1572-0381

Abstract

A review of the literature on acceptance and trust in human-robot interaction (HRI) reveals a number of open questions that need to be addressed in order to establish effective collaborations between humans and robots in real-world applications. In particular, we identified four principal open areas that should be investigated to create guidelines for the successful deployment of robots in the wild. These areas focus on: (1) the robot's abilities and limitations, in particular when it makes errors with different severities of consequences; (2) individual differences; (3) the dynamics of human-robot trust; and (4) the interaction between humans and robots over time. In this paper, we present two very similar studies, one with a virtual robot with human-like abilities and one with a physical Care-O-bot 4 robot. In the first study, we created an immersive narrative using an interactive storyboard to collect the responses of 154 participants. In the second study, 6 participants had repeated interactions with a physical robot over three weeks. We summarise and discuss the findings of our investigations of the effects of robots' errors on people's trust in robots, with a view to designing mechanisms that allow robots to recover from a breach of trust. In particular, we observed that robots' errors had a greater impact on people's trust in the robot when the errors were made at the beginning of the interaction and had severe consequences.

Our results also provided insights into how the effects of these errors vary according to individuals' personalities, expectations and previous experiences.

DOI: 10.1075/is.21025.ros
2024-02-15
2024-12-04
