Volume 21, Issue 3
ISSN 1572-0373 | E-ISSN 1572-0381

Abstract

‘Smart’ devices are becoming increasingly ubiquitous. While these sophisticated machines are useful for various purposes, they sometimes evoke feelings of eeriness or discomfort that constitute uncanniness, a much-discussed phenomenon in robotics research. Adult participants (N = 115) rated the uncanniness of a hypothetical future smart speaker that was described as possessing the mental capacities for experience, agency, neither, or both. The novel condition prompting participants to attribute both agency and experience to the speaker filled an important theoretical gap in the literature. Consistent with the mind perception hypothesis of uncanniness (MPH; Gray & Wegner, 2012), participants in the with-experience condition rated the device significantly higher in uncanniness than those in the control condition and the with-agency condition. Participants in the with-both (experience and agency) condition also rated the device higher in uncanniness than those in the control condition and the with-agency condition, although this latter difference only approached statistical significance.
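The design described above can be illustrated with a minimal simulation sketch. All data below are invented for illustration; the condition names follow the abstract, but the study's actual measures, sample sizes per cell, and analysis may differ.

```python
# Hypothetical sketch of a four-condition between-subjects comparison of
# uncanniness ratings. Condition means, sample sizes, and the rating scale
# are invented; only the condition structure follows the abstract.
import random
import statistics

random.seed(42)

def simulate(mean, n=30, sd=1.0):
    """Draw n simulated uncanniness ratings around a condition mean."""
    return [random.gauss(mean, sd) for _ in range(n)]

# Simulated ratings by condition (illustrative values only).
conditions = {
    "control":         simulate(3.0),
    "with_agency":     simulate(3.2),
    "with_experience": simulate(4.5),
    "with_both":       simulate(4.3),
}

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# The key contrast reported in the abstract: with-experience vs. control.
t_exp_vs_control = welch_t(conditions["with_experience"],
                           conditions["control"])
```

A pairwise contrast like `welch_t` mirrors the kind of comparison the abstract reports (with-experience vs. control, with-both vs. with-agency); an omnibus test across all four groups would typically precede such contrasts.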

doi: 10.1075/is.19015.tay
Published online: 2021-02-09; issue date: 2021-04-23

References

  1. Appel, M., Izydorczyk, D., Weber, S., Mara, M., & Lischetzke, T.
    (2020) The uncanny of mind in a machine: Humanoid robots as tools, agents, and experiencers. Computers in Human Behavior, 102, 274–286. https://doi.org/10.1016/j.chb.2019.07.031
  2. Apple
    (2018, January 23). HomePod arrives February 9, available to order this Friday [Press release]. Retrieved from https://www.apple.com/newsroom/2018/01/homepod-arrives-february-9-available-to-order-this-friday/
  3. Brink, K. A., Gray, K., & Wellman, H. M.
    (2019) Creepiness creeps in: Uncanny valley feelings are acquired in childhood. Child Development, 90, 1202–1214. https://doi.org/10.1111/cdev.12999
  4. Broadbent, E., Kumar, V., Li, X., Sollers, J., III, Stafford, R. Q., MacDonald, B. A., & Wegner, D. M.
    (2013) Robots with display screens: A robot with a more humanlike face display is perceived to have more mind and a better personality. PloS One, 8, e72589. https://doi.org/10.1371/journal.pone.0072589
  5. Broadbent, E., Kuo, I. H., Lee, Y. I., Rabindran, J., Kerse, N., Stafford, R., & MacDonald, B. A.
    (2010) Attitudes and reactions to a healthcare robot. Telemedicine and e-Health, 16, 608–613. https://doi.org/10.1089/tmj.2009.0171
  6. Ciechanowski, L., Przegalinska, A., Magnuski, M., & Gloor, P.
    (2019) In the shades of the uncanny valley: An experimental study of human–chatbot interaction. Future Generation Computer Systems, 92, 539–548. https://doi.org/10.1016/j.future.2018.01.055
  7. Crandall, C. S., & Sherman, J. W.
    (2016) On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93–99. https://doi.org/10.1016/j.jesp.2015.10.002
  8. Creed, C., & Beale, R.
    (2012) User interactions with an affective nutritional coach. Interacting with Computers, 24, 339–350. https://doi.org/10.1016/j.intcom.2012.05.004
  9. Creed, C., Beale, R., & Cowan, B.
    (2015) The impact of an embodied agent’s emotional expressions over multiple interactions. Interacting with Computers, 27, 172–188. https://doi.org/10.1093/iwc/iwt064
  10. Deng, E., Mutlu, B., & Mataric, M. J.
    (2019) Embodiment in socially interactive robots. Foundations and Trends in Robotics, 7, 251–356. https://doi.org/10.1561/2300000056
  11. Ferrey, A. E., Burleigh, T. J., & Fenske, M. J.
    (2015) Stimulus-category competition, inhibition, and affective devaluation: A novel account of the uncanny valley. Frontiers in Psychology, 6, 249. https://doi.org/10.3389/fpsyg.2015.00249
  12. Gray, H. M., Gray, K., & Wegner, D. M.
    (2007) Dimensions of mind perception. Science, 315, 619. https://doi.org/10.1126/science.1134475
  13. Gray, K., Jenkins, A. C., Heberlein, A. S., & Wegner, D. M.
    (2011) Distortions of mind perception in psychopathology. Proceedings of the National Academy of Sciences, 108, 477–479. https://doi.org/10.1073/pnas.1015493108
  14. Gray, K., & Wegner, D. M.
    (2012) Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125, 125–130. https://doi.org/10.1016/j.cognition.2012.06.007
  15. Gray, K., Young, L., & Waytz, A.
    (2012) Mind perception is the essence of morality. Psychological Inquiry, 23, 101–124. https://doi.org/10.1080/1047840X.2012.651387
  16. Ho, C. C., MacDorman, K. F., & Pramono, Z. D.
    (2008, March). Human emotion and the uncanny valley: A GLM, MDS, and Isomap analysis of robot video ratings. In Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 169–176). Amsterdam, the Netherlands. https://doi.org/10.1145/1349822.1349845
  17. Kätsyri, J., Förger, K., Mäkäräinen, M., & Takala, T.
    (2015) A review of empirical evidence on different uncanny valley hypotheses: Support for perceptual mismatch as one road to the valley of eeriness. Frontiers in Psychology, 6, 390. https://doi.org/10.3389/fpsyg.2015.00390
  18. Kawabe, T., Sasaki, K., Ihaya, K., & Yamada, Y.
    (2017) When categorization-based stranger avoidance explains the uncanny valley: A comment on MacDorman and Chattopadhyay (2016). Cognition, 161, 129–131. https://doi.org/10.1016/j.cognition.2016.09.001
  19. Knobe, J., & Prinz, J.
    (2008) Intuitions about consciousness: Experimental studies. Phenomenology and the Cognitive Sciences, 7, 67–83. https://doi.org/10.1007/s11097-007-9066-y
  20. Kupferberg, A., Glasauer, S., Huber, M., Rickert, M., Knoll, A., & Brandt, T.
    (2011) Biological movement increases acceptance of humanoid robots as human partners in motor interaction. AI & Society, 26, 339–345. https://doi.org/10.1007/s00146-010-0314-2
  21. Lau, J., Zimmerman, B., & Schaub, F.
    (2018) Alexa, are you listening? Privacy perceptions, concerns and privacy-seeking behaviors with smart speakers. Proceedings of the ACM on Human-Computer Interaction, 2, 102. https://doi.org/10.1145/3274371
  22. Liu, B., & Sundar, S. S.
    (2018) Should machines express sympathy and empathy? Experiments with a health advice chatbot. Cyberpsychology, Behavior, and Social Networking, 21, 625–636. https://doi.org/10.1089/cyber.2018.0110
  23. Lynch, J. G., Jr., Bradlow, E. T., Huber, J. C., & Lehmann, D. R.
    (2015) Reflections on the replication corner: In praise of conceptual replications. International Journal of Research in Marketing, 32, 333–342. https://doi.org/10.1016/j.ijresmar.2015.09.006
  24. MacDorman, K. F., & Chattopadhyay, D.
    (2016) Reducing consistency in human realism increases the uncanny valley effect; increasing category uncertainty does not. Cognition, 146, 190–205. https://doi.org/10.1016/j.cognition.2015.09.019
  25. MacDorman, K. F., & Entezari, S. O.
    (2015) Individual differences predict sensitivity to the uncanny valley. Interaction Studies, 16, 141–172. https://doi.org/10.1075/is.16.2.01mac
  26. MacDorman, K. F., Green, R. D., Ho, C. C., & Koch, C. T.
    (2009) Too real for comfort? Uncanny responses to computer generated faces. Computers in Human Behavior, 25, 695–710. https://doi.org/10.1016/j.chb.2008.12.026
  27. MacDorman, K. F., & Ishiguro, H.
    (2006) The uncanny advantage of using androids in cognitive and social science research. Interaction Studies, 7, 297–337. https://doi.org/10.1075/is.7.3.03mac
  28. MacDorman, K. F., Vasudevan, S. K., & Ho, C. C.
    (2009) Does Japan really have robot mania? Comparing attitudes by implicit and explicit measures. AI & Society, 23, 485–510. https://doi.org/10.1007/s00146-008-0181-2
  29. Mitchell, W. J., Szerszen, K. A., Sr., Lu, A. S., Schermerhorn, P. W., Scheutz, M., & MacDorman, K. F.
    (2011) A mismatch in the human realism of face and voice produces an uncanny valley. i-Perception, 2, 10–12. https://doi.org/10.1068/i0415
  30. Moore, R. K.
    (2012) A Bayesian explanation of the ‘Uncanny Valley’ effect and related psychological phenomena. Scientific Reports, 2, 864. https://doi.org/10.1038/srep00864
  31. Mori, M.
    (1970/2005) The uncanny valley (K. F. MacDorman & T. Minato, Trans.). Energy, 7, 33–35.
  32. Mori, M., MacDorman, K. F., & Kageki, N.
    (2012) The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19, 98–100. https://doi.org/10.1109/MRA.2012.2192811
  33. Myers, C. M., Furqan, A., & Zhu, J.
    (2019, May). The impact of user characteristics and preferences on performance with an unfamiliar voice user interface. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1–9). Glasgow, Scotland. https://doi.org/10.1145/3290605.3300277
  34. NPR & Edison Research
    (2018) The smart audio report, winter 2018. Retrieved from https://www.nationalpublicmedia.com/smart-audio-report/latest-report/
  35. Pollick, F. E.
    (2010) In search of the uncanny valley. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, 40, 69–78. https://doi.org/10.1007/978-3-642-12630-7_8
  36. R Core Team
    (2020) R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. https://www.R-project.org/
  37. Ramey, C. H.
    (2006) An inventory of reported characteristics for home computers, robots, and human beings: Applications for android science and the uncanny valley. In K. F. MacDorman & H. Ishiguro (Eds.), Proceedings of the ICCS/CogSci 2006 Long Symposium: ‘Toward Social Mechanisms of Android Science’ (pp. 21–25). Vancouver, Canada.
  38. Rosenthal-von der Pütten, A., & Weiss, A.
    (2015) The uncanny valley phenomenon: Does it affect all of us? Interaction Studies, 16, 206–214. https://doi.org/10.1075/is.16.2.07ros
  39. Schein, C., & Gray, K.
    (2015) The eyes are the window to the uncanny valley: Mind perception, autism and missing souls. Interaction Studies, 16, 173–179. https://doi.org/10.1075/is.16.2.02sch
  40. Seyama, J. I., & Nagayama, R. S.
    (2007) The uncanny valley: Effect of realism on the impression of artificial human faces. Presence: Teleoperators and Virtual Environments, 16, 337–351. https://doi.org/10.1162/pres.16.4.337
  41. Stafford, R. Q., Broadbent, E., Jayawardena, C., Unger, U., Kuo, I. H., Igic, A., Wong, R., Kerse, N., Watson, C., & MacDonald, B. A.
    (2010, September). Improved robot attitudes and emotions at a retirement home after meeting a robot. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication (pp. 82–87). Viareggio, Italy. https://doi.org/10.1109/ROMAN.2010.5598679
  42. Stafford, R. Q., MacDonald, B. A., Jayawardena, C., Wegner, D. M., & Broadbent, E.
    (2014) Does the robot have a mind? Mind perception and attitudes towards robots predict use of an eldercare robot. International Journal of Social Robotics, 6, 17–32. https://doi.org/10.1007/s12369-013-0186-y
  43. Stein, J. P., & Ohler, P.
    (2017) Venturing into the uncanny valley of mind – The influence of mind attribution on the acceptance of human-like characters in a virtual reality setting. Cognition, 160, 43–50. https://doi.org/10.1016/j.cognition.2016.12.010
  44. Tharp, M., Holtzman, N. S., & Eadeh, F. R.
    (2017) Mind perception and individual differences: A replication and extension. Basic and Applied Social Psychology, 39, 68–73. https://doi.org/10.1080/01973533.2016.1256287
  45. Waddell, K.
    (2017, April 21). Chatbots have entered the uncanny valley. The Atlantic. Retrieved from https://www.theatlantic.com/technology/archive/2017/04/uncanny-valley-digital-assistants/523806/
  46. Wang, S., Lilienfeld, S. O., & Rochat, P.
    (2015) The uncanny valley: Existence and explanations. Review of General Psychology, 19, 393–407. https://doi.org/10.1037/gpr0000056
  47. Wang, X., & Krumhuber, E. G.
    (2018) Mind perception of robots varies with their economic versus social function. Frontiers in Psychology, 9, 1230. https://doi.org/10.3389/fpsyg.2018.01230
  48. Waytz, A., Cacioppo, J., & Epley, N.
    (2010) Who sees human? The stability and importance of individual differences in anthropomorphism. Perspectives on Psychological Science, 5, 219–232. https://doi.org/10.1177/1745691610369336
  49. Wickham, H.
    (2016) ggplot2: Elegant graphics for data analysis. New York: Springer-Verlag. https://doi.org/10.1007/978-3-319-24277-4
  50. Wilson, M.
    (2018, March 9). Alexa’s creepy laughter is a bigger problem than Amazon admits. Fast Company. Retrieved from https://www.fastcompany.com/90163588/why-alexas-laughter-creeps-us-out