Interaction Studies - Volume 16, Issue 2, 2015
- Individual differences predict sensitivity to the uncanny valley
Author(s): Karl F. MacDorman and Steven O. Entezari, pp. 141–172
It can be creepy to notice that something human-looking is not real. But can sensitivity to this phenomenon, known as the uncanny valley, be predicted from superficially unrelated traits? Based on results from at least 489 participants, this study examines the relation between nine theoretically motivated trait indices and uncanny valley sensitivity, operationalized as increased eerie ratings and decreased warmth ratings for androids presented in videos. Animal Reminder Sensitivity, Neuroticism, its Anxiety facet, and Religious Fundamentalism significantly predicted uncanny valley sensitivity. In addition, Concern over Mistakes and Personal Distress significantly predicted android eerie ratings but not warmth. The structural equation model indicated that Religious Fundamentalism operates indirectly, through robot-related attitudes, to heighten uncanny valley sensitivity, while Animal Reminder Sensitivity increases eerie ratings directly. These results suggest that the uncanny valley phenomenon may operate through both sociocultural constructions and biological adaptations for threat avoidance, such as the fear and disgust systems. Trait indices that predict uncanny valley sensitivity warrant investigation by experimental methods to explicate the processes underlying the uncanny valley phenomenon.
- The eyes are the window to the uncanny valley: Mind perception, autism and missing souls
Author(s): Chelsea Schein and Kurt Gray, pp. 173–179
Horror movies have discovered an easy recipe for making people creepy: alter their eyes. Instead of normal eyes, zombies’ eyes are vacantly white, vampires’ eyes glow with the color of blood, and those possessed by demons are cavernously black. In the Academy Award-winning Pan’s Labyrinth, director Guillermo del Toro created the creepiest of all creatures by entirely removing its eyes from its face, placing them instead in the palms of its hands. The unease induced by altering eyes may help to explain the uncanny valley, which is the eeriness of robots that are almost—but not quite—human (Mori, 1970). Much research has explored the uncanny valley, including the research reported by MacDorman & Entezari (in press), which focuses on individual differences that might predict the eeriness of humanlike robots. In their paper, they suggest that a full understanding of this phenomenon needs to synthesize individual differences with features of the robot. One theory that links these two concepts is mind perception, which past research highlights as essential to the uncanny valley (Gray & Wegner, 2012). Mind perception is linked both to individual differences—autism—and to features of the robot—the eyes—and can provide a deeper understanding of this arresting phenomenon. In this paper, we present original data that link uncanniness to the eyes through aberrant perceptions of mind.
- Autistic traits and sensitivity to human-like features of robot behavior
Author(s): Agnieszka Wykowska, Jasmin Kajopoulos, Karinne Ramirez-Amaro and Gordon Cheng, pp. 219–248
This study examined individual differences in sensitivity to human-like features of a robot’s behavior. The paradigm comprised a non-verbal Turing test with a humanoid robot. A “programmed” condition differed from a “human-controlled” condition in the onset times of the robot’s eye movements, which were either fixed across trials or modeled after prerecorded human reaction times, respectively. Participants judged whether the robot behavior was programmed or human-controlled, with no information regarding the differences between the respective conditions. Autistic traits were measured with the autism-spectrum quotient (AQ) questionnaire in healthy adults. We found that the fewer autistic traits participants had, the more sensitive they were to the difference between the conditions, without explicit awareness of the nature of the difference. We conclude that although sensitivity to fine behavioral characteristics of others varies with social aptitude, humans are in general capable of detecting human-like behavior based on very subtle cues.
- Inconsistency of personality evaluation caused by appearance gap in robotic telecommunication*
Author(s): Kaiko Kuwamura, Takashi Minato, Shuichi Nishio and Hiroshi Ishiguro, pp. 249–271
Compared with other communication media such as cellphones and video chat, teleoperated robots have a physical presence, which increases the feeling of copresence. However, the appearance of a teleoperated robot is always the same regardless of the characteristics of its operator. Since people can infer their partner’s personality from his/her appearance, a teleoperated robot’s appearance might suggest a personality that confuses the user. Our research focuses on establishing what kind of appearance a telecommunication medium should have to prevent such confusion and increase the feeling of copresence. In this study, we compare the appearance of three types of communication media (a nonhuman-like robot, a human-like robot, and video chat with a projection of the speaker). The results show that, in the case of the human-like robot, the consistency of personality judgments is better than in the case of the nonhuman-like robot. We also found that teleoperated robots transmit a more appropriate context-based atmosphere, while video chat transmits more nonverbal information, such as facial expressions.
- The effects of culture and context on perceptions of robotic facial expressions
Author(s): Casey C. Bennett and Selma Šabanović, pp. 272–302
We report two experimental studies of human perceptions of robotic facial expressions while systematically varying context effects and the cultural background of subjects (n = 93). Except for Fear, East Asian and Western subjects did not differ significantly in recognition rates, and, while Westerners were better at judging affect from mouth movement alone, East Asians were not any better at judging affect based on eye/brow movement alone. Moreover, context effects appeared capable of overriding such cultural differences, most notably for Fear. The results seem to run counter to previous theories of cultural differences in facial expression based on emoticons and eye fixation patterns. We connect this to broader research in cognitive science, suggesting the findings support a dynamical systems view of social cognition as an emergent phenomenon. The results here suggest that, if we can induce appropriate context effects, it may be possible to create culture-neutral models of robots and affective interaction.
- Follow your heart: Heart rate controlled music recommendation for low stress air travel*
Author(s): Hao Liu, Jun Hu and Matthias Rauterberg, pp. 303–339
Long-distance travel is an unusual activity for humans. The economy-class cabin environment (low air circulation, limited space, low humidity, etc.) during long-haul flights causes discomfort and even stress for many passengers. In-flight video and music systems are commonly available to improve the comfort level of the passengers. However, current in-flight music systems do not explore how the content can be used to reduce passengers’ stress. Most of these systems are designed and implemented assuming a homogeneous passenger group with similar tastes and desires. In this paper, we present a heart rate controlled in-flight music recommendation system for reducing stress during air travel. The system recommends personalized music playlists to the passengers and attempts to keep their heart rate in a normal range with these playlists. Experiments in a simulated long-haul flight cabin environment show that passengers’ stress can indeed be significantly reduced by listening to the recommended music playlists.