Volume 16, Issue 1
  • ISSN: 1572-0373
  • E-ISSN: 1572-0381

Abstract

This article investigates the challenge of developing a robot capable of determining whether a social situation demands trust. Solving this challenge may allow a robot to react when a person over- or under-trusts the system. Prior work in this area has focused on understanding the factors that influence a person's trust of a robot (Hancock et al., 2011). In contrast, by using game-theoretic representations to frame the problem, we are able to develop a set of conditions for determining whether an interactive situation demands trust. In two separate experiments, human subjects were asked to evaluate either written narratives or mazes in terms of whether or not they require trust. The results indicate correlations of ϕ1 = +0.592 and ϕ2 = +0.406, respectively, between the subjects' evaluations and the condition's predictions. These are strong correlations for a study involving human subjects.
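The ϕ (phi) coefficient reported above measures association between two binary variables, here each subject's yes/no judgment and the condition's yes/no prediction, via a 2×2 contingency table. A minimal sketch of the computation, using hypothetical counts that are not the study's actual data:

```python
import math

def phi_coefficient(n11, n10, n01, n00):
    """Phi coefficient for a 2x2 contingency table of binary outcomes.

    n11: subject and condition both say "trust required"
    n10: subject says yes, condition predicts no
    n01: subject says no, condition predicts yes
    n00: both say "trust not required"
    """
    numerator = n11 * n00 - n10 * n01
    denominator = math.sqrt(
        (n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00)
    )
    return numerator / denominator if denominator else 0.0

# Illustrative counts only (hypothetical, not from the experiments)
print(phi_coefficient(40, 10, 12, 38))
```

A ϕ of +1 indicates the subjects' evaluations and the condition's predictions always agree, 0 indicates no association, and −1 indicates they always disagree; for binary variables ϕ is equivalent to the Pearson correlation.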

/content/journals/10.1075/is.16.1.05wag
2015-01-01
2024-10-10
  • Article Type: Research Article
Keyword(s): autonomous system; game theory; human-robot interaction; social robot; trust