
You really need to know what your bot(s) are thinking about you

The projected ubiquity of personal Companion robots raises a range of interesting but also challenging questions. There can be little doubt that an effective artificial Companion, whether embodied or not, will need both to be sensitive to the emotional state of its human partner and to respond sensitively. It will, in other words, need artificial theory of mind: such an artificial Companion would need to behave as if it has feelings and as if it understands how its human partner is feeling. This chapter explores the implementation and implications of artificial theory of mind, and raises concerns over the asymmetry between an artificial Companion's theory of mind for its human partner and the human's theory of mind for his or her artificial Companion. The essay argues that social learning (imitation) is an additional requirement of artificial Companion robots, then develops the idea that an artificial Companion robot will not be one robot but several. A surprising consequence of these ideas is that a family of artificial Companion robots could acquire an artificial culture of its own, and the essay concludes by speculating on what this might mean for humans interacting with their artificial Companion robots.
