You really need to know what your bot(s) are thinking about you


The projected ubiquity of personal Companion robots raises a range of interesting but also challenging questions. There can be little doubt that an effective artificial Companion, whether embodied or not, will need to be both sensitive to the emotional state of its human partner and able to respond sensitively. It will, in other words, need an artificial theory of mind – such an artificial Companion would need to behave as if it has feelings and as if it understands how its human partner is feeling. This chapter explores the implementation and implications of artificial theory of mind, and raises concerns over the asymmetry between an artificial Companion's theory of mind for its human partner and the human's theory of mind for his or her artificial Companion. The essay argues that social learning (imitation) is an additional requirement of artificial Companion robots, then goes on to develop the idea that an artificial Companion robot will not be one robot but several. A surprising consequence of these ideas is that a family of artificial Companion robots could acquire an artificial culture of its own, and the essay concludes by speculating on what this might mean for humans interacting with their artificial Companion robots.
