Trust in humans and robots: Economically similar but emotionally different |
Affiliation: | 1. Economic Science Institute, Chapman University, One University Drive, Orange, CA 92866, USA; 2. Argyros School of Business and Economics, Chapman University, One University Drive, Orange, CA 92866, USA; 3. Department of Psychology, University of Montreal, Pavillon Marie-Victorin, 90 Vincent d'Indy Ave., Montreal, QC H3C 3J7, Canada |
Abstract: | Trust-based interactions with robots are increasingly common in the marketplace, workplace, on the road, and in the home. However, a valid concern is that people may not trust robots as they do humans. While trust in fellow humans has been studied extensively, little is known about how people extend trust to robots. Here we compare trust-based investments and self-reported emotions across three nearly identical economic games: human-human trust games, human-robot trust games, and human-robot trust games in which the robot's decision impacts another human. Robots in our experiment mimic humans: they are programmed to make reciprocity decisions based on previously observed behaviors by humans in analogous situations. We find that people invest similarly in humans and robots. By contrast, the social emotions elicited by the interactions (i.e., gratitude, anger, pride, guilt), but not the non-social emotions, differed across human and robot trust games. Emotional reactions depended on the type of trust game interaction and on how another person was affected. |
Keywords: | Trust; Robots; Bots; Emotion; Experiment; C72; C90; D63; D64; L5; 2360; 4140 |
This article is indexed in ScienceDirect and other databases.