Children’s trust towards erroneous robot informants
Abstract
As social robotics continues to grow and develop, robots are increasingly finding their way into more areas of society, including hospitals, homes, daycare centres, and schools. It is essential that these robots behave in ways appropriate for interacting with children, especially when they need to elicit trust. As part of this thesis, we conducted two experiments with a total of 115 participants, investigating preschool-aged children’s trust towards robots that make human-like informational errors (Experiment 1, Chapter 5) and robot-typical speech-recognition errors (Experiment 2, Chapter 6). Our findings suggest that children trust a robot that makes informational errors less than one that does not, but may trust a robot that exhibits speech-recognition errors more than one that does not. This suggests that children may perceive robot errors, and therefore trust robots, differently from other entities such as humans or puppets. We contribute the findings from these two experiments, as well as an initial framework of child-robot trust. This thesis provides a starting point for robot designers to consider trust when designing robots for children, and for researchers to further investigate young children’s trust towards robots.
Citation
Geiskkovitch, D. Y., & Young, J. E. (2020). Children’s overtrust: Intentional use of robot errors to decrease trust. In Proceedings of the 19th IEEE International Conference on Robot and Human Interactive Communication (SCRITA Workshop). IEEE, RO-MAN ’20. 2 pages.
Geiskkovitch, D. Y., & Young, J. E. (2020). Social robots don’t do that: Exploring robot-typical errors in child-robot interaction. In Companion Proceedings of the 15th ACM/IEEE International Conference on Human-Robot Interaction, 200-202. ACM/IEEE, HRI ’20.