Novel egocentric robot teleoperation interfaces for search and rescue
Date
2021
Authors
Seo, Stela
Abstract
Teleoperation is a powerful tool: a person can attend conferences and meetings overseas, visit family abroad, travel to uncharted locations, and explore dangerous environments. However, during real-time remote teleoperation, the operator constantly faces challenges, chiefly maintaining remote awareness and performing under high cognitive load. Search and rescue teleoperation exacerbates these difficulties.
To successfully teleoperate a remote robot and accomplish tasks, the operator must maintain a high level of situation awareness, understanding the remote robot’s current state and environment while keeping track of their mission tasks. However, the operator has only limited access to the remote environment through teleoperation interfaces, and those interfaces can deliver only limited data (e.g., a limited field of view and limited types of sensors). To make matters worse, in search and rescue teleoperation, the operator must make important decisions that may impact victims’ lives with this limited information.
Because remote teleoperation interfaces are the operator’s only gateway to the remote site for most of the time, how they deliver information affects the operator’s situation awareness. Research in human-computer interaction and human-robot interaction has found that the way information is presented impacts users’ overall task performance in terms of accuracy, completion time, and workload. We extend this theme to search and rescue teleoperation scenarios.
We explore novel interface designs that support the operator by retrieving remote information and presenting it in a way the operator can understand in time. Our designs help the operator increase their situation awareness and overall task performance. We further discuss the benefits and drawbacks of our implementations, contributing to the design of future teleoperation interfaces.
Keywords
Teleoperation, Human-Robot Interaction
Citation
Seo, S. H., Rea, D. J., Wiebe, J., & Young, J. E. (2017). Monocle: Interactive detail-in-context using two pan-and-tilt cameras to improve teleoperation effectiveness. In 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 962–967). Lisbon, Portugal: IEEE. https://doi.org/10.1109/ROMAN.2017.8172419
Seo, S. H., Young, J. E., & Irani, P. (2017). Where are the robots? In-feed embedded techniques for visualizing robot team member locations. In 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 522–527). Lisbon, Portugal: IEEE. https://doi.org/10.1109/ROMAN.2017.8172352
Seo, S. H., Young, J. E., & Irani, P. (2020). How are your robot friends doing? A design exploration of graphical techniques supporting awareness of robot team members in teleoperation. International Journal of Social Robotics. https://doi.org/10.1007/s12369-020-00670-9