The Effects of Interface Views on Performing Aerial Telemanipulation Tasks Using Small UAVs
This paper presents a human-robot interaction (HRI) study of a dedicated Mission Specialist interface for performing telemanipulation tasks using a small unoccupied aerial vehicle (UAV). Current literature suggests that successful completion of aerial manipulation tasks in real-world environments requires human input due to challenges in autonomous perception and control. Visual information about the remote environment in a telemanipulation interface can significantly affect performance under direct control; however, the effects of interface visualizations on task performance have not been studied for UAV telemanipulation. This work evaluated the effects of interface viewpoint on aerial manipulation task performance. The interfaces evaluated in this study presented video streams from cameras located onboard the UAV: (i) a manipulator egocentric view, (ii) a manipulator exocentric view, and (iii) a combination of egocentric and exocentric views. A total of 36 participants completed three different manipulation tasks using all three interface conditions. The observations and results showed that both the exocentric and mixed-view configurations improved task performance over an egocentric-only interface. Further, this study produced data on view use, view effectiveness, and task type that can inform the development of interfaces for aerial manipulators that change and adapt to the environment and task.
This is a post-peer-review, pre-copyedit version of an article published in the International Journal of Social Robotics. The final authenticated version is available online at DOI: 10.1007/s12369-021-00783-9. Posted with permission.