Author(s): Han U. Yoon, Ranxiao F. Wang, Seth A. Hutchinson, Pilwon Hur
This paper presents an approach to developing an assistive interface for human–robot interaction that provides users with customized haptic and visual feedback. The interface aims to improve users’ task performance by customizing the assistance policy and level based on each user’s performance and control strategy. To achieve this, each user’s control strategy was modeled using an inverse optimal control technique. Then, features describing the geometric and behavioral characteristics of the user’s control were derived. Finally, the expert whose features most closely matched each user was identified and assigned to that user to define and provide customized assistance via virtual fixturing. In human subject experiments, the control strategies of twenty-three users were identified and featurized; their performance was measured under four assistance types (no assistance, haptic assistance, visual zooming assistance, and haptic assistance + visual zooming assistance) × two parameterization types (customized, non-customized). By analyzing the experimental data, we found the combination of assistance type × parameterization type that yielded the greatest performance improvement. The results showed that users’ task completion time and mean required effort improved most when haptic assistance (with no visual zooming assistance) with customized parameterization was provided for mobile robot driving tasks.
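The expert-matching step described above could be sketched as a nearest-neighbor search in feature space. The following is a minimal illustrative sketch, assuming Euclidean distance over numeric feature vectors; the feature names and metric are hypothetical, not the paper's exact method.

```python
import math

def nearest_expert(user_features, expert_features):
    """Return the key of the expert whose feature vector is closest
    (by Euclidean distance) to the user's feature vector.

    Hypothetical sketch: the distance metric and feature encoding
    are illustrative assumptions, not the paper's exact formulation.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(expert_features, key=lambda k: dist(user_features, expert_features[k]))

# Example: features might encode, e.g., path curvature and control effort.
experts = {
    "expert_A": [0.2, 1.5],
    "expert_B": [0.8, 0.9],
}
user = [0.7, 1.0]
print(nearest_expert(user, experts))  # → expert_B
```

The matched expert's parameters would then parameterize the virtual fixture (e.g., its stiffness or guidance path) for that user.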