  • Title: Eye-Tracking in Physical Human-Robot Interaction: Mental Workload and Performance Prediction.
    Author: Upasani S, Srinivasan D, Zhu Q, Du J, Leonessa A.
    Journal: Hum Factors; 2024 Aug; 66(8):2104-2119. PubMed ID: 37793896.
    Abstract:
    BACKGROUND: In Physical Human-Robot Interaction (pHRI), the need to learn the robot's motor-control dynamics is associated with increased cognitive load. Eye-tracking metrics can help characterize how mental workload fluctuates over the course of learning.
    OBJECTIVE: The aim of this study was to test the sensitivity and reliability of eye-tracking measures to variations in task difficulty, as well as their ability to predict performance, in physical human-robot collaboration tasks involving an industrial robot for object comanipulation.
    METHODS: Participants (9M, 9F) learned to co-perform a virtual pick-and-place task with a bimanual robot over multiple trials. The robot's joint stiffness was manipulated to increase motor-coordination demands. The psychometric properties of the eye-tracking measures and their ability to predict performance were investigated.
    RESULTS: Stationary Gaze Entropy and pupil diameter were the most reliable and sensitive measures of the workload associated with changes in task difficulty and learning. Increased task difficulty was more likely to result in a robot-monitoring strategy. Eye-tracking measures predicted the occurrence of success or failure in each trial with 70% sensitivity and 71% accuracy.
    CONCLUSION: The sensitivity and reliability of the eye-tracking measures were acceptable, although values were lower than those observed in cognitive domains. Measures of gaze behaviors indicative of visual monitoring strategies were most sensitive to task-difficulty manipulations and should be explored further for the pHRI domain, where motor control and internal-model formation are likely strong contributors to workload.
    APPLICATION: Future collaborative robots can adapt to the human's cognitive state and skill level as measured by eye-tracking metrics of workload and visual attention.
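    Note: The abstract names Stationary Gaze Entropy (SGE) as one of the most sensitive workload measures but does not spell out its computation. The sketch below shows one common way to estimate SGE: assign fixations to spatial bins and take the Shannon entropy of the resulting stationary distribution of gaze. The grid-based binning, screen resolution, and the synthetic fixation data are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def stationary_gaze_entropy(fixations, n_bins=8, screen=(1920, 1080)):
        """Estimate stationary gaze entropy (SGE) in bits.

        fixations : (N, 2) array of fixation centroids in pixels (x, y).
        n_bins    : bins per screen axis (illustrative choice, not from the paper).
        screen    : display resolution used to normalize coordinates.

        SGE is the Shannon entropy of the stationary distribution of gaze over
        spatial bins: higher values mean gaze is spread evenly across the scene,
        lower values mean gaze is concentrated (e.g., monitoring one region).
        """
        fixations = np.asarray(fixations, dtype=float)
        # Map each fixation to a cell on an n_bins x n_bins grid.
        x_idx = np.clip((fixations[:, 0] / screen[0] * n_bins).astype(int), 0, n_bins - 1)
        y_idx = np.clip((fixations[:, 1] / screen[1] * n_bins).astype(int), 0, n_bins - 1)
        bins = x_idx * n_bins + y_idx

        # Empirical stationary distribution over visited cells.
        counts = np.bincount(bins, minlength=n_bins * n_bins)
        p = counts[counts > 0] / counts.sum()

        # Shannon entropy: H = -sum_i p_i * log2(p_i).
        return float(-(p * np.log2(p)).sum())

    # Example: 200 synthetic fixations drawn uniformly over the screen.
    rng = np.random.default_rng(0)
    fake_fixations = rng.uniform([0, 0], [1920, 1080], size=(200, 2))
    print(f"SGE ~ {stationary_gaze_entropy(fake_fixations):.2f} bits")

    For the reported prediction result, sensitivity and accuracy have their standard classifier meanings over per-trial success/failure labels: sensitivity = TP / (TP + FN) and accuracy = (TP + TN) / all trials.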