Hardware and Software Systems for Control of Assistive Robotic Devices Using Point-of-Gaze Estimation
Abstract
Eye-gaze-based interaction has many useful applications in human-machine interfaces, assistive technologies, and multimodal systems. Traditional input methods, such as the keyboard and mouse, are not practical in many situations and can be ineffective for some users with physical impairments. Knowledge of the user's point of gaze (PoG) can be a powerful data modality in intelligent systems, facilitating intuitive control, perception of user intent, and enhanced interactive experiences.

This research aims to advance the use of non-traditional, multimodal interfaces in assistive robotic devices for the benefit of users with severe physical disabilities. The data modalities of particular interest in this work are perception of the environment using 3D scanning and computer vision, estimation of the user's point of gaze using video oculography, and perception of user intent during interaction with objects and locations of interest. A novel, mobile headset design is presented that provides both monocular and binocular pupil tracking together with a 3D reconstruction of the user's field of view. Computational methods for obtaining a true 3D gaze vector and final PoG are also discussed, along with a demonstration of practical use.
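One common way to obtain a 3D PoG from binocular gaze vectors is to triangulate the two gaze rays: since the rays rarely intersect exactly, the PoG can be taken as the midpoint of the shortest segment joining them. The sketch below illustrates this standard closest-point-between-rays computation; it is not the method described in this work, and the function name and eye-origin/direction conventions are illustrative assumptions.

```python
def closest_point_between_rays(o_l, d_l, o_r, d_r):
    """Estimate a 3D point of gaze as the midpoint of the shortest
    segment joining the left and right gaze rays (illustrative sketch).

    o_l, o_r -- ray origins (e.g., estimated eye centers), 3-tuples
    d_l, d_r -- gaze direction vectors, 3-tuples (need not be unit length)
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w0 = tuple(a - b for a, b in zip(o_l, o_r))       # offset between origins
    a, b, c = dot(d_l, d_l), dot(d_l, d_r), dot(d_r, d_r)
    d, e = dot(d_l, w0), dot(d_r, w0)
    denom = a * c - b * b                             # ~0 when rays are near-parallel
    if abs(denom) < 1e-12:
        t_l = 0.0                                     # degenerate case: anchor at left origin
        t_r = e / c if c else 0.0                     # closest point on right ray to o_l
    else:
        t_l = (b * e - c * d) / denom
        t_r = (a * e - b * d) / denom
    p_l = tuple(o + t_l * dv for o, dv in zip(o_l, d_l))  # closest point on left ray
    p_r = tuple(o + t_r * dv for o, dv in zip(o_r, d_r))  # closest point on right ray
    return tuple(0.5 * (x + y) for x, y in zip(p_l, p_r))
```

For example, with eye origins 6 cm apart and both gaze rays aimed at a target 1 m ahead, the returned midpoint recovers that target. In practice, gaze-direction noise makes the depth estimate from ray vergence far less stable than the lateral estimate, which is one motivation for combining gaze rays with a 3D reconstruction of the scene.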