
CHI
2009
ACM

Gaze-controlled driving

We investigate whether the gaze (point of regard) can control a remote vehicle driving on a racing track. Five different input devices (on-screen buttons, mouse pointing, a low-cost webcam eye tracker, and two commercial eye tracking systems) provide heading and speed control on the scene view transmitted from the moving robot. Gaze control was found to be similar to mouse control. This suggests that robots and wheelchairs may be controlled "hands-free" through gaze. Low-precision gaze tracking and image transmission delays had a noticeable effect on performance.

Keywords: Gaze, control, input, robot, mobile, wheelchair

ACM Classification Keywords: H.5.2 User Interfaces: Input devices and strategies.
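To illustrate the kind of control mapping the abstract describes, here is a minimal sketch that turns a gaze point on the transmitted scene view into heading and speed commands. This is an assumption-laden illustration, not the authors' actual interface: the function name, the centre-based normalisation, and the dead-zone parameter are all invented for the example.

```python
def gaze_to_drive(gaze_x, gaze_y, width, height, dead_zone=0.1):
    """Map a gaze point on the scene view to (heading, speed) commands.

    Illustrative sketch only (not the paper's implementation):
    horizontal offset from the screen centre steers the vehicle, and
    vertical position sets forward speed (top = full speed, centre = stop).
    Both outputs are normalised to [-1.0, 1.0]; a small dead zone around
    the centre suppresses jitter from low-precision gaze tracking.
    """
    # Normalise coordinates to [-1, 1] with (0, 0) at the screen centre.
    nx = (gaze_x / width) * 2.0 - 1.0
    ny = 1.0 - (gaze_y / height) * 2.0  # +1 at top edge, -1 at bottom

    heading = 0.0 if abs(nx) < dead_zone else nx
    speed = 0.0 if abs(ny) < dead_zone else ny
    return heading, speed
```

A dead zone is one simple way to cope with the low tracker precision the abstract mentions: small fixation jitter near the centre then produces no steering or speed change.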
Added 19 May 2010
Updated 19 May 2010
Type Conference
Year 2009
Where CHI
Authors Martin Tall, Alexandre Alapetite, Javier San Agustin, Henrik H. T. Skovsgaard, John Paulin Hansen, Dan Witzner Hansen, Emilie Møllenbach