
Multimodal interaction with an autonomous forklift

Abstract—We describe a multimodal framework for interacting with an autonomous robotic forklift. A key element enabling effective interaction is a wireless, handheld tablet with which a human supervisor can command the forklift using speech and sketch. Most current sketch interfaces treat the canvas as a blank slate. In contrast, our interface uses live and synthesized camera images from the forklift as a canvas, and augments them with object and obstacle information from the world. This connection lets users “draw on the world,” permitting a simpler set of sketched gestures. Our interface supports commands that include summoning the forklift and directing it to lift, transport, and place loads of palletized cargo. We describe an exploratory evaluation of the system designed to identify areas for detailed study. Our framework also incorporates external signaling to interact with humans near the vehicle. The robot uses audible and visual annunciation to convey its current state and...
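The "draw on the world" idea rests on relating a sketched pixel on the camera image to a location in the world. As an illustration only (this is not the authors' implementation, and all values are hypothetical), a minimal sketch of back-projecting a pixel onto a flat ground plane under a pinhole camera model might look like:

```python
import numpy as np

def pixel_to_ground(u, v, K, R, t):
    """Project an image pixel (u, v) onto the z = 0 ground plane.

    K: 3x3 camera intrinsic matrix.
    R: 3x3 camera-to-world rotation.
    t: camera position in the world frame.
    Returns the (x, y) world point the sketched pixel refers to.
    """
    # Back-project the pixel into a viewing ray in camera coordinates.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Rotate the ray into the world frame.
    ray_world = R @ ray_cam
    # Intersect the ray, starting at the camera position, with z = 0.
    s = -t[2] / ray_world[2]
    return (t + s * ray_world)[:2]

# Hypothetical setup: camera 2 m above the ground, looking straight down
# (camera x -> world x, camera y -> world -y, camera z -> world -z).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.diag([1.0, -1.0, -1.0])
t = np.array([0.0, 0.0, 2.0])

# A stroke through the principal point maps to the point under the camera.
print(pixel_to_ground(320, 240, K, R, t))  # -> [0. 0.]
```

A real system would intersect the ray with sensed terrain or detected objects rather than an idealized flat plane, but the geometric principle is the same.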
Type Conference
Year 2010
Where HRI
Authors Andrew Correa, Matthew R. Walter, Luke Fletcher, Jim Glass, Seth J. Teller, Randall Davis