Sciweavers

146 search results - page 13 / 30
» A Gesture Based Interface for Human-Robot Interaction
ISMAR 2006, IEEE
"Move the couch where?" : developing an augmented reality multimodal interface
This paper describes an augmented reality (AR) multimodal interface that uses speech and paddle gestures for interaction. The application allows users to intuitively arrange virtu...
Sylvia Irawati, Scott Green, Mark Billinghurst, An...
EGITALY 2011
Gestural Interaction for Robot Motion Control
Recent advances in gesture recognition have made controlling a humanoid robot in the most natural way possible an interesting challenge. Learning from Demonstration fie...
Giuseppe Broccia, Marco Livesu, Riccardo Scateni
CHI 2004, ACM
A suggestive interface for image guided 3D sketching
We present an image guided pen-based suggestive interface for sketching 3D wireframe models. Rather than starting from a blank canvas, existing 2D images of similar objects serve ...
Steve Tsang, Ravin Balakrishnan, Karan Singh, Abhi...
CHI 2009, ACM
Double-side multi-touch input for mobile devices
We present a new mobile interaction model, called double-side multi-touch, based on a mobile device that receives simultaneous multi-touch input from both the front and the back o...
Erh-li Early Shen, Sung-sheng Daniel Tsai, Hao-hua...
ICCV 2005, IEEE
Tracking Body Parts of Multiple People for Multi-person Multimodal Interface
Although large displays could allow several users to work together and to move freely in a room, their associated interfaces are limited to contact devices that must generally be s...
Sébastien Carbini, Jean-Emmanuel Viallet, O...