Sciweavers

69 search results for "Hand Gesture Recognition for Human-Machine Interaction", page 12 of 14
FGR 2004, IEEE (Biometrics)
Bayesian Fusion of Hidden Markov Models for Understanding Bimanual Movements
Understanding hand and body gestures is part of a wide spectrum of current research in computer vision and Human-Computer Interaction. Part of this can be the recognition of m...
Atid Shamaie, Alistair Sutherland
ISER 2004, Springer (Robotics)
Interactive Multi-Modal Robot Programming
As robots enter the human environment and come in contact with inexperienced users, they need to be able to interact with users in a multi-modal fashion—keyboard and mouse are n...
Soshi Iba, Christiaan J. J. Paredis, Pradeep K. Kh...
ICCV 2005, IEEE
Tracking Body Parts of Multiple People for Multi-person Multimodal Interface
Although large displays could allow several users to work together and to move freely in a room, their associated interfaces are limited to contact devices that must generally be s...
Sébastien Carbini, Jean-Emmanuel Viallet, O...
AROBOTS 2002
Multi-Modal Interaction of Human and Home Robot in the Context of Room Map Generation
In robotics, the idea of human and robot interaction is receiving a lot of attention lately. In this paper, we describe a multi-modal system for generating a map of the environment...
Saeed Shiry Ghidary, Yasushi Nakata, Hiroshi Saito...
IUI 2012, ACM
Airwriting: demonstrating mobile text input by 3D-space handwriting
We demonstrate our airwriting interface for mobile hands-free text entry. The interface enables a user to input text into a computer by writing in the air like on an imaginary blac...
Christoph Amma, Tanja Schultz