Sciweavers

10 search results - page 2 / 2
HRI
2011
ACM
Learning to interpret pointing gestures with a time-of-flight camera
Pointing gestures are a common and intuitive way to draw somebody’s attention to a certain object. While humans can easily interpret robot gestures, the perception of human beha...
David Droeschel, Jörg Stückler, Sven Beh...
ICCV
2005
IEEE
Tracking Body Parts of Multiple People for Multi-person Multimodal Interface
Although large displays could allow several users to work together and to move freely in a room, their associated interfaces are limited to contact devices that must generally be s...
Sébastien Carbini, Jean-Emmanuel Viallet, O...
AROBOTS
2002
Multi-Modal Interaction of Human and Home Robot in the Context of Room Map Generation
In robotics, the idea of human-robot interaction has recently received much attention. In this paper, we describe a multi-modal system for generating a map of the environment...
Saeed Shiry Ghidary, Yasushi Nakata, Hiroshi Saito...
ICMI
2004
Springer
A multimodal learning interface for sketch, speak and point creation of a schedule chart
We present a video demonstration of an agent-based test bed application for ongoing research into multi-user, multimodal, computer-assisted meetings. The system tracks a two-perso...
Edward C. Kaiser, David Demirdjian, Alexander Grue...
WACV
2008
IEEE
Distributed Visual Processing for a Home Visual Sensor Network
We address issues dealing with distributed visual... deliver objects, handle emergency, wherever he/she is inside the home. In addition, the burden of processing power can be distribu...
Kwangsu Kim, Gérard G. Medioni