Sciweavers

34 search results - page 6 / 7
Related to: Interactive Museum Exhibit Using Pointing Gesture Recognitio...
HRI 2010 (ACM)
Recognizing engagement in human-robot interaction
Based on a study of the engagement process between humans, we have developed and implemented an initial computational model for recognizing engagement between a human and a huma...
Charles Rich, Brett Ponsleur, Aaron Holroyd, Canda...
NIPS 2007
EEG-Based Brain-Computer Interaction: Improved Accuracy by Automatic Single-Trial Error Detection
Brain-computer interfaces (BCIs), like any other interaction modality based on physiological signals and body channels (e.g., muscular activity, speech, and gestures), are prone to e...
Pierre W. Ferrez, José del R. Millán
ACIS-ICIS 2010 (IEEE)
One-Finger Interaction for Ubiquitous Environment
We propose a new interaction technique, named "One-finger Interaction," for ubiquitous home environments. One-finger Interaction is an interaction technique for doing...
Takashi Nakamura, Shin Takahashi, Jiro Tanaka
ICMI 2004 (Springer)
A multimodal learning interface for sketch, speak and point creation of a schedule chart
We present a video demonstration of an agent-based test bed application for ongoing research into multi-user, multimodal, computer-assisted meetings. The system tracks a two perso...
Edward C. Kaiser, David Demirdjian, Alexander Grue...
ICCV 2005 (IEEE)
Tracking Body Parts of Multiple People for Multi-person Multimodal Interface
Although large displays could allow several users to work together and to move freely in a room, their associated interfaces are limited to contact devices that must generally be s...
Sébastien Carbini, Jean-Emmanuel Viallet, O...