Sciweavers

2697 search results - page 134 / 540
» Developing Gestural Input
IJVR 2007
Affective Multimodal Control of Virtual
In this paper we report on the use of computer-generated affect to control the body and mind of cognitively modeled virtual characters. We use the computational model of affect A...
Martin Klesen, Patrick Gebhard
AROBOTS 2002
Multi-Modal Interaction of Human and Home Robot in the Context of Room Map Generation
In robotics, human-robot interaction has recently received considerable attention. In this paper, we describe a multi-modal system for generating a map of the environment...
Saeed Shiry Ghidary, Yasushi Nakata, Hiroshi Saito...
IVC 2011
A graphical model based solution to the facial feature point tracking problem
In this paper, a facial feature point tracker motivated by applications such as human-computer interfaces and facial expression analysis systems is proposed. The proposed t...
Serhan Cosar, Müjdat Çetin
TMM 2011
MIMiC: Multimodal Interactive Motion Controller
We introduce a new algorithm for real-time interactive motion control and demonstrate its application to motion-captured data, pre-recorded videos, and HCI. Firstly, a da...
Dumebi Okwechime, Eng-Jon Ong, Richard Bowden
AUIC 2006, IEEE
TIDL: mixed presence groupware support for legacy and custom applications
In this paper, we present a framework for using an arbitrary number of mouse and keyboard input devices to control Swing-based Java applications. These devices can be distributed am...
Peter Hutterer, Benjamin Close, Bruce H. Thomas