Sciweavers

45 search results, page 3 of 9
Search: Using 3D Touch Interaction for a Multimodal Zoomable User In...
ICMI 2003, Springer (Biometrics, 143 views)
Mutual disambiguation of 3D multimodal interaction in augmented and virtual reality
We describe an approach to 3D multimodal interaction in immersive augmented and virtual reality environments that accounts for the uncertain nature of the information sources. The...
Edward C. Kaiser, Alex Olwal, David McGee, Hrvoje ...
ICMI 2004, Springer (Biometrics, 189 views)
A multimodal learning interface for sketch, speak and point creation of a schedule chart
We present a video demonstration of an agent-based test bed application for ongoing research into multi-user, multimodal, computer-assisted meetings. The system tracks a two perso...
Edward C. Kaiser, David Demirdjian, Alexander Grue...
CHI 2007, ACM
Shallow-depth 3D interaction: design and evaluation of one-, two- and three-touch techniques
On traditional tables, people frequently use the third dimension to pile, sort and store objects. However, while effective and informative for organization, this use of the third ...
Mark S. Hancock, M. Sheelagh T. Carpendale, Andy C...
ICMI 2009, Springer (Biometrics, 304 views)
Building multimodal applications with EMMA
Multimodal interfaces combining natural modalities such as speech and touch with dynamic graphical user interfaces can make it easier and more effective for users to interact wit...
Michael Johnston
TMM 2011 (177 views)
MIMiC: Multimodal Interactive Motion Controller
We introduce a new algorithm for real-time interactive motion control and demonstrate its application to motion-captured data, pre-recorded videos and HCI. Firstly, a da...
Dumebi Okwechime, Eng-Jon Ong, Richard Bowden