Sciweavers

550 search results for "Gaze-augmented manual interaction" (page 46 of 110)
ICMI 2009, Springer
Multi-modal features for real-time detection of human-robot interaction categories
Social interactions unfold over time, at multiple time scales, and can be observed through multiple sensory modalities. In this paper, we propose a machine learning framework for ...
Ian R. Fasel, Masahiro Shiomi, Philippe-Emmanuel Ch...
MIDDLEWARE 2007, Springer
Interactive Resource-Intensive Applications Made Easy
Snowbird is a middleware system based on virtual machine (VM) technology that simplifies the development and deployment of bimodal applications. Such applications alternate betwee...
H. Andrés Lagar-Cavilla, Niraj Tolia, Eyal ...
ATAL 2005, Springer
Thespian: using multi-agent fitting to craft interactive drama
There has been a growing interest in designing multi-agent based interactive dramas. A key research challenge faced in the design of these systems is to support open-ended user in...
Mei Si, Stacy Marsella, David V. Pynadath
HRI 2010, ACM
Multimodal interaction with an autonomous forklift
Abstract—We describe a multimodal framework for interacting with an autonomous robotic forklift. A key element enabling effective interaction is a wireless, handheld tablet with ...
Andrew Correa, Matthew R. Walter, Luke Fletcher, J...
TABLETOP 2008, IEEE
Pokey: Interaction through covert structured light
In this paper we describe a method to support interaction with a cellphone-based projector-camera system. We describe a novel approach that uses a technique known in Computer Visio...
Christopher Richard Wren, Yuri A. Ivanov, Paul A. ...