Sciweavers

146 search results - page 19 / 30
» A Gesture Based Interface for Human-Robot Interaction
AR
2011
A Direct Physical Interface for Navigation and Positioning of a Robotic Nursing Assistant
People often use direct physical contact to guide a person to a desired location (e.g., leading a child by the hand) or to adjust a person’s posture for a task (e.g., a dance in...
Tiffany L. Chen, Charles C. Kemp
JIPS
2008
Comparative Study on the Educational Use of Home Robots for Children
: Human-Robot Interaction (HRI), based on already well-researched Human-Computer Interaction (HCI), has been under vigorous scrutiny since recent developments in robot technology. ...
Jeonghye Han, Miheon Jo, Vicki Jones, Jun H. Jo
SI3D
1999
ACM
UniCam - 2D gestural camera controls for 3D environments
We present a novel approach to controlling a virtual 3D camera with a 2D mouse or stylus input device that is based on gestural interaction. Our approach to 3D camera manipulation...
Robert C. Zeleznik, Andrew S. Forsberg
IUI
2010
ACM
Usage patterns and latent semantic analyses for task goal inference of multimodal user interactions
This paper describes our work in usage pattern analysis and the development of a latent semantic analysis framework for interpreting multimodal user input consisting of speech and pen ge...
Pui-Yu Hui, Wai Kit Lo, Helen M. Meng
CHI
2009
ACM
A biologically inspired approach to learning multimodal commands and feedback for human-robot interaction
In this paper we describe a method to enable a robot to learn how a user gives commands and feedback to it by speech, prosody and touch. We propose a biologically inspired approac...
Anja Austermann, Seiji Yamada