Sciweavers

146 search results (page 28 of 30) for "A Gesture Based Interface for Human-Robot Interaction"
UIST 2010, ACM
A framework for robust and flexible handling of inputs with uncertainty
New input technologies (such as touch), recognition-based input (such as pen gestures), and next-generation interactions (such as inexact interaction) all hold the promise of more ...
Julia Schwarz, Scott E. Hudson, Jennifer Mankoff, ...
CHI 2010, ACM
The design and evaluation of multitouch marking menus
Despite the considerable quantity of research directed towards multitouch technologies, a set of standardized UI components has not been developed. Menu systems provide a particu...
G. Julian Lepinski, Tovi Grossman, George W. Fitzm...
EUPROJECTS 2006, Springer
Human Computer Confluence
Pervasive computing aims to integrate technology invisibly into everyday objects in such a way that these objects turn into smart things. Not only a single object of thi...
Alois Ferscha, Stefan Resmerita, Clemens Holzmann
IJCAI 1989
Bidirectional Use of Knowledge in the Multi-modal NL Access System XTRA
The acceptability and effectiveness of an expert system are critically dependent on its user interface. Natural language could be a well-suited communicative medium; however, curre...
Jürgen Allgayer, Roman M. Jansen-Winkeln, Car...
CHI 2008, ACM
AAMU: adaptive activation area menus for improving selection in cascading pull-down menus
Selecting items in cascading pull-down menus is a frequent task in most GUIs. These selections involve two major components, steering and selection, with the steering component be...
Erum Tanvir, Jonathan Cullen, Pourang Irani, Andy ...