Information is traditionally confined to paper or digitally to a screen. In this paper, we introduce WUW, a wearable gestural interface, which attempts to bring information out in...
It is difficult to track, parse and model human-computer interactions during editing and revising of documents, but it is necessary if we are to develop automated technol...
This paper describes the evaluation of a gesture-based mechanism for issuing the back and forward commands in web navigation. Results show that subjects were able to navigate sign...
Although the 2D desktop metaphor has been the dominant paradigm of user interfaces for over two decades, 3D models of interaction are becoming more feasible due to advances in c...
This paper proposes a system architecture for event recognition that integrates information from multiple sources (e.g., gesture and speech recognition from distributed sensors in...
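As a rough, purely illustrative sketch of what such multi-source integration can look like (not the architecture the paper above describes), the Python snippet below fuses timestamped hypotheses from a hypothetical gesture recognizer and speech recognizer whenever they agree on an event label within a short time window; all class names, the window parameter, and the fusion rule are assumptions introduced here for illustration.

    from dataclasses import dataclass

    # Illustrative only: the field names, the fixed 0.5 s window, and the
    # "same label from different sources" rule are assumptions, not the
    # architecture proposed in the paper.

    @dataclass
    class Hypothesis:
        source: str       # e.g. "gesture" or "speech"
        label: str        # recognized event label, e.g. "select"
        timestamp: float  # seconds since start of session
        confidence: float

    def fuse(hypotheses, window=0.5):
        """Group hypotheses from different sources that agree on the same
        label within `window` seconds, and emit one combined event per group."""
        pool = sorted(hypotheses, key=lambda h: h.timestamp)
        used = set()
        events = []
        for i, h in enumerate(pool):
            if i in used:
                continue
            group = [h]
            for j in range(i + 1, len(pool)):
                other = pool[j]
                if other.timestamp - h.timestamp > window:
                    break  # pool is time-sorted, so nothing later can match
                if j not in used and other.source != h.source and other.label == h.label:
                    group.append(other)
                    used.add(j)
            # Combined confidence: simple average over the agreeing sources.
            events.append({
                "label": h.label,
                "time": h.timestamp,
                "sources": [g.source for g in group],
                "confidence": sum(g.confidence for g in group) / len(group),
            })
        return events

    if __name__ == "__main__":
        observed = [
            Hypothesis("gesture", "select", 1.20, 0.8),
            Hypothesis("speech",  "select", 1.35, 0.9),
            Hypothesis("gesture", "zoom",   3.10, 0.7),
        ]
        for event in fuse(observed):
            print(event)

Running the sketch fuses the gesture and speech "select" hypotheses into a single event (they fall within the assumed 0.5 s window) while the lone "zoom" gesture is passed through unchanged.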