Sciweavers

50 search results - page 9 / 10
» Gaze and Speech in Attentive User Interfaces
ICEIS 2009 (IEEE)
An Automated Meeting Assistant: A Tangible Mixed Reality Interface for the AMIDA Automatic Content Linking Device
Abstract. We describe our approach to supporting ongoing meetings with an automated meeting assistant. The system, based on the AMIDA Content Linking Device, aims at providing relevant ...
Jochen Ehnes
ISWC 1998 (IEEE)
Speaking and Listening on the Run: Design for Wearable Audio Computing
The use of speech and auditory interaction on wearable computers can provide an awareness of events and personal messages, without requiring one's full attention or disruptin...
Nitin "Nick" Sawhney, Chris Schmandt
AUGHUMAN 2010
Aided eyes: eye activity sensing for daily life
Our eyes collect a considerable amount of information when we use them to look at objects. In particular, eye movement allows us to gaze at an object and reveals our level of intere...
Yoshio Ishiguro, Adiyan Mujibiya, Takashi Miyaki, ...
VR 2008 (IEEE)
Virtual Human + Tangible Interface = Mixed Reality Human: An Initial Exploration with a Virtual Breast Exam Patient
Virtual human (VH) experiences are receiving increased attention for training real-world interpersonal scenarios. Communication in interpersonal scenarios consists not only of spe...
Aaron Kotranza, Benjamin Lok
LREC 2010
Incorporating Speech Synthesis in the Development of a Mobile Platform for e-learning
This presentation and accompanying demonstration focus on the development of a mobile platform for e-learning purposes with enhanced text-to-speech capabilities. It reports on a...
Justus Roux, Pieter Scholtz, Daleen Klop, Claus Po...