Sciweavers

Search: Where to Look Next (1219 results, page 3 of 244)
CHI 2010 (ACM)
Knowing where and when to look in a time-critical multimodal dual task
Human-computer systems intended for time-critical multitasking need to be designed with an understanding of how humans can coordinate and interleave perceptual, memory, and motor ...
Anthony J. Hornof, Yunfeng Zhang, Tim Halverson
IUI 2004 (ACM)
Where to look: a study of human-robot engagement
This paper reports on a study of human subjects with a robot designed to mimic human conversational gaze behavior in collaborative conversation. The robot and the human subject to...
Candace L. Sidner, Cory D. Kidd, Christopher Lee, ...
UIST 2010 (ACM)
Cosaliency: where people look when comparing images
Image triage is a common task in digital photography. Determining which photos are worth processing for sharing with friends and family and which should be deleted to make room fo...
David E. Jacobs, Dan B. Goldman, Eli Shechtman
KDD 2009 (ACM)
WhereNext: a location predictor on trajectory pattern mining
The pervasiveness of mobile devices and location-based services is leading to an increasing volume of mobility data. This side effect provides the opportunity for innovative meth...
Anna Monreale, Fabio Pinelli, Roberto Trasarti, Fo...
OHS 2000 (Springer)
Standardizing Hypertext: Where Next for OHP?
Over the last six years the Open Hypermedia Systems Working Group (OHSWG) has been working in a coordinated effort to produce a protocol which will allow components of an...
David E. Millard, Hugh C. Davis, Luc Moreau