Sciweavers

2697 search results for "Developing Gestural Input" - page 42 / 540
PDC 2006 (ACM)
A participatory design agenda for ubiquitous computing and multimodal interaction: a case study of dental practice
This paper reflects upon our attempts to bring a participatory design approach to design research into interfaces that better support dental practice. The project brought together...
Tim Cederman-Haysom, Margot Brereton
CHI 2006 (ACM)
Trackball text entry for people with motor impairments
We present a new gestural text entry method for trackballs. The method uses the mouse cursor and relies on crossing instead of pointing. A user writes in fluid Roman-like unistrok...
Jacob O. Wobbrock, Brad A. Myers
ICPR 2002 (IEEE)
Integrated Event Recognition from Multiple Sources
This paper proposes a system architecture for event recognition that integrates information from multiple sources (e.g., gesture and speech recognition from distributed sensors)...
Hiroaki Kawashima, Takashi Matsuyama
CHI 2009 (ACM)
Bezel swipe: conflict-free scrolling and multiple selection on mobile touch screen devices
Zooming user interfaces are increasingly popular on mobile devices with touch screens. Swiping and pinching finger gestures anywhere on the screen manipulate the displayed portion...
Volker Roth, Thea Turner
CHI 2010 (ACM)
Making muscle-computer interfaces more practical
Recent work in muscle sensing has demonstrated the potential of human-computer interfaces based on finger gestures sensed from electrodes on the upper forearm. While this approach...
T. Scott Saponas, Desney S. Tan, Dan Morris, Jim T...