UIST 2009 (ACM)

Activity analysis enabling real-time video communication on mobile phones for deaf users

We describe our system, MobileASL, for real-time video communication on the current U.S. mobile phone network. The goal of MobileASL is to enable Deaf people to communicate in sign language over mobile phones by compressing and transmitting sign language video in real time on an off-the-shelf mobile phone, which has a weak processor, limited bandwidth, and little battery capacity. We develop several H.264-compliant algorithms that save system resources while maintaining ASL intelligibility by focusing on the important segments of the video. We employ a dynamic skin-based region-of-interest (ROI) that encodes the skin at higher quality at the expense of the rest of the video. We also automatically recognize periods of signing versus not signing and raise or lower the frame rate accordingly, a technique we call variable frame rate (VFR). We show that our variable frame rate technique results in a 47% gain in battery life on the phone, corresponding to an extra 68 minutes ...
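
As a rough illustration of the variable frame rate idea described in the abstract, the sketch below lowers the target frame rate when consecutive frames show little motion and raises it when signing-like activity resumes. This is not the MobileASL implementation: the frame rates, the threshold, and the simple frame-difference activity measure are illustrative assumptions only, standing in for the paper's activity-recognition step inside an H.264-compliant encoder.

# Illustrative sketch of variable frame rate (VFR) control, not the authors' code.
# Assumptions: SIGNING_FPS, IDLE_FPS, ACTIVITY_THRESHOLD, and the frame-difference
# activity measure are placeholders chosen for this example.

import numpy as np

SIGNING_FPS = 10          # assumed frame rate while the user appears to be signing
IDLE_FPS = 1              # assumed frame rate during "not signing" periods
ACTIVITY_THRESHOLD = 12.0 # assumed mean absolute pixel-difference threshold


def activity_score(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Mean absolute luminance difference between consecutive frames."""
    diff = curr_frame.astype(np.int16) - prev_frame.astype(np.int16)
    return float(np.mean(np.abs(diff)))


def select_frame_rate(prev_frame: np.ndarray, curr_frame: np.ndarray) -> int:
    """Pick the encoder's target frame rate based on detected activity."""
    if activity_score(prev_frame, curr_frame) > ACTIVITY_THRESHOLD:
        return SIGNING_FPS  # likely signing: keep temporal resolution high
    return IDLE_FPS         # likely idle/watching: save cycles and battery


if __name__ == "__main__":
    # Two synthetic grayscale frames: identical (idle) vs. very different (active).
    h, w = 144, 176  # QCIF, a typical low-resolution mobile video size
    still = np.zeros((h, w), dtype=np.uint8)
    moving = np.random.randint(0, 255, size=(h, w), dtype=np.uint8)

    print("idle pair   ->", select_frame_rate(still, still), "fps")
    print("active pair ->", select_frame_rate(still, moving), "fps")

In the paper's system the activity decision and the skin-based ROI weighting are applied inside the video encoder on the phone itself; the battery savings reported above come from encoding and transmitting far fewer frames during non-signing periods.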
Type: Conference
Year: 2009
Where: UIST
Authors: Neva Cherniavsky, Jaehong Chon, Jacob O. Wobbrock, Richard E. Ladner, Eve A. Riskin