Sciweavers

4 search results - page 1 / 1
LREC 2008
Talking and Looking: the SmartWeb Multimodal Interaction Corpus
Nowadays, portable devices such as smart phones can be used to capture the face of a user simultaneously with the voice input. Server-based or even embedded dialogue system might u...
Florian Schiel, Hannes Mögele
ICMI 2005 (Springer)
A look under the hood: design and development of the first SmartWeb system demonstrator
Experience shows that decisions in the early phases of the development of a multimodal system prevail throughout the life-cycle of a project. The distributed architecture and the ...
Norbert Reithinger, Simon Bergweiler, Ralf Engel, ...
LREC 2010
A Software Toolkit for Viewing Annotated Multimodal Data Interactively over the Web
This paper describes a software toolkit for the interactive display and analysis of automatically extracted or manually derived annotation features of visual and audio data. It ha...
Nick Campbell, Akiko Tabata
ICRA 2009 (IEEE)
Generating Robot/Agent backchannels during a storytelling experiment
This work presents the development of a real-time framework for the research of Multimodal Feedback of Robots/Talking Agents in the context of Human Robot Interaction (H...
Samer Al Moubayed, Malek Baklouti, Mohamed Chetoua...