Natural and Intuitive Multimodal Dialogue for In-Car Applications: The SAMMIE System

We present SAMMIE, a laboratory demonstrator of an in-car multimodal dialogue system developed in the TALK project in cooperation between DFKI, USAAR, BOSCH, and BMW. It showcases natural, intuitive mixed-initiative interaction, with particular emphasis on multimodal turn-planning and natural language generation. SAMMIE currently gives the driver speech-centered multimodal access to an MP3-player application, including search and browsing as well as composition and modification of playlists. Our approach to dialogue modeling is based on collaborative problem solving integrated with an extended Information State Update paradigm. A formal usability evaluation of a first baseline version of SAMMIE by naive users in a simulated environment yielded positive results, and the improved final version will be integrated into a BMW research car.
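The abstract's reference to dialogue modeling via collaborative problem solving and an extended Information State Update paradigm can be pictured, very roughly, as a state structure that is revised by each dialogue move. The sketch below is a hypothetical Python illustration of a generic ISU step in an MP3-search setting; the class, field, and move names are assumptions for illustration and do not reflect the SAMMIE implementation.

```python
# Hypothetical sketch of an Information State Update (ISU) step for an
# MP3-player dialogue domain. All names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class InformationState:
    # Shared beliefs established so far (e.g. constraints on the song search).
    common_ground: dict = field(default_factory=dict)
    # Pending tasks the system has committed to (collaborative problem solving).
    agenda: list = field(default_factory=list)

def update(state: InformationState, move: dict) -> InformationState:
    """Apply one dialogue move to the information state and return it."""
    if move["type"] == "request_search":
        state.common_ground.update(move["constraints"])
        state.agenda.append(("search_songs", dict(state.common_ground)))
    elif move["type"] == "refine":
        state.common_ground.update(move["constraints"])
        state.agenda.append(("narrow_results", dict(state.common_ground)))
    return state

# Example: "Play something by Madonna", then "only the live albums".
state = InformationState()
state = update(state, {"type": "request_search", "constraints": {"artist": "Madonna"}})
state = update(state, {"type": "refine", "constraints": {"album_type": "live"}})
print(state.agenda[-1])  # ('narrow_results', {'artist': 'Madonna', 'album_type': 'live'})
```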
Type Conference
Year 2006
Where ECAI
Publisher Springer
Authors Tilman Becker, Nate Blaylock, Ciprian Gerstenberger, Ivana Kruijff-Korbayová, Andreas Korthauer, Manfred Pinkal, Michael Pitz, Peter Poller, Jan Schehl