How can an adaptive intelligent interface decide what particular action to perform in a given situation, as a function of perceived properties of the user and the situation? Ide...
Abstract. The inclusion of additional modalities besides speech into the communicative behavior of virtual agents has moved into the focus of human-computer interface researchers, as h...
At the Human Computer Interaction Lab (HCILab) at UNC Charlotte, we investigate novel ways for people to interact with computers and, through computers, with their environments. Ou...
We present Dialog Moves Markup Language (DMML): an Extensible Markup Language (XML) representation of modality-independent communicative acts of automated conversational agents. I...
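As a rough illustration of the idea named in this abstract, the following Python sketch serializes a modality-independent communicative act as XML. The element and attribute names (communicative-act, type, addressee, content) are assumptions for illustration only; the actual DMML schema is not given in this excerpt.

# Minimal sketch of encoding a communicative act as XML, in the spirit of DMML.
# NOTE: all element/attribute names here are hypothetical, not taken from DMML.
import xml.etree.ElementTree as ET

def make_communicative_act(act_type: str, content: str, addressee: str) -> ET.Element:
    """Build a hypothetical <communicative-act> element for a conversational agent."""
    act = ET.Element("communicative-act", attrib={"type": act_type, "addressee": addressee})
    ET.SubElement(act, "content").text = content
    return act

if __name__ == "__main__":
    act = make_communicative_act("inform", "Your flight departs at 9:40 am.", "user")
    # The same abstract act could later be rendered as speech, text, or gesture
    # by a modality-specific generation component.
    print(ET.tostring(act, encoding="unicode"))

The point of such a representation, as the abstract suggests, is that the act itself carries no commitment to a particular output modality; rendering decisions are deferred to downstream components.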
We analyze a corpus of referring expressions collected from user interactions with a multimodal travel guide application. The analysis suggests that, in dramatic contrast to norma...