Word space models, in the sense of vector space models built on distributional data taken from texts, are used to model semantic relations between words. We argue that the high dim...
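The distributional construction this abstract refers to can be sketched as follows: each word is represented by counts of the words that co-occur with it inside a small window, so words that appear in similar contexts get similar vectors. The corpus and window size below are toy illustrations, not taken from the paper.

```python
from collections import Counter, defaultdict

def cooccurrence_vectors(corpus, window=2):
    """Build distributional vectors: represent each word by counts of
    the words appearing within `window` positions of it."""
    vectors = defaultdict(Counter)
    for sentence in corpus:
        for i, word in enumerate(sentence):
            lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    vectors[word][sentence[j]] += 1
    return vectors

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
vecs = cooccurrence_vectors(corpus)
# "cat" and "dog" share context words ("the", "sat"), so their
# distributional vectors overlap, modelling their semantic relatedness.
```

In a realistic word space model the counts would be reweighted (e.g. with PMI) and the resulting high-dimensional vectors often reduced, which is where the dimensionality concern raised in the abstract enters.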
Current vector-space models of lexical semantics create a single "prototype" vector to represent the meaning of a word. However, due to lexical ambiguity, encoding word ...
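The multi-prototype idea behind this abstract can be sketched as clustering a word's occurrence (context) vectors and keeping one prototype per cluster instead of a single averaged vector. The k-means routine and the 2-dimensional toy occurrences of an ambiguous word are illustrative assumptions, not the paper's actual features.

```python
import numpy as np

def multi_prototype(occurrence_vectors, k=2, iters=10):
    """Cluster a word's occurrence vectors into k groups and return one
    prototype (cluster mean) per group, plus the cluster assignments."""
    X = np.asarray(occurrence_vectors, dtype=float)
    centers = X[:k].copy()          # simple deterministic initialization
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # assign each occurrence to its nearest prototype
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        for c in range(k):
            if (labels == c).any():
                centers[c] = X[labels == c].mean(0)
    return centers, labels

# Toy occurrences of an ambiguous word like "bank":
# two groups of contexts, one per sense.
occ = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]]
protos, labels = multi_prototype(occ, k=2)
```

Each prototype then stands for one sense of the word, so lexical ambiguity no longer collapses into a single blended vector.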
Coecke, Sadrzadeh, and Clark [3] developed a compositional model of meaning for distributional semantics, in which each word in a sentence has a meaning vector and the distributio...
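A minimal sketch of the compositional mechanism this line describes: a relational word such as a transitive verb is modelled as a higher-order tensor (here a matrix), and a sentence meaning is obtained by contracting it with the noun vectors according to the grammar. The vectors and the verb matrix below are toy values chosen for illustration.

```python
import numpy as np

# Toy noun vectors and a toy verb matrix (rank-2 tensor).
dogs = np.array([1.0, 0.0])
cats = np.array([0.0, 1.0])
chase = np.array([[0.2, 0.9],   # chase[i, j] weights subject
                  [0.7, 0.1]])  # dimension i against object dimension j

def sentence_meaning(subj, verb, obj):
    """Meaning of 'subj verb obj' as the contraction subj^T . verb . obj."""
    return subj @ verb @ obj

s1 = sentence_meaning(dogs, chase, cats)  # "dogs chase cats"
s2 = sentence_meaning(cats, chase, dogs)  # "cats chase dogs"
# Word order matters: the two sentences receive different meanings,
# unlike in simple additive or multiplicative composition.
```

In the full model the contraction pattern is dictated by the pregroup grammatical type of each word, so the syntax of the sentence determines how the word tensors combine.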
Edward Grefenstette, Mehrnoosh Sadrzadeh, Stephen ...
We use Bell states to provide compositional distributed meaning for negative sentences of English. The lexical meaning of each word of the sentence is a context vector obtained wi...
We address the task of computing vector space representations for the meaning of word occurrences, which can vary widely according to context. This task is a crucial step towards ...
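One simple way to sketch occurrence-specific vectors of the kind this abstract targets: modulate a word's type vector by the average of its context-word vectors, so the same word gets different representations in different contexts. The lexicon values and the componentwise-product combination are illustrative assumptions, not the paper's method.

```python
import numpy as np

# Toy lexicon: dimension 0 loosely "finance", dimension 1 loosely "rivers".
lexicon = {
    "bank":  np.array([0.9, 0.8]),
    "money": np.array([1.0, 0.1]),
    "river": np.array([0.1, 1.0]),
}

def occurrence_vector(target, context):
    """Vector for one occurrence of `target`: its type vector modulated
    (componentwise) by the mean of the context-word vectors."""
    ctx = np.mean([lexicon[w] for w in context], axis=0)
    return lexicon[target] * ctx

fin = occurrence_vector("bank", ["money"])  # financial context
geo = occurrence_vector("bank", ["river"])  # geographic context
# The finance dimension dominates in the first occurrence,
# the river dimension in the second.
```

Such occurrence vectors can then feed a word sense disambiguation or similarity-in-context task, which is the downstream use the abstract points to.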