ADCS 2015

Integrating and Evaluating Neural Word Embeddings in Information Retrieval

Recent advances in neural language models have contributed new methods for learning distributed vector representations of words (also called word embeddings). Two such methods are the continuous bag-of-words model and the skip-gram model. These methods have been shown to produce embeddings that capture higher-order relationships between words and that are highly effective in natural language processing tasks involving word similarity and word analogy. Despite these promising results, there has been little analysis of the use of such word embeddings for retrieval. Motivated by these observations, in this paper we set out to determine how word embeddings can be used within a retrieval model and what benefit they might bring. To this end, we use neural word embeddings within the well-known translation language model for information retrieval. This language model captures implicit semantic relations between the words in queries and those in relevant documents, thus producing m...
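To make the idea concrete, here is a minimal sketch of a translation language model that uses word embeddings to estimate translation probabilities. The toy 3-dimensional vectors, the vocabulary, and the choice of normalised non-negative cosine similarity as the translation probability are all illustrative assumptions, not the paper's exact formulation; real embeddings would come from CBOW or skip-gram training.

```python
import math
from collections import Counter

# Hypothetical toy embeddings; real ones are learned by CBOW or skip-gram.
EMB = {
    "car":   [0.90, 0.10, 0.00],
    "auto":  [0.85, 0.15, 0.05],
    "bank":  [0.10, 0.90, 0.20],
    "money": [0.20, 0.80, 0.30],
}

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def translation_prob(w, u):
    # p_t(w|u): non-negative cosine similarity between embeddings,
    # normalised over the vocabulary so it behaves as a probability.
    # (One simple choice; the paper may define this differently.)
    sims = {v: max(cosine(EMB[w], EMB[v]), 0.0) for v in EMB}
    z = sum(sims.values())
    return sims[u] / z

def tlm_score(query, doc):
    # log P(q|d) under a translation language model:
    #   P(w|d) = sum over doc terms u of p_t(w|u) * p_ml(u|d),
    # where p_ml(u|d) is the maximum-likelihood term probability.
    counts = Counter(doc)
    total = len(doc)
    score = 0.0
    for w in query:
        p = sum(translation_prob(w, u) * counts[u] / total for u in counts)
        score += math.log(p)
    return score

# A document about "auto" matches the query "car" better than "bank",
# even though "car" never occurs in the document.
doc = ["auto"]
print(tlm_score(["car"], doc) > tlm_score(["bank"], doc))  # True
```

The point of the sketch is that the embedding-derived p_t(w|u) lets query terms match semantically related document terms, which is exactly the implicit semantic matching the translation language model is meant to capture.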
Added: 13 Apr 2016
Updated: 13 Apr 2016
Type: Journal
Year: 2015
Where: ADCS
Authors: Guido Zuccon, Bevan Koopman, Peter Bruza, Leif Azzopardi