
Modeling Semantic Relatedness using Global Relation Vectors

Published 14 Nov 2017 in cs.CL (arXiv:1711.05294v1)

Abstract: Word embedding models such as GloVe rely on co-occurrence statistics from a large corpus to learn vector representations of word meaning. These vectors have proven to capture surprisingly fine-grained semantic and syntactic information. While we may similarly expect that co-occurrence statistics can be used to capture rich information about the relationships between different words, existing approaches for modeling such relationships have mostly relied on manipulating pre-trained word vectors. In this paper, we introduce a novel method which directly learns relation vectors from co-occurrence statistics. To this end, we first propose a variant of GloVe, in which there is an explicit connection between word vectors and PMI weighted co-occurrence vectors. We then show how relation vectors can be naturally embedded into the resulting vector space.
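The abstract's key ingredient is the PMI-weighted co-occurrence vector, which links raw corpus counts to GloVe-style word vectors. The sketch below illustrates how such vectors can be computed from co-occurrence counts. It is a minimal, illustrative construction: the toy corpus, the symmetric window size, and the convention of assigning PMI zero to unseen pairs are assumptions made here, not the paper's exact formulation, and the paper's GloVe variant and relation-vector learning are not reproduced.

```python
import math
from collections import Counter

# Toy corpus and window size are illustrative assumptions, not the paper's setup.
corpus = [
    "paris is the capital of france".split(),
    "berlin is the capital of germany".split(),
    "france and germany are countries".split(),
]
WINDOW = 2  # symmetric context window (assumed)

word_counts = Counter()
pair_counts = Counter()
for sent in corpus:
    word_counts.update(sent)
    for i, w in enumerate(sent):
        for j in range(max(0, i - WINDOW), min(len(sent), i + WINDOW + 1)):
            if i != j:
                pair_counts[(w, sent[j])] += 1

total_pairs = sum(pair_counts.values())
total_words = sum(word_counts.values())

def pmi(w, c):
    """Pointwise mutual information of word w with context word c:
    log( p(w,c) / (p(w) * p(c)) ), estimated from corpus counts."""
    joint = pair_counts.get((w, c), 0) / total_pairs
    if joint == 0.0:
        return 0.0  # assumed convention: undefined PMI mapped to 0
    p_w = word_counts[w] / total_words
    p_c = word_counts[c] / total_words
    return math.log(joint / (p_w * p_c))

vocab = sorted(word_counts)

def pmi_vector(w):
    """PMI-weighted co-occurrence vector of w over the vocabulary."""
    return [pmi(w, c) for c in vocab]

print(dict(zip(vocab, pmi_vector("paris"))))
```

In the paper's setting, relation vectors for word pairs are then learned from co-occurrence statistics and embedded into the same space as the word vectors; the sketch above stops at the word-level PMI vectors that ground that construction.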

Citations (4)

