
Relational Memory Augmented Language Models

Published 24 Jan 2022 in cs.CL and cs.AI | (2201.09680v1)

Abstract: We present a memory-augmented approach to condition an autoregressive language model on a knowledge graph. We represent the graph as a collection of relation triples and retrieve relations relevant to a given context to improve text generation. Experiments on the WikiText-103, WMT19, and enwik8 English datasets demonstrate that our approach produces a better language model in terms of perplexity and bits per character. We also show that relational memory improves coherence, is complementary to token-based memory, and enables causal interventions. Our model provides a simple yet effective way to combine an autoregressive language model with a knowledge graph for more coherent and logical generation.
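The retrieve-and-condition idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a toy triple store, simple surface-form entity matching for retrieval, and linearization of retrieved triples into a text prefix for the language model; all names and special tokens here are hypothetical.

```python
# Toy knowledge graph as (subject, relation, object) triples.
KG = [
    ("Barack Obama", "born_in", "Honolulu"),
    ("Honolulu", "located_in", "Hawaii"),
    ("Barack Obama", "profession", "politician"),
]

def retrieve_triples(context, kg):
    """Return triples whose subject or object surface form appears in the context."""
    ctx = context.lower()
    return [(s, r, o) for s, r, o in kg
            if s.lower() in ctx or o.lower() in ctx]

def format_memory(triples):
    """Linearize retrieved triples into a text prefix (hypothetical <rel> marker)."""
    return " ".join(f"<rel> {s} {r} {o}" for s, r, o in triples)

# Conditioning: prepend the relational memory to the running context
# before feeding it to an autoregressive language model.
context = "Barack Obama grew up in"
memory = format_memory(retrieve_triples(context, KG))
prompt = memory + " <ctx> " + context
```

In practice the paper's retrieval and conditioning are learned components rather than string matching and concatenation, but the data flow — context in, relevant triples out, generation conditioned on both — is the same.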

Citations (29)


Authors (3)
