
Relative Importance in Sentence Processing

Published 7 Jun 2021 in cs.CL and cs.AI (arXiv:2106.03471v1)

Abstract: Determining the relative importance of the elements in a sentence is a key factor for effortless natural language understanding. For human language processing, we can approximate patterns of relative importance by measuring reading fixations using eye-tracking technology. In neural language models, gradient-based saliency methods indicate the relative importance of a token for the target objective. In this work, we compare patterns of relative importance in English language processing by humans and models and analyze the underlying linguistic patterns. We find that human processing patterns in English correlate strongly with saliency-based importance in language models and not with attention-based importance. Our results indicate that saliency could be a cognitively more plausible metric for interpreting neural language models. The code is available on GitHub: https://github.com/beinborn/relative_importance
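To illustrate the gradient-based saliency idea the abstract refers to, here is a minimal toy sketch. It uses a hypothetical linear bag-of-embeddings scorer rather than a real language model, and computes gradient-times-input saliency per token (the norm of each token's gradient contribution, normalized over the sentence). All names, embeddings, and the model itself are illustrative stand-ins, not the paper's actual implementation (which is in the linked repository).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for a trained model's token embeddings and output head.
vocab = ["the", "cat", "sat", "on", "mat"]
emb = {w: rng.normal(size=4) for w in vocab}
w_out = rng.normal(size=4)  # output-head weights

def saliency(tokens):
    """Gradient-x-input saliency for a linear bag-of-embeddings scorer.

    score = w_out . mean_i(emb[t_i]); the gradient of the score with
    respect to each token embedding is w_out / n, so gradient-x-input
    reduces to |w_out . emb[t_i]| / n per token.
    """
    n = len(tokens)
    raw = np.array([abs(w_out @ emb[t]) / n for t in tokens])
    return raw / raw.sum()  # normalize per sentence

tokens = ["the", "cat", "sat"]
scores = saliency(tokens)
print(dict(zip(tokens, scores.round(3))))
```

The per-sentence normalization makes the saliency scores comparable to relative fixation durations from eye-tracking, which is what enables the human-vs-model correlation analysis described above.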

Citations (26)
