
The Importance of Context in Very Low Resource Language Modeling

Published 10 May 2022 in cs.CL (arXiv:2205.04810v1)

Abstract: This paper investigates very low resource language model pretraining, when fewer than 100 thousand sentences are available. We find that, in very low resource scenarios, statistical n-gram language models outperform state-of-the-art neural models. Our experiments show that this is mainly due to the former's focus on a local context. We therefore introduce three methods to improve a neural model's performance in the low-resource setting, and find that limiting the model's self-attention is the most effective one, improving performance on downstream tasks such as NLI and POS tagging by up to 5% for the languages we test on: English, Hindi, and Turkish.
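The key idea in the abstract, restricting self-attention to a local context window, can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a simple causal, window-limited attention mask applied to scaled dot-product attention, with the window size `window` as a hypothetical hyperparameter:

```python
import numpy as np

def local_attention_mask(seq_len, window):
    # Each position may attend only to itself and the `window`
    # preceding positions (causal + local context).
    idx = np.arange(seq_len)
    dist = idx[:, None] - idx[None, :]  # dist[i, j] = i - j
    return (dist >= 0) & (dist <= window)

def masked_attention(q, k, v, window):
    # Standard scaled dot-product attention with the local mask:
    # disallowed positions receive -inf before the softmax.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    mask = local_attention_mask(q.shape[0], window)
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

With `window` small, the model behaves more like an n-gram model in the sense that each prediction depends only on a few nearby tokens, which is the intuition the abstract gives for the n-gram models' advantage in very low resource settings.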

Citations (1)
