
Polyglot Contextual Representations Improve Crosslingual Transfer

Published 26 Feb 2019 in cs.CL (arXiv:1902.09697v2)

Abstract: We introduce Rosita, a method to produce multilingual contextual word representations by training a single language model on text from multiple languages. Our method combines the advantages of contextual word representations with those of multilingual representation learning. We produce language models from dissimilar language pairs (English/Arabic and English/Chinese) and use them in dependency parsing, semantic role labeling, and named entity recognition, with comparisons to monolingual and non-contextual variants. Our results provide further evidence for the benefits of polyglot learning, in which representations are shared across multiple languages.
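The core idea in the abstract, training one model on a mixed-language stream so that parameters and the vocabulary are shared across languages, can be sketched as follows. This is an illustrative toy, not the paper's implementation: the corpus, function names, and the character-bigram "model" are stand-ins for the large bilingual corpora and contextual language model the paper actually trains.

```python
from collections import Counter, defaultdict

# Toy English/Chinese corpus, standing in for the paper's large
# mixed-language training text (illustrative data, not from the paper).
CORPUS = {
    "en": ["the cat sat", "a dog ran"],
    "zh": ["猫坐下了", "狗跑了"],
}

def polyglot_stream(corpora):
    """Concatenate sentences from every language into one training
    stream, so a single model sees all languages."""
    stream = []
    for sentences in corpora.values():
        stream.extend(sentences)
    return stream

def shared_char_vocab(stream):
    """One vocabulary built over the mixed stream: the representation
    space is shared across languages rather than per-language."""
    return sorted({ch for sent in stream for ch in sent})

def train_char_bigram_counts(stream):
    """A minimal stand-in for a language model: character-bigram counts
    gathered from the mixed-language text by a single set of parameters."""
    counts = defaultdict(Counter)
    for sent in stream:
        for a, b in zip(sent, sent[1:]):
            counts[a][b] += 1
    return counts

stream = polyglot_stream(CORPUS)
vocab = shared_char_vocab(stream)      # contains both Latin and CJK characters
counts = train_char_bigram_counts(stream)
```

The point of the sketch is only the data flow: both languages pass through one vocabulary and one statistics table, which is the "polyglot" sharing the abstract refers to; the paper itself does this with contextual (ELMo-style) representations rather than bigram counts.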
