Meta-Learning a Dynamical Language Model
Published 28 Mar 2018 in cs.CL (arXiv:1803.10631v1)
Abstract: We consider the task of word-level language modeling and study the possibility of combining hidden-states-based short-term representations with medium-term representations encoded in the dynamical weights of a language model. Our work extends recent experiments on language models with dynamically evolving weights by casting the language modeling problem into an online learning-to-learn framework in which a meta-learner is trained by gradient descent to continuously update the weights of a language model.
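The abstract compresses the paper's mechanism: a language model reads a text stream while a separate meta-learner, itself trained by gradient descent, turns the language model's gradients into continuous weight updates, so the evolving weights store medium-term context that the hidden states alone do not. The sketch below is a minimal illustration of that online learning-to-learn loop, not the paper's method: it assumes PyTorch's `torch.func.functional_call`, and the model sizes, the coordinate-wise `MetaLearner`, the chunked stream, and the one-chunk-ahead meta-objective are all hypothetical choices made for the example.

```python
import torch
import torch.nn as nn
from torch.func import functional_call

# Illustrative sizes only; the paper does not fix these here.
VOCAB, EMB, HID = 100, 32, 64

class LanguageModel(nn.Module):
    """Word-level RNN LM: hidden states carry the short-term representation."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.rnn = nn.RNN(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, tokens):
        h, _ = self.rnn(self.embed(tokens))
        return self.out(h)

class MetaLearner(nn.Module):
    """Coordinate-wise net mapping each weight's gradient to an additive update
    (a stand-in; the paper's meta-learner architecture is not given in the abstract)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))

    def forward(self, grad):
        return self.net(grad.reshape(-1, 1)).reshape(grad.shape)

lm, meta = LanguageModel(), MetaLearner()
meta_opt = torch.optim.Adam(meta.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

stream = torch.randint(0, VOCAB, (1, 80))           # toy token stream, read online
params = {k: v.detach().clone().requires_grad_(True)
          for k, v in lm.named_parameters()}        # dynamically evolving LM weights

for t in range(0, stream.size(1) - 21, 10):         # slide over the stream in chunks
    x, y = stream[:, t:t + 10], stream[:, t + 1:t + 11]
    logits = functional_call(lm, params, (x,))      # run the LM with current weights
    loss = loss_fn(logits.reshape(-1, VOCAB), y.reshape(-1))
    grads = torch.autograd.grad(loss, list(params.values()))

    # The meta-learner turns gradients into weight updates: the updated weights
    # act as a medium-term memory of the text seen so far.
    params = {k: p + meta(g) for (k, p), g in zip(params.items(), grads)}

    # Train the meta-learner online: the updated weights should predict the
    # next chunk of the stream (the learning-to-learn objective).
    x2, y2 = stream[:, t + 10:t + 20], stream[:, t + 11:t + 21]
    logits2 = functional_call(lm, params, (x2,))
    meta_loss = loss_fn(logits2.reshape(-1, VOCAB), y2.reshape(-1))
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()

    # Truncate the unroll so the graph does not span the whole stream.
    params = {k: p.detach().requires_grad_(True) for k, p in params.items()}
```

The detach at the end of each step is one possible truncation of the unrolled graph, analogous to truncated backpropagation through time; the paper's actual training protocol may handle the unroll differently.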