Syntactically Informed Text Compression with Recurrent Neural Networks

Published 8 Aug 2016 in cs.LG, cs.CL, cs.IT, and math.IT (arXiv:1608.02893v2)

Abstract: We present a self-contained system for constructing natural language models for use in text compression. Our system improves upon previous neural-network-based models by using recent advances in syntactic parsing, namely Google's SyntaxNet, to augment character-level recurrent neural networks. RNNs have proven exceptional at modeling sequence data such as text, as their architecture allows for modeling of long-term contextual information.
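
The abstract's core idea, a character-level RNN whose input is augmented with syntactic annotations and whose next-character distribution drives a compressor, can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the vanilla-RNN cell, the layer sizes, and the per-character POS-tag interface are all assumptions, and a real system would use a trained network whose probabilities feed an arithmetic coder.

```python
import numpy as np

# Minimal sketch of the idea in the abstract (not the paper's code): a
# character-level RNN whose input at each step is a one-hot character
# concatenated with a one-hot syntactic tag (e.g., a POS tag from a parser
# such as SyntaxNet). The softmax over the next character is the
# distribution a compressor would hand to an arithmetic coder.

VOCAB = 128    # assumed: ASCII character vocabulary
N_TAGS = 12    # assumed: number of coarse POS tags
HIDDEN = 64    # assumed: hidden-state size

rng = np.random.default_rng(0)
Wxh = rng.normal(0.0, 0.01, (HIDDEN, VOCAB + N_TAGS))  # input -> hidden
Whh = rng.normal(0.0, 0.01, (HIDDEN, HIDDEN))          # hidden -> hidden
Why = rng.normal(0.0, 0.01, (VOCAB, HIDDEN))           # hidden -> logits
bh, by = np.zeros(HIDDEN), np.zeros(VOCAB)

def step(h, char_id, tag_id):
    """One RNN step: new hidden state and P(next char | context, syntax)."""
    x = np.zeros(VOCAB + N_TAGS)
    x[char_id] = 1.0          # one-hot current character
    x[VOCAB + tag_id] = 1.0   # one-hot syntactic tag for this position
    h = np.tanh(Wxh @ x + Whh @ h + bh)
    logits = Why @ h + by
    p = np.exp(logits - logits.max())  # numerically stable softmax
    return h, p / p.sum()

# Usage: score a tagged string. The summed -log2 probabilities are the bits
# an ideal arithmetic coder would emit under this (untrained, so roughly
# uniform) model; training is what pushes the cost below ~7 bits/char.
text = "the cat sat on the mat"
tags = [0] * len(text)        # hypothetical: one tag id per character
h, bits = np.zeros(HIDDEN), 0.0
for i in range(1, len(text)):
    h, p = step(h, ord(text[i - 1]), tags[i - 1])
    bits += -np.log2(p[ord(text[i])])
print(f"{bits:.1f} bits for {len(text) - 1} characters")
```

Concatenating the tag vector at the input is one simple way to inject parse information; the point of the sketch is only that syntactic context enters the model alongside the character stream and sharpens the predictive distribution the coder relies on.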

Citations (16)
