Syntactically Informed Text Compression with Recurrent Neural Networks
Published 8 Aug 2016 in cs.LG, cs.CL, cs.IT, and math.IT (arXiv:1608.02893v2)
Abstract: We present a self-contained system for constructing natural language models for use in text compression. Our system improves upon previous neural network based models by utilizing recent advances in syntactic parsing -- Google's SyntaxNet -- to augment character-level recurrent neural networks. RNNs have proven exceptional at modeling sequence data such as text, as their architecture allows for modeling of long-term contextual information.
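To make the idea concrete, here is a minimal sketch (not the authors' code) of a character-level RNN language model augmented with a syntactic signal: each character's embedding is concatenated with an embedding of a parser-derived tag for the enclosing token before entering the RNN, and the resulting next-character distribution is what a compressor would feed to an entropy coder. PyTorch, the tag inventory, the embedding sizes, and concatenation as the fusion mechanism are all illustrative assumptions, not details taken from the paper.

```python
# Sketch: character-level RNN LM with parser-tag features (assumptions noted above).
import torch
import torch.nn as nn

class SyntaxAugmentedCharRNN(nn.Module):
    def __init__(self, n_chars, n_tags, char_dim=64, tag_dim=16, hidden=256):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.tag_emb = nn.Embedding(n_tags, tag_dim)  # e.g. POS tags from a parser
        self.rnn = nn.GRU(char_dim + tag_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_chars)

    def forward(self, chars, tags, state=None):
        # chars, tags: (batch, seq_len) integer ids; each character carries
        # the tag of the word it belongs to, broadcast from the parse.
        x = torch.cat([self.char_emb(chars), self.tag_emb(tags)], dim=-1)
        h, state = self.rnn(x, state)
        return self.out(h), state  # next-character logits plus RNN state

# The model's p(c_{t+1} | c_1..c_t, tags) would drive an entropy coder
# (e.g. arithmetic coding): sharper predictions mean fewer output bits.
model = SyntaxAugmentedCharRNN(n_chars=256, n_tags=50)
chars = torch.randint(0, 256, (1, 10))
tags = torch.randint(0, 50, (1, 10))
logits, _ = model(chars, tags)
probs = torch.softmax(logits[:, -1], dim=-1)  # distribution for the coder
```

The design choice worth noting is that the syntactic information enters as an extra per-step input rather than a separate model: the RNN can then condition its character predictions on coarse grammatical context without any change to the decoding side beyond reproducing the same tags.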