Dependency Language Models for Transition-based Dependency Parsing
Abstract: In this paper, we present an approach to improving the accuracy of a strong transition-based dependency parser by exploiting dependency language models (LMs) that are extracted from a large parsed corpus. We integrate a small number of features based on the dependency LMs into the parser. To demonstrate the effectiveness of the proposed approach, we evaluate our parser on standard English and Chinese data, on which the base parser already achieves competitive accuracy. Our enhanced parser achieves state-of-the-art accuracy on the Chinese data and competitive results on the English data, with large absolute improvements of one point (UAS) on Chinese and 0.5 points on English.
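To make the idea of dependency-LM features concrete, here is a minimal, illustrative sketch (not the paper's implementation): a toy dependency LM that estimates P(dependent | head) from head–dependent pairs harvested from an auto-parsed corpus, then discretizes that probability into a coarse categorical feature a transition-based parser could consume. The class name, bucket thresholds, and feature values are all assumptions for illustration.

```python
from collections import Counter

class DependencyLM:
    """Toy dependency language model: estimates P(dependent | head)
    from (head, dependent) word pairs taken from a parsed corpus.
    Names and bucket thresholds are illustrative, not the paper's."""

    def __init__(self, arcs):
        self.pair_counts = Counter(arcs)               # (head, dep) -> count
        self.head_counts = Counter(h for h, _ in arcs) # head -> count

    def prob(self, head, dep):
        """Relative-frequency estimate of P(dep | head)."""
        if self.head_counts[head] == 0:
            return 0.0
        return self.pair_counts[(head, dep)] / self.head_counts[head]

    def bucket(self, head, dep):
        """Discretize the probability into a coarse feature value,
        so the parser can treat it as a categorical feature."""
        p = self.prob(head, dep)
        if p >= 0.1:
            return "HIGH"
        if p > 0.0:
            return "LOW"
        return "ZERO"

# Tiny example corpus of (head, dependent) arcs.
arcs = [("ate", "dog"), ("ate", "quickly"), ("ate", "dog")]
lm = DependencyLM(arcs)
print(lm.bucket("ate", "dog"))   # frequent arc -> HIGH
print(lm.bucket("ate", "cat"))   # unseen arc -> ZERO
```

In the paper's setting such bucketed LM scores would be added as a handful of extra features alongside the parser's standard feature templates; the bucketing above stands in for whatever discretization the parser actually uses.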