
Automatic Text Extractive Summarization Based on Graph and Pre-trained Language Model Attention

Published 10 Oct 2021 in cs.CL | (2110.04878v2)

Abstract: Representing a text as a graph for automatic text summarization has been investigated for over ten years. With the development of attention mechanisms and Transformers in NLP, it is possible to connect the graph structure of a text with its attention structure. In this paper, the attention matrix between the sentences of the whole text, produced by a pre-trained language model, is adopted as the weighted adjacency matrix of a fully connected graph over the text. A graph convolutional network (GCN) is further applied to this text graph to classify each node and identify the salient sentences. Experimental results on two typical datasets demonstrate that the proposed model achieves competitive results in comparison with state-of-the-art models.
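The pipeline the abstract describes can be sketched in a few lines of NumPy: a (here synthetic) inter-sentence attention matrix plays the role of the weighted adjacency matrix, and a single GCN layer with symmetric normalization scores each sentence node. This is a minimal illustrative sketch under assumed shapes and random weights, not the authors' implementation; in the paper the attention matrix would come from a pre-trained language model and the GCN weights would be trained.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: ReLU(D^-1/2 (A+I) D^-1/2 H W), per Kipf-Welling-style propagation."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

rng = np.random.default_rng(0)

# Stand-in for the pre-trained model's sentence-to-sentence attention:
# a row-stochastic matrix over 4 sentences (rows sum to 1, as softmax would give).
logits = rng.normal(size=(4, 4))
A = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

H = rng.normal(size=(4, 8))   # stand-in sentence embeddings (4 sentences, dim 8)
W = rng.normal(size=(8, 2))   # untrained weights mapping to salient / non-salient scores

scores = gcn_layer(A, H, W)           # one score pair per sentence node
ranking = np.argsort(-scores[:, 1])   # sentences ranked by "salient" score
```

With trained weights, the top-ranked sentences under this scoring would form the extractive summary; stacking two or three such layers lets each sentence's score also reflect its neighbors' neighbors.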

Authors (2)
