Automatic Text Extractive Summarization Based on Graph and Pre-trained Language Model Attention
Abstract: Representing a text as a graph for automatic text summarization has been investigated for over ten years. With the development of attention mechanisms and Transformers in NLP, it has become possible to connect a text's graph structure with its attention structure. In this paper, the attention matrix between the sentences of the whole text, produced by a pre-trained language model, is adopted as the weighted adjacency matrix of a fully connected graph over the text's sentences. A graph convolutional network (GCN) is then applied to this text graph to classify each node and identify the salient sentences. Experimental results on two benchmark datasets demonstrate that the proposed model achieves competitive results in comparison with state-of-the-art models.
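The abstract does not give implementation details, but the core pipeline it describes (attention-weighted sentence graph, GCN node classification for salience) can be illustrated with a minimal PyTorch sketch. The class names `GCNLayer` and `SentenceSalienceGCN`, the hidden size, and the two-layer depth are assumptions for illustration, not the paper's reported configuration; the sentence embeddings and the inter-sentence attention matrix are stand-ins for what a pre-trained language model would produce.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNLayer(nn.Module):
    """One dense GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj, h):
        # Add self-loops, then symmetrically normalize the weighted adjacency.
        a = adj + torch.eye(adj.size(0), device=adj.device)
        d_inv_sqrt = a.sum(dim=1).pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)
        return F.relu(self.linear(a_norm @ h))

class SentenceSalienceGCN(nn.Module):
    """Classify each sentence node as salient / not salient (hypothetical sketch)."""
    def __init__(self, emb_dim, hidden_dim):
        super().__init__()
        self.gcn1 = GCNLayer(emb_dim, hidden_dim)
        self.gcn2 = GCNLayer(hidden_dim, hidden_dim)
        self.classifier = nn.Linear(hidden_dim, 2)

    def forward(self, attn_adj, sent_emb):
        h = self.gcn1(attn_adj, sent_emb)
        h = self.gcn2(attn_adj, h)
        return self.classifier(h)  # [n_sentences, 2] logits

# Toy usage: 6 sentences with 768-d embeddings and an attention-derived
# adjacency matrix (both would normally come from the pre-trained model).
n, d = 6, 768
sent_emb = torch.randn(n, d)
attn_adj = torch.rand(n, n)
attn_adj = 0.5 * (attn_adj + attn_adj.T)  # symmetrize for the undirected graph
model = SentenceSalienceGCN(emb_dim=d, hidden_dim=128)
logits = model(attn_adj, sent_emb)
salient_idx = logits.argmax(dim=-1).nonzero().squeeze(-1)  # indices of selected sentences
```

In this reading, the attention matrix plays the role of soft edge weights on a fully connected sentence graph, so the GCN aggregates information from every other sentence in proportion to how strongly the pre-trained model attends between them.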