
Infusing Hierarchical Guidance into Prompt Tuning: A Parameter-Efficient Framework for Multi-level Implicit Discourse Relation Recognition

Published 23 Feb 2024 in cs.CL (arXiv:2402.15080v1)

Abstract: Multi-level implicit discourse relation recognition (MIDRR) aims to identify hierarchical discourse relations among arguments. Previous methods achieve improvements by fine-tuning PLMs. However, due to data scarcity and the task gap, the pre-trained feature space cannot be accurately tuned to the task-specific space, which can even aggravate the collapse of the original space. Moreover, the need to comprehend hierarchical semantics in MIDRR makes the conversion even harder. In this paper, we propose a prompt-based Parameter-Efficient Multi-level IDRR (PEMI) framework to address these problems. First, we leverage parameter-efficient prompt tuning to drive the input arguments to match the pre-trained space, realizing the approximation with few trainable parameters. Furthermore, we propose a hierarchical label refining (HLR) method for the prompt verbalizer to deeply integrate hierarchical guidance into prompt tuning. Finally, our model achieves results comparable to baselines on PDTB 2.0 and 3.0 using only about 0.1% of their trainable parameters, and visualization demonstrates the effectiveness of our HLR method.
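The abstract's headline number (roughly 0.1% trainable parameters) can be sanity-checked with a back-of-the-envelope count: in a prompt-tuning setup, only the soft-prompt embeddings and the verbalizer's label embeddings are trained while the PLM stays frozen. The sketch below uses assumed hyperparameters (RoBERTa-base scale backbone, a 20-token soft prompt, rough PDTB 2.0-style label counts), not the paper's actual configuration:

```python
# Back-of-the-envelope check of the "about 0.1% trainable parameters" claim.
# All sizes are illustrative assumptions, not the paper's exact setup.

PLM_PARAMS = 125_000_000   # frozen backbone, roughly RoBERTa-base scale
HIDDEN = 768               # hidden size of the backbone
PROMPT_LEN = 20            # assumed length of the trainable soft prompt

# Trainable soft-prompt embeddings prepended to the argument pair.
prompt_params = PROMPT_LEN * HIDDEN

# Trainable verbalizer: one label embedding per sense at each hierarchy
# level (4 top-level + 16 second-level + 102 connective-level is a rough
# PDTB 2.0-style count, used purely for illustration).
num_labels = 4 + 16 + 102
verbalizer_params = num_labels * HIDDEN

trainable = prompt_params + verbalizer_params
fraction = trainable / PLM_PARAMS
print(f"trainable: {trainable:,} ({fraction:.2%} of the frozen PLM)")
# → trainable: 109,056 (0.09% of the frozen PLM)
```

Under these assumed sizes the trainable budget lands near one-thousandth of the backbone, which is consistent with the order of magnitude the abstract reports.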
