ERNetCL: A novel emotion recognition network in textual conversation based on curriculum learning strategy

Published 12 Aug 2023 in cs.CL (arXiv:2308.06450v2)

Abstract: Emotion recognition in conversation (ERC) has become a research hotspot in domains such as conversational robots and question-answering systems. Efficiently and adequately retrieving contextual emotional cues remains a key challenge in the ERC task. Existing efforts do not fully model the context and employ complex network structures, yielding limited performance gains. In this paper, we propose a novel emotion recognition network based on a curriculum learning strategy (ERNetCL). The proposed ERNetCL primarily consists of a temporal encoder (TE), a spatial encoder (SE), and a curriculum learning (CL) loss. The TE and SE combine the strengths of previous methods in a simple manner to efficiently capture temporal and spatial contextual information in the conversation. To mitigate the harmful influence of emotion shift and to mimic the way humans learn a curriculum from easy to hard, we apply the idea of CL to the ERC task to progressively optimize the network parameters: at the beginning of training, difficult samples are assigned lower learning weights, which are gradually raised as the epochs progress. Extensive experiments on four datasets show that the proposed method is effective and substantially outperforms baseline models.
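The curriculum schedule sketched in the abstract (down-weight difficult samples early in training, then gradually raise their weights as epochs increase) can be illustrated as follows. The paper's exact weighting formula is not given here, so the linear ramp, the `floor` parameter, and the per-sample `difficulty` score in this sketch are illustrative assumptions, not the authors' method.

```python
def curriculum_weight(difficulty, epoch, total_epochs, floor=0.1):
    """Learning weight for one sample under an assumed linear curriculum.

    difficulty: assumed per-sample hardness score in [0, 1]
                (e.g. 1.0 for utterances at an emotion-shift point).
    Easy samples (difficulty ~ 0) keep weight ~1 throughout training;
    hard samples (difficulty ~ 1) start near `floor` and ramp up to 1.
    """
    progress = min(epoch / total_epochs, 1.0)  # 0 -> 1 over training
    return 1.0 - difficulty * (1.0 - floor) * (1.0 - progress)


def curriculum_loss(per_sample_losses, difficulties, epoch, total_epochs):
    """Mean of per-sample losses scaled by the curriculum weights above."""
    weights = [curriculum_weight(d, epoch, total_epochs) for d in difficulties]
    weighted = [w * l for w, l in zip(weights, per_sample_losses)]
    return sum(weighted) / len(weighted)
```

With this schedule, a maximally hard sample contributes only `floor` of its loss at epoch 0 and its full loss by the final epoch, while easy samples are learned at full weight from the start.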
