
Training Heterogeneous Features in Sequence to Sequence Tasks: Latent Enhanced Multi-filter Seq2Seq Model

Published 18 May 2021 in cs.CL and cs.LG | arXiv:2105.08840v3

Abstract: In language processing, training data with extremely large variance can make it difficult for a language model to converge: the network parameters struggle to adapt to sentences whose semantics or grammatical structures vary widely. To address this problem, we introduce a model that concentrates on each of the heterogeneous features in the input sentences. Building on the encoder-decoder architecture, we design a latent-enhanced multi-filter seq2seq model (LEMS) that analyzes the input representations through a latent space transformation and clustering. The representations are extracted from the final hidden state of the encoder and lie in the latent space; a latent space transformation is applied to improve their quality, so that the clustering algorithm can more easily separate samples based on the features of these representations. Multiple filters are then trained on the features from their corresponding clusters, resolving the heterogeneity of the training data. We conduct two sets of comparative experiments, on semantic parsing and machine translation, using the Geo-query dataset and Multi30k English-French data, respectively, to demonstrate the improvements our model achieves.
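To make the pipeline concrete, below is a minimal PyTorch sketch of the idea the abstract describes: encode each sentence, project the encoder's final hidden state into a latent space, cluster the latent codes, and route each sample to a cluster-specific decoder ("filter"). The module names, dimensions, routing scheme, and the use of K-means here are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch (assumed PyTorch + scikit-learn); hyperparameters are hypothetical.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


class LatentEnhancedMultiFilterSeq2Seq(nn.Module):
    def __init__(self, vocab_src, vocab_tgt, hidden=256, latent=64, n_filters=3):
        super().__init__()
        self.embed_src = nn.Embedding(vocab_src, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        # Latent space transformation applied to the encoder's final hidden state.
        self.to_latent = nn.Linear(hidden, latent)
        # One "filter" (decoder) per cluster of latent representations.
        self.embed_tgt = nn.Embedding(vocab_tgt, hidden)
        self.decoders = nn.ModuleList(
            [nn.GRU(hidden, hidden, batch_first=True) for _ in range(n_filters)]
        )
        self.out = nn.Linear(hidden, vocab_tgt)
        self.n_filters = n_filters

    def encode(self, src):
        _, h = self.encoder(self.embed_src(src))   # h: (1, B, hidden)
        z = self.to_latent(h.squeeze(0))            # latent codes: (B, latent)
        return h, z

    def decode(self, h, tgt, cluster_ids):
        # Route each sample to the decoder ("filter") assigned to its cluster.
        logits = torch.zeros(tgt.size(0), tgt.size(1), self.out.out_features)
        for k in range(self.n_filters):
            idx = (cluster_ids == k).nonzero(as_tuple=True)[0]
            if idx.numel() == 0:
                continue
            out_k, _ = self.decoders[k](self.embed_tgt(tgt[idx]), h[:, idx])
            logits[idx] = self.out(out_k)
        return logits


# Usage sketch: cluster the latent codes, then train the per-cluster filters.
model = LatentEnhancedMultiFilterSeq2Seq(vocab_src=1000, vocab_tgt=1200)
src = torch.randint(0, 1000, (8, 12))   # toy batch of source token ids
tgt = torch.randint(0, 1200, (8, 15))   # toy batch of target token ids
h, z = model.encode(src)
clusters = torch.tensor(
    KMeans(n_clusters=3, n_init=10).fit_predict(z.detach().numpy())
)
logits = model.decode(h, tgt, clusters)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, 1200), tgt.reshape(-1))
```

In this sketch the clustering runs on detached latent codes, so only the encoder, latent projection, and decoders receive gradients; how the paper couples clustering with training is a detail not covered by the abstract.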

