
LSTM-based Mixture-of-Experts for Knowledge-Aware Dialogues

Published 5 May 2016 in cs.AI and cs.CL (arXiv:1605.01652v1)

Abstract: We introduce an LSTM-based method for dynamically integrating several word-prediction experts to obtain a conditional language model which can be good simultaneously at several subtasks. We illustrate this general approach with an application to dialogue where we integrate a neural chat model, good at conversational aspects, with a neural question-answering model, good at retrieving precise information from a knowledge base, and show how the integration combines the strengths of the independent components. We hope that this focused contribution will attract attention to the benefits of using such mixtures of experts in NLP.
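The integration the abstract describes — a learned gate that, at each time step, weights the next-word distributions of a chat expert and a QA expert conditioned on the dialogue context — can be sketched as below. All names, shapes, and the sigmoid gate parameterization are illustrative assumptions for a two-expert case, not the paper's actual architecture.

```python
import numpy as np

def softmax(x):
    # Stable softmax over a logit vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def mixture_step(h, p_chat, p_qa, w_gate, b_gate):
    """One decoding step of a two-expert mixture (illustrative sketch).

    h          : hidden state of an LSTM that has read the dialogue context
                 (hypothetical; the paper's exact conditioning may differ).
    p_chat     : next-word distribution from the conversational expert.
    p_qa       : next-word distribution from the question-answering expert.
    w_gate,
    b_gate     : parameters of a learned scalar gate (assumed form).
    """
    # Sigmoid gate in [0, 1]: how much to trust the chat expert here.
    alpha = 1.0 / (1.0 + np.exp(-(w_gate @ h + b_gate)))
    # Convex combination of the experts' distributions stays a distribution.
    return alpha * p_chat + (1.0 - alpha) * p_qa

# Toy usage over a 5-word vocabulary with an 8-dim hidden state.
rng = np.random.default_rng(0)
h = rng.standard_normal(8)
p_chat = softmax(rng.standard_normal(5))
p_qa = softmax(rng.standard_normal(5))
w_gate = rng.standard_normal(8)
p_mix = mixture_step(h, p_chat, p_qa, w_gate, 0.0)
```

Because the gate output is a scalar in [0, 1], the mixed output is itself a valid probability distribution, so training can proceed with ordinary cross-entropy on the mixture.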

Citations (27)
