Transformer-based language modeling and decoding for conversational speech recognition
Published 4 Jan 2020 in cs.CL, cs.LG, and eess.AS | arXiv:2001.01140v1
Abstract: We propose a way to use a transformer-based language model in conversational speech recognition. Specifically, we focus on decoding efficiently in a weighted finite-state transducer framework. We showcase an approach to lattice re-scoring that allows longer-range history to be captured by a transformer-based language model and takes advantage of the transformer's ability to score all token positions in parallel rather than sequentially.
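The abstract's core idea, rescoring first-pass hypotheses with a transformer language model whose single forward pass scores every token position at once, can be illustrated with a short sketch. This is an approximation rather than the paper's method: it rescores an n-best list instead of performing the paper's lattice re-scoring, and the GPT-2 model, the `lm_weight` interpolation, and the `nbest` input format are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: n-best rescoring with a transformer LM, as a stand-in
# for the lattice re-scoring described in the abstract. GPT-2 and the
# score interpolation below are illustrative assumptions.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").eval()

def lm_logprob(text: str) -> float:
    """Total log-probability of `text` under the transformer LM.
    All positions are scored in one forward pass (no sequential
    recurrence), which is the parallelism the abstract refers to."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels == input_ids, the model shifts internally and
        # returns the mean negative log-likelihood per predicted token.
        loss = model(ids, labels=ids).loss
    return -loss.item() * (ids.size(1) - 1)

def rescore(nbest, lm_weight=0.5):
    """nbest: list of (hypothesis_text, first_pass_score) pairs, e.g.
    combined acoustic and n-gram scores from the WFST decoder.
    Returns the hypothesis with the best interpolated score."""
    rescored = [(hyp, score + lm_weight * lm_logprob(hyp))
                for hyp, score in nbest]
    return max(rescored, key=lambda p: p[1])[0]

# Example usage with made-up first-pass scores:
# best = rescore([("i want to book a flight", -12.3),
#                 ("i want to look at flight", -12.1)])
```

In the paper's actual setting the re-scoring operates on the lattice itself, so longer histories can be shared across paths instead of being re-scored independently per hypothesis; the sketch above only shows the score-combination step.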