Efficient Sampled Softmax for Tensorflow
Published 10 Apr 2020 in cs.LG (arXiv:2004.05244v1)
Abstract: This short paper discusses an efficient implementation of sampled softmax loss for Tensorflow. The speedup over the default implementation comes from simplifying the computation graph for the forward and backward passes.
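For context, below is a minimal sketch of how sampled softmax loss is typically invoked through TensorFlow's built-in `tf.nn.sampled_softmax_loss`, the default implementation the paper's version is compared against. The dimensions and variable names here are illustrative assumptions, not taken from the paper.

```python
import tensorflow as tf

# Illustrative dimensions (assumptions, not from the paper).
num_classes = 100_000   # size of the full output vocabulary
embed_dim = 128         # dimensionality of the hidden representation
num_sampled = 64        # number of negative classes sampled per batch
batch_size = 32

# Output projection parameters over the full class set.
weights = tf.Variable(tf.random.normal([num_classes, embed_dim]))
biases = tf.Variable(tf.zeros([num_classes]))

# A batch of hidden activations and their true class ids.
inputs = tf.random.normal([batch_size, embed_dim])
labels = tf.random.uniform([batch_size, 1], maxval=num_classes, dtype=tf.int64)

# TensorFlow's default sampled softmax: the softmax is evaluated over the
# true classes plus `num_sampled` sampled negatives rather than over all
# `num_classes` outputs, which is what makes training on large
# vocabularies tractable.
loss = tf.nn.sampled_softmax_loss(
    weights=weights,
    biases=biases,
    labels=labels,
    inputs=inputs,
    num_sampled=num_sampled,
    num_classes=num_classes,
)
print(loss.shape)  # per-example losses: (32,)
```

The paper's contribution is a faster drop-in computation of this loss; the call above only shows the baseline API whose graph the authors simplify.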