
Language Modeling a Billion Words

July 25, 2016

In this Torch7 blog post, we demonstrate how noise contrastive estimation can be used to train a multi-GPU recurrent neural network language model on the Google billion words dataset. Full documentation is provided for how to train and evaluate the model, and generate samples for qualitative analysis.
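The idea behind noise contrastive estimation is to sidestep the expensive softmax over the full vocabulary by training the model to distinguish the observed word from k samples drawn from a known noise distribution (typically the unigram distribution). Below is a minimal Python sketch of the per-word NCE loss; this is an illustration of the general technique, not the authors' Torch7 code, and the function name `nce_loss` and its arguments are hypothetical.

```python
import numpy as np

def nce_loss(target_score, noise_scores, target_q, noise_q, k):
    """Binary-classification NCE loss for a single prediction.

    target_score: unnormalized model log-score s(w) of the true next word.
    noise_scores: array of k scores s(w') for the sampled noise words.
    target_q, noise_q: probabilities of those words under the noise
        distribution q (e.g. the unigram distribution).
    k: number of noise samples drawn per target word.
    """
    # Probability that a word came from the data rather than the noise:
    # P(D=1 | w) = exp(s(w)) / (exp(s(w)) + k * q(w))
    def p_data(score, q):
        u = np.exp(score)
        return u / (u + k * q)

    loss = -np.log(p_data(target_score, target_q))               # true word, label 1
    loss -= np.sum(np.log(1.0 - p_data(noise_scores, noise_q)))  # noise words, label 0
    return loss

# Toy usage: one target word scored against 5 unigram noise samples.
rng = np.random.default_rng(0)
k = 5
noise_scores = rng.normal(size=k)
noise_q = np.full(k, 1.0 / 10000)     # hypothetical unigram probabilities
loss = nce_loss(2.0, noise_scores, 1.0 / 10000, noise_q, k)
```

Because the loss only touches the target word and the k noise samples, its cost is independent of the vocabulary size, which is what makes training on a 793k-word vocabulary like Google billion words tractable.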
