Natural Language Processing

This is a short course on natural language processing using neural networks.

The material is based on the book by Yoav Goldberg: Neural Network Methods for Natural Language Processing.

Slides

  1. Introduction to Natural Language Processing | (tex source file)
  2. Neural Networks | (tex source file)
  3. Word Vectors | (tex source file)
  4. Convolutional Neural Networks | (tex source file)
  5. Recurrent Neural Networks | (tex source file)
  6. Sequence to Sequence Models | (tex source file)
  7. Recursive Networks and Paragraph Vectors | (tex source file)

Other Resources

  1. Speech and Language Processing (3rd ed. draft) by Dan Jurafsky and James H. Martin.
  2. Michael Collins' NLP notes.
  3. A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg.
  4. Natural Language Understanding with Distributed Representation by Kyunghyun Cho
  5. Natural Language Processing Book by Jacob Eisenstein
  6. CS224n: Natural Language Processing with Deep Learning, Stanford course
  7. NLP-progress: Repository to track the progress in Natural Language Processing (NLP)
  8. NLTK book
  9. AllenNLP: Open source project for designing deep learning-based NLP models
  10. Real World NLP Book: AllenNLP tutorials
  11. Attention Is All You Need explained
  12. ELMo explained
  13. BERT explained
  14. Better Language Models and Their Implications OpenAI Blog
  15. David Bamman NLP Slides @Berkeley

Videos

  1. Natural Language Processing MOOC videos by Dan Jurafsky and Chris Manning, 2012
  2. Natural Language Processing MOOC videos by Michael Collins, 2013
  3. Natural Language Processing with Deep Learning by Chris Manning and Richard Socher, 2017
  4. CS224N: Natural Language Processing with Deep Learning | Winter 2019