Natural Language Processing

This is a short course on natural language processing using neural networks.

The material is based on the book by Yoav Goldberg: Neural Network Methods for Natural Language Processing

Slides

  1. Introduction to Natural Language Processing | (tex source file)
  2. Neural Networks | (tex source file)
  3. Word Vectors | (tex source file)
  4. Convolutional Neural Networks | (tex source file)
  5. Recurrent Neural Networks | (tex source file)
  6. Sequence to Sequence Models | (tex source file)
  7. Recursive Networks and Paragraph Vectors | (tex source file)

Other Resources

  1. Speech and Language Processing (3rd ed. draft) by Dan Jurafsky and James H. Martin.
  2. Michael Collins' NLP notes.
  3. A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg.
  4. Natural Language Understanding with Distributed Representation by Kyunghyun Cho.
  5. CS224n: Natural Language Processing with Deep Learning, Stanford course
  6. NLP-progress: Repository to track the progress in Natural Language Processing (NLP)
  7. NLTK book
  8. AllenNLP: Open source project for designing deep learning-based NLP models
  9. Attention is all you need explained
  10. ELMo explained
  11. BERT explained

Videos

  1. Natural Language Processing MOOC videos by Dan Jurafsky and Chris Manning, 2012
  2. Natural Language Processing MOOC videos by Michael Collins, 2013
  3. Natural Language Processing with Deep Learning by Chris Manning and Richard Socher, 2017