Natural Language Processing
This is a short course on natural language processing using neural networks.
The material is based on Yoav Goldberg's book Neural Network Methods for Natural Language Processing.
Slides
- Introduction to Natural Language Processing | (tex source file)
- Neural Networks | (tex source file)
- Word Vectors | (tex source file)
- Convolutional Neural Networks | (tex source file)
- Recurrent Neural Networks | (tex source file)
- Sequence to Sequence Models | (tex source file)
- Recursive Networks and Paragraph Vectors | (tex source file)
Other Resources
- Speech and Language Processing (3rd ed. draft) by Dan Jurafsky and James H. Martin.
- Michael Collins' NLP notes.
- A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg.
- Natural Language Understanding with Distributed Representation by Kyunghyun Cho.
- Natural Language Processing Book by Jacob Eisenstein
- CS224n: Natural Language Processing with Deep Learning, Stanford course
- NLP-progress: Repository to track the progress in Natural Language Processing (NLP)
- NLTK book
- AllenNLP: Open-source project for designing deep learning-based NLP models
- Real World NLP Book: AllenNLP tutorials
- Attention Is All You Need explained
- ELMo explained
- BERT explained
- Better Language Models and Their Implications, OpenAI blog
- David Bamman's NLP slides @Berkeley
Videos
- Natural Language Processing MOOC videos by Dan Jurafsky and Chris Manning, 2012
- Natural Language Processing MOOC videos by Michael Collins, 2013
- Natural Language Processing with Deep Learning by Chris Manning and Richard Socher, 2017
- CS224N: Natural Language Processing with Deep Learning | Winter 2019