Language Models and Skipgram Recommenders Talk @ MIT
I had a great time speaking at the MIT Analytics Lab about some of my favorite ideas in natural language processing and their practical applications.
The talk built on my earlier explanations of the word2vec algorithm and of how its embeddings can be used for product recommendations.
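To give a flavor of the skipgram idea (this sketch is mine, not code from the talk): word2vec trains on (center, context) pairs drawn from a sliding window over a sentence, and recommenders reuse the same trick by treating a sequence of product interactions as the "sentence".

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) training pairs within a sliding window."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

# Works identically on words or on product IDs in a browsing session:
print(skipgram_pairs(["p12", "p7", "p99"], window=1))
```

These pairs are what the embedding model is then trained to predict; the window size and the choice of "sentence" boundary are the main knobs.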
I demoed three basic Jupyter notebooks as hands-on practice:
Sentence Classification with BERT (Colab)
This notebook is a super fast way to use a pre-trained BERT model (via the wonderful Huggingface transformers package) for sentiment analysis. I later expanded this into its own blog post, A Visual Guide to Using BERT for the First Time, with an updated notebook (Colab)
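The quickest path through the transformers package is its `pipeline` helper, which wraps tokenization, the model forward pass, and label decoding. A minimal sketch (the example sentence is mine, and the default model the pipeline downloads is whatever the library currently ships for this task):

```python
from transformers import pipeline

# Loads a default pre-trained sentiment model on first use.
classifier = pipeline("sentiment-analysis")

# Returns a list of dicts with a predicted label and a confidence score.
result = classifier("I had a great time speaking at the MIT Analytics Lab.")
print(result)
```

The notebook goes a level deeper than this one-liner, tokenizing sentences and extracting BERT's output embeddings explicitly before classifying them.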