Language Models and Skipgram Recommenders Talk @ MIT
I had a great time speaking at the MIT Analytics Lab about some of my favorite ideas in natural language processing and their practical applications.
The talk built on my earlier explanations of the word2vec algorithm, and then showed how the same embeddings can power product recommendations.
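The core mechanic behind embedding-based recommendation is nearest-neighbor search: words (or products) live in a shared vector space, and the items closest to a query vector by cosine similarity are the recommendations. A minimal sketch in plain Python, using made-up three-dimensional vectors purely for illustration (real embeddings come from a trained model and have hundreds of dimensions):

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy "embeddings" -- illustrative values only, not from a real model.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def most_similar(word, k=2):
    # Rank every other item by cosine similarity to the query vector.
    query = embeddings[word]
    scores = [(other, cosine(query, vec))
              for other, vec in embeddings.items() if other != word]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)[:k]

print(most_similar("king"))  # "queen" ranks above "apple"
```

Swapping the word vectors for product or song vectors changes nothing in this loop, which is why the same trick transfers so cleanly to recommendations.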
I demoed three introductory Jupyter notebooks as hands-on practice:
- Exploring Word Embeddings (Colab)
  This notebook provides the quickest way to experiment with word embeddings, visualize them, and compare them.
- Song Embeddings - Skipgram Recommender (Colab)
  In this notebook, we train a word2vec model on song playlists to generate music recommendations.
- Sentence Classification with BERT (Colab)
  This notebook is a very quick way to take a pre-trained BERT model (via the wonderful Hugging Face transformers package) and use it for sentiment analysis. I later expanded this into its own blog post, A Visual Guide to Using BERT for the First Time, with an updated notebook (Colab).
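The playlist-as-sentence idea in the skipgram recommender notebook works because word2vec only ever sees sequences of tokens; song IDs serve just as well as words. The notebook trains an actual word2vec model (with gensim), but the signal it compresses is co-occurrence: songs that appear in the same playlists end up close together. As a dependency-free stand-in, here is that intuition with raw co-occurrence counts; the playlists and song names below are invented for illustration, and this is a rough approximation, not the skipgram training itself:

```python
from collections import defaultdict
from itertools import combinations

# Toy playlists standing in for the real dataset: each playlist is a
# "sentence" and each song ID is a "word". Names are made up.
playlists = [
    ["song_a", "song_b", "song_c"],
    ["song_a", "song_b", "song_d"],
    ["song_c", "song_e"],
]

# Count how often each pair of songs shares a playlist.
cooccur = defaultdict(int)
for playlist in playlists:
    for s1, s2 in combinations(set(playlist), 2):
        cooccur[frozenset((s1, s2))] += 1

def recommend(song, k=2):
    # Recommend the songs that most often co-occur with the query song.
    scores = {other: count
              for pair, count in cooccur.items() if song in pair
              for other in pair if other != song}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("song_a"))  # "song_b" shares two playlists, so it ranks first
```

Word2vec improves on this by learning dense vectors, so it can also relate songs that never share a playlist directly but share neighbors.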