Presentation: Intuition & Use-Cases of Embeddings in NLP & Beyond

Track: AI/Machine Learning without a PhD

Location: St James, 4th floor

Duration: 1:40pm - 2:30pm

Day of week: Monday



Machine Learning has achieved tremendous advancements in language tasks over the last few years (think of technologies like Google Duplex, Google Translate, and Amazon Alexa). One of the fundamental concepts underpinning this progress is the word embedding (produced by an algorithm such as word2vec). Embeddings represent words in a form machines can compute with, and they continue to prove remarkably powerful for solving complex language problems. More recently, companies like Airbnb and Alibaba have started applying the same concept to non-NLP use-cases such as recommendations, search ranking, and personalization.
In this talk, we will go over the intuition of word embeddings, how they're created, and look at examples of how these concepts can be carried over to solve problems like content discovery and search ranking in marketplaces and media-consumption services (e.g. movie/music recommendations).
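The core idea behind the abstract above can be sketched in a few lines: each item (a word, a song, a listing) becomes a vector, and "similar" becomes "close in vector space." The four-dimensional vectors and helper names below are illustrative assumptions, not material from the talk; real embeddings are learned and have hundreds of dimensions.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 = more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy embeddings, hand-written for illustration only.
embeddings = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.9, 0.7, 0.2, 0.3],
    "apple": [0.1, 0.2, 0.9, 0.8],
}

def most_similar(item, table):
    """Rank every other item by similarity -- the kernel of
    embedding-based recommendation and search ranking."""
    others = [k for k in table if k != item]
    return max(others, key=lambda k: cosine_similarity(table[item], table[k]))

print(most_similar("king", embeddings))  # -> queen
```

Swap the words for songs or listings and the same nearest-neighbor lookup becomes a recommendation engine, which is the carry-over the talk describes.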

Speaker: Jay Alammar

VC and Machine Learning Explainer @STVcapital

Through his blog and lessons on Udacity, Jay has helped tens of thousands of people wrap their heads around complex machine learning topics. He harnesses a visual, highly intuitive presentation style to communicate concepts ranging from basic introductions to data analysis and interactive introductions to neural networks, to dissections of state-of-the-art models in Natural Language Processing.
