
Presentation: Intuition & Use-Cases of Embeddings in NLP & Beyond

Track: AI/Machine Learning without a PhD

Location: St James, 4th flr.

Time: 1:40pm - 2:30pm

Day of week: Monday


This presentation is now available to view on InfoQ.com


Abstract

Machine learning has achieved tremendous advances in language tasks over the last few years (think of technologies like Google Duplex, Google Translate, and Amazon Alexa). One of the fundamental concepts underpinning this progress is the word embedding, produced by algorithms such as word2vec. Embeddings represent words in a way that machines can use to solve complex language problems. More recently, companies like Airbnb and Alibaba have started using embeddings to power non-NLP use cases like recommendations, search ranking, and personalization.
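The core idea can be illustrated with a toy sketch: words become vectors, and related words end up closer together than unrelated ones. The vector values below are hypothetical, hand-picked for illustration, not actual word2vec output.

```python
import numpy as np

# Toy 4-dimensional embeddings (hypothetical values for illustration;
# real word2vec vectors typically have 100-300 learned dimensions).
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.2, 0.1, 0.8]),
    "apple": np.array([0.1, 0.1, 0.9, 0.1]),
}

def cosine_similarity(a, b):
    """Similarity of two vectors: 1.0 = same direction, 0.0 = orthogonal."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words sit closer in the embedding space than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```

In a trained model these geometric relationships emerge from the data itself, which is what makes embeddings useful beyond hand-built features.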


In this talk, we will build intuition for word embeddings, see how they're created, and look at examples of how these concepts carry over to problems like content discovery and search ranking in marketplaces and media-consumption services (e.g., movie/music recommendations).
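The carry-over to recommendations can be sketched in the same way: treat items (here, movies) as vectors and recommend nearest neighbors in the embedding space. The movie vectors below are hypothetical; in practice such item embeddings are often learned by treating each user's interaction history as a "sentence" of item IDs and running a word2vec-style algorithm over those sequences.

```python
import numpy as np

# Hypothetical 3-dimensional movie embeddings (illustrative values only).
movies = {
    "The Matrix":   np.array([0.9, 0.1, 0.8]),
    "Blade Runner": np.array([0.8, 0.2, 0.9]),
    "Toy Story":    np.array([0.1, 0.9, 0.2]),
    "Finding Nemo": np.array([0.2, 0.8, 0.1]),
}

def recommend(title, catalog, k=2):
    """Return the k items whose embeddings are closest (by cosine) to `title`."""
    query = catalog[title]
    scores = {
        other: float(np.dot(query, vec) / (np.linalg.norm(query) * np.linalg.norm(vec)))
        for other, vec in catalog.items()
        if other != title
    }
    # Rank remaining items by similarity, highest first.
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("The Matrix", movies, k=1))
```

The same nearest-neighbor lookup underlies embedding-based search ranking and personalization: only the way the vectors are learned changes, not the retrieval machinery.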

Speaker: Jay Alammar

VC and Machine Learning Explainer @STVcapital

Through his blog and lessons on Udacity, Jay has helped tens of thousands of people wrap their heads around complex machine-learning topics. He uses a visual, highly intuitive presentation style to communicate everything from basic introductions to data analysis and interactive introductions to neural networks to dissections of state-of-the-art models in Natural Language Processing.

