Retrieval-Augmented Generation (RAG) Patterns and Best Practices

The rise of LLMs that use language coherently has created an appetite to ground these models' generation in facts and in private collections of data. The motivation is to reduce model hallucinations and to supply models with up-to-date, often private information that is not part of their training data. Retrieval-augmented generation (RAG) is the method that uses a search step to ground models in relevant data sources. In this talk, we'll cover the common schematics of RAG systems and tips on how to improve them.
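
As a concrete illustration of the search-then-generate loop the abstract describes, here is a minimal sketch of the RAG pattern. It is not the speaker's implementation or any specific vendor API: the toy keyword retriever and the call_llm stub are assumptions standing in for a real search index and a hosted or self-hosted LLM.

# Minimal RAG sketch: retrieve relevant text, then generate with it as grounding context.
# The corpus, retriever, and call_llm stub below are illustrative assumptions only.

documents = [
    "Retrieval-augmented generation grounds an LLM in external data sources.",
    "A search step retrieves passages relevant to the user's question.",
    "Grounding answers in retrieved sources helps reduce hallucinations.",
]

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Toy retrieval step: rank documents by keyword overlap with the query."""
    q_terms = set(query.lower().split())
    ranked = sorted(documents, key=lambda d: -len(q_terms & set(d.lower().split())))
    return ranked[:top_k]

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a call to an LLM generation endpoint."""
    return f"[LLM answer would be generated here, grounded in:\n{prompt}]"

def rag_answer(question: str) -> str:
    # 1. Search step: fetch passages relevant to the question.
    context = "\n".join(retrieve(question))
    # 2. Grounding step: place the retrieved passages in the prompt as context.
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    # 3. Generation step: the model answers, grounded in the retrieved data.
    return call_llm(prompt)

print(rag_answer("How does retrieval-augmented generation reduce hallucinations?"))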

Interview:

What's the focus of your work these days?

I explore and advise enterprises and the developer community on applications of Large Language Models (LLMs).

What's the motivation for your talk at QCon London 2024?

I aim to give builders an intuition for problem-solving with LLMs and for going beyond thinking of them as text-in / text-out monoliths.

How would you describe your main persona and target audience for this session?

This talk is accessible to a wide audience. All that's needed is curiosity around large language models.

Is there anything specific that you'd like people to walk away with after watching your session?

Insight into different possible systems to build using LLMs as individual components in a pipeline.

Is there anything interesting that you learned from a previous QCon?

How beautiful a melodica sounds with algorithmically generated music in the background. This was from the session Functional Composition by Chris Ford.


Speaker

Jay Alammar

Director & Engineering Fellow @Cohere & Co-Author of "Hands-On Large Language Models"

Jay is the co-author of Hands-On Large Language Models. Through his blog and YouTube channel, Jay has helped millions of people wrap their heads around complex machine learning topics. He harnesses a visual, highly intuitive presentation style to communicate concepts ranging from the most basic intros to data analysis and interactive intros to neural networks, to dissections of state-of-the-art models in Natural Language Processing.

Jay is Director and Engineering Fellow at Cohere, a leading provider of large language models for text generation, search, and retrieval-augmented generation for the enterprise.


Date

Monday Apr 8 / 10:35AM BST (50 minutes)

Location

Windsor (5th Fl.)

Topics

AI/ML, Language Models, Search, Retrieval-Augmented Generation


From the same track

Session AI/ML

Navigating LLM Deployment: Tips, Tricks, and Techniques

Monday Apr 8 / 11:45AM BST

Self-hosted Language Models are going to power the next generation of applications in critical industries like financial services, healthcare, and defence.


Meryem Arik

Co-Founder @TitanML

Session AI/ML

Reach Next-Level Autonomy with LLM-Based AI Agents

Monday Apr 8 / 01:35PM BST

Generative AI has advanced rapidly since the release of ChatGPT, yet the industry is still at a very early stage, with unclear prospects and potential.


Tingyi Li

Enterprise Solutions Architect @AWS

Session AI/ML

LLM and Generative AI for Sensitive Data - Navigating Security, Responsibility, and Pitfalls in Highly Regulated Industries

Monday Apr 8 / 02:45PM BST

As large language models (LLM) become more prevalent in highly regulated industries, dealing with sensitive data and ensuring the security and ethical design of machine learning (ML) models is paramount.


Stefania Chaplin

Solutions Architect @GitLab


Azhir Mahmood

Research Scientist @PhysicsX

Session AI/ML

How Green is Green: LLMs to Understand Climate Disclosure at Scale

Monday Apr 8 / 05:05PM BST

Assessing the validity of climate finance claims requires a system that can handle the significant variation in language, format, and structure present in climate and financial reporting documentation, as well as knowledge of the domain-specific language of climate science and finance.


Leo Browning

First ML Engineer @ClimateAligned

Session AI/ML

The AI Revolution Will Not Be Monopolized: How Open-Source Beats Economies of Scale, Even for LLMs

Monday Apr 8 / 03:55PM BST

With the latest advancements in Natural Language Processing and Large Language Models (LLMs), and big companies like OpenAI dominating the space, many people wonder: Are we heading further into a black box era with larger and larger models, obscured behind APIs controlled by big tech?


Ines Montani

Co-Founder & CEO @Explosion, Core Developer of spaCy