Using LLMs Beyond Chatting

LLMs are amazing, but as soon as you want to give your users anything more than a neat chat window, things become complex pretty quickly. In this session we take a complex business problem, dissect it into chewable bits for LLM agents, and get a user interface up and running on top of it.

Key Takeaways

1. How to take a business problem and give users a working solution swiftly with off-the-shelf LLMs

2. Prompt engineering best practices

3. Structured outputs from LLMs (a minimal sketch follows this list)

4. Taking LLMs private using open-source tools (also sketched below)
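To make takeaway 3 concrete, here is a minimal sketch of structured outputs, assuming Python with the pydantic library and a placeholder call_llm function standing in for whichever LLM client the session uses. The idea: describe the shape you want as a schema, ask the model for JSON only, and validate the reply against that schema instead of trusting free-form text.

import json
from pydantic import BaseModel, ValidationError

class Invoice(BaseModel):
    # The shape we want the model to fill in; field names are illustrative.
    customer: str
    total: float
    currency: str

def extract_invoice(text: str, call_llm) -> Invoice:
    # call_llm is a placeholder for whichever LLM client you use.
    prompt = (
        "Extract the invoice details from the text below.\n"
        "Reply with JSON only, matching this schema:\n"
        f"{json.dumps(Invoice.model_json_schema())}\n\n"
        f"Text: {text}"
    )
    raw = call_llm(prompt)
    try:
        # Parse and validate the reply in one step.
        return Invoice.model_validate_json(raw)
    except ValidationError:
        # In practice you would retry or ask the model to repair its output.
        raise

The payoff is that downstream code works with a typed Invoice object rather than parsing prose, which is what makes LLM output usable beyond a chat window.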
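For takeaway 4, one common pattern (an assumption about the tooling, not necessarily what the session uses) is to run an open-source model behind an OpenAI-compatible endpoint, for example with Ollama, so the rest of the application only needs a different base URL.

from openai import OpenAI

# Assumes an Ollama server running locally; it exposes an
# OpenAI-compatible API at this URL. The model name is illustrative.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Summarise this invoice in one sentence: ..."}],
)
print(response.choices[0].message.content)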


Date

Friday, Apr 11 / 09:00 AM BST (7 hours)