SESSION

Decoding Mistral AI's Large Language Models


OVERVIEW

EXPERIENCE: In Person
TYPE: Breakout
TRACK: Generative AI
TECHNOLOGIES: AI/Machine Learning, GenAI/LLMs
SKILL LEVEL: Intermediate
DURATION: 40 min

In this session, Devendra Singh Chaplot, Research Scientist at Mistral AI, will explore the building blocks and training strategies that power Mistral AI's large language models. The session will feature Mistral AI's open-source models, Mixtral 8x7B and Mixtral 8x22B, which are based on a mixture-of-experts (MoE) architecture and released under the Apache 2.0 license. The presentation will also provide guidance on using Mistral's "La Plateforme" API endpoints and offer a preview of upcoming features.
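To make the mixture-of-experts idea concrete: in an MoE layer, a learned gate selects a small number of expert sub-networks per token (Mixtral 8x7B uses eight experts with two active per token) and mixes their outputs by renormalized gate weights. Below is a minimal NumPy sketch of top-2 routing; the shapes, toy `tanh` experts, and all names are illustrative, not Mistral's actual implementation.

```python
import numpy as np

def moe_layer(x, gate_w, expert_ws, top_k=2):
    """Toy MoE layer: route each token to its top-k experts and mix
    their outputs by softmax-renormalized gate scores (illustrative)."""
    logits = x @ gate_w                          # (tokens, n_experts) gate scores
    topk = np.argsort(logits, axis=-1)[:, -top_k:]
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = topk[t]
        w = np.exp(logits[t, sel])
        w /= w.sum()                             # softmax over the selected experts only
        for k, e in enumerate(sel):
            # each "expert" here is a single tanh projection (a stand-in
            # for the real feed-forward expert network)
            out[t] += w[k] * np.tanh(x[t] @ expert_ws[e])
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                      # 4 tokens, model dim 8 (toy sizes)
gate_w = rng.normal(size=(8, 8))                 # gate over 8 experts, as in Mixtral 8x7B
expert_ws = rng.normal(size=(8, 8, 8))           # 8 expert weight matrices
y = moe_layer(x, gate_w, expert_ws)
print(y.shape)
```

Because only `top_k` experts run per token, the layer's compute cost scales with the active experts rather than the full parameter count, which is the key efficiency property of the Mixtral models.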

SESSION SPEAKERS

Devendra Chaplot

Research Scientist
Mistral AI