MPT-30B: Raising the bar for open-source foundation models
June 22, 2023 by The MosaicML NLP Team in Mosaic Research
Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on NVIDIA H100...
Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs
May 5, 2023 by The MosaicML NLP Team in Mosaic Research
Introducing MPT-7B, the first entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and...