Software 2.0: Shipping LLMs with New Knowledge
DSML: Production ML / MLOps / LLMOps
The next generation of software, known as Software 2.0, is built by shipping differentiated LLMs instead of traditional code. Teaching an LLM new knowledge is a highly effective way to build a differentiated LLM, as products like GitHub Copilot show. This process goes beyond prompting and retrieval: it also involves training the LLM, including instruction-finetuning, content-finetuning, pretraining, and more, which you'll learn about in this talk. But training is harder to get right. Why? It requires the right data, and it calls for (often tribal) domain knowledge. In this talk, you'll learn about Lamini, an all-in-one LLM stack that makes LLMs less picky about the data they can learn from and makes it easy for LLMs to take in billions of new documents. Lamini exposes LLMs as easily composable functions, so every software engineer can rapidly ship differentiated LLMs and write more Software 2.0.
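To make the instruction-finetuning idea concrete, here is a minimal sketch of preparing (instruction, response) pairs as prompt/completion training examples. The prompt template and the `format_instruction_pair` helper are illustrative assumptions, not Lamini's actual API or data format:

```python
# Sketch: turn (instruction, response) pairs into prompt/completion
# training examples for instruction-finetuning an LLM.
# The "### Instruction / ### Response" template is a common convention,
# used here purely for illustration.

def format_instruction_pair(instruction: str, response: str) -> dict:
    """Wrap one example in a simple prompt template."""
    prompt = f"### Instruction:\n{instruction}\n\n### Response:\n"
    return {"prompt": prompt, "completion": response}

# Hypothetical example data
pairs = [
    ("Summarize this design doc.", "The doc proposes a new caching layer."),
    ("What does the deploy script do?", "It builds and pushes the container."),
]

dataset = [format_instruction_pair(i, r) for i, r in pairs]
print(dataset[0]["prompt"] + dataset[0]["completion"])
```

A finetuning stack then trains the model to produce each `completion` given its `prompt`, which is how new domain knowledge and behaviors are taught beyond what prompting or retrieval alone can do.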