In this episode, Nathan sits down with Jonathan Frankle, Chief Scientist, and Abhi Venigalla, Research Scientist, of MosaicML. They chat about Mosaic’s custom LLMs, the customers seeking Mosaic out and what their journeys and use cases look like, and exciting developments in Mosaic’s research, including their new inference platform and the MPT-7B-StoryWriter-65k+ model.
RECOMMENDED PODCAST:
Founding a business is just the tip of the iceberg; the real complexity comes with scaling it. On 1 to 1000, hosts Jack Altman and Erik Torenberg dig deep into the inevitable twists and turns operators encounter along the journey of turning an idea into a business. Hear all about the tactical challenges of scaling from the people that built up the world’s leading companies like Stripe, Ramp, and Lattice. Our first episode with Eric Glyman of Ramp is out now: https://link.chtbl.com/1to1000
The Cognitive Revolution is a part of the Turpentine podcast network. To learn more: www.turpentine.co
TIMESTAMPS:
(00:00) Episode Preview
(06:04) Mosaic’s business model
(07:28) Who uses Mosaic’s custom LLMs? What does their data look like?
(09:55) Mosaic’s use cases for custom LLMs
(12:47) How much extraction and summarization was done by humans pre-LLMs?
(15:28) Sponsor: Omneky
(21:50) The journeys of Mosaic’s customers and would a Wendy’s LLM know about a Big Mac?
(25:46) The curriculum model and fine-tuning
(29:10) Language models in the life sciences
(33:20) How raw can data be before it becomes a problem?
(35:44) Using the output of the bulk pre-training process vs. additional training afterward
(38:30) Redteaming as a service
(39:40) Mosaic’s inference platform
(41:53) Spending one cent on 20,000 tokens, how is that cent distributed?
(46:00) Selling compute on a dedicated capacity basis
(47:30) Oracle and AWS
(49:50) The storywriter model and 65,000 token window
(54:35) The transition from finite parameters to an infinite attention matrix
LINKS:
MosaicML: https://www.mosaicml.com/
MPT-7B-StoryWriter-65k+ Model: https://huggingface.co/mosaicml/mpt-7b-storywriter
TWITTER:
@jefrankle (Jonathan)
@abhi_venigalla (Abhi)
@MosaicML (Mosaic)
@CogRev_Podcast
@labenz (Nathan)
@eriktorenberg (Erik)
SPONSOR:
Thank you Omneky for sponsoring The Cognitive Revolution. Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with a click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off.
Music Credit: MusicLM