Ep 13: Training Mosaic's "llongboi" MPT-7B in 9 days for $200k with an empty logbook, how to prep good data for your training, and the future of open models
This was a great post (I didn't listen to the audio but just read the notes). Thank you for this.
Thanks very much, Aseem! Alessio has been working very hard to improve the quality of our show notes!
The summary notes are invaluable.
Appreciate it Aseem. Trying to make these more and more readable. If there's other feedback / requests for what you'd like to see there, please let me know :)
Superb
As an author, I just want to say that many of us are excited about using LLMs to help us write. It's important to remember that when somebody claims to represent a silent majority of opposition to something, they are usually a member of a loud minority.
Just like any industry that AI touches, there are going to be winners and losers. And as history has shown time and again, the winners will belong to the segment that adapts to changing times and changing technology. So, on behalf of the over 40k writers in the various pro-AI writing communities I'm in, I'd like to thank Jonathan, Abhinav, and the rest of the MosaicML team for StoryWriter.