Two things are happening here: first, the OSS community (and potentially Meta) seems to be settling on 3B and 7B model sizes. Second, it's surprising that Google did not go straight for GPT-4 level performance; it seems very unlikely that they couldn't hit that if they wanted to.
OSS Community Settles on 3B and 7B Model Sizes