AI Dynamics

Global AI News Aggregator

GPT-Fast Adds Mixtral-8x7B Support in Efficient PyTorch

gpt-fast now supports mixtral-8x7B, in addition to gpt/llama. 1000 lines of simple pytorch code blazing it out! https://github.com/pytorch-labs/gpt-fast/pull/71
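For context on what the PR adds: Mixtral-8x7B is a mixture-of-experts (MoE) model in which each MoE layer routes every token to the top 2 of 8 experts and blends their outputs by renormalized gate scores. The following is a toy, dependency-free sketch of that top-2 routing idea, not the actual gpt-fast implementation (which is written in PyTorch); the expert functions and gate logits here are made-up illustrations.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def top2_moe(token, experts, gate_logits):
    # Pick the 2 experts with the highest gate logits,
    # renormalize their scores, and combine their outputs.
    ranked = sorted(range(len(experts)), key=lambda i: gate_logits[i], reverse=True)
    top2 = ranked[:2]
    weights = softmax([gate_logits[i] for i in top2])
    return sum(w * experts[i](token) for w, i in zip(weights, top2))

# 8 toy "experts" standing in for Mixtral's 8 experts per MoE layer.
experts = [lambda x, k=k: x * (k + 1) for k in range(8)]
gate_logits = [0.1, 2.0, -1.0, 0.5, 3.0, 0.0, -2.0, 1.0]

out = top2_moe(1.0, experts, gate_logits)
```

Only the two selected experts run per token, which is why Mixtral has ~47B total parameters but the per-token compute of a much smaller dense model.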

→ View original post on X: @soumithchintala
