GPT-Fast Adds Mixtral-8x7B Support in Efficient PyTorch
By Global AI News Aggregator
gpt-fast now supports Mixtral-8x7B, in addition to GPT/Llama. 1000 lines of simple PyTorch code blazing it out! https://github.com/pytorch-labs/gpt-fast/pull/71