Colossal AI is an open-source project for distributed training of large AI models. With it, you need only 1.6 GB of GPU memory, and can get up to 7.73x acceleration when training ChatGPT-style models. Check out the code: http://github.com/hpcaitech/ColossalAI/tree/main/applications/ChatGPT
Colossal AI: Distributed Training for Large Language Models with GPU Efficiency