Combining 500K Models to Train Foundation Models Efficiently

Training foundation models requires enormous resources. We can reduce this cost by drawing on the vast collective intelligence of existing models: @HuggingFace hosts over 500k models across dozens of modalities that, in principle, can be combined to form new models with new capabilities!