AI Dynamics

Global AI News Aggregator

The good news is that public data can help dramatically! Pre-training on ImageNet (without privacy constraints) and then fine-tuning on CIFAR-10 (privately) gets accuracy up to 95%+ (figure from the same paper). That takes private training from bad (in some cases, unusable) to pretty good! 3/n
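The "fine-tuning privately" step in recipes like this is typically done with DP-SGD: clip each example's gradient to a fixed norm, sum, and add Gaussian noise before the update. Below is a minimal numpy sketch of one such step on a toy linear model (all names here are illustrative; the actual paper's setup uses deep networks and a DP accounting library, not this toy):

```python
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_mult=1.0, rng=None):
    """One DP-SGD step on squared loss for a linear model (sketch).

    Each example's gradient is clipped to clip_norm, the clipped
    gradients are summed, and Gaussian noise with std
    noise_mult * clip_norm is added before averaging -- the core
    mechanism behind differentially private fine-tuning.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(X)
    grads = 2 * (X @ w - y)[:, None] * X                    # per-example grads
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    noisy_sum = clipped.sum(axis=0) + rng.normal(
        0.0, noise_mult * clip_norm, size=w.shape)          # add Gaussian noise
    return w - lr * noisy_sum / n

# Toy run: recover a linear model under per-example clipping + noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ w_true
w = np.zeros(4)
for _ in range(200):
    w = dp_sgd_step(w, X, y, rng=rng)
```

Even with clipping and noise, the convex toy problem still trains; the practical point of the tweet is that starting from a good public pre-trained model means the noisy private steps only need to do a little work.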

→ View original post on X by @thegautamkamath
