Knowledge Distillation Paper Achieves 17k Citations Despite Skeptical Review

"Fun game. Clocking 17,973 citations: 'Distilling the Knowledge in a Neural Network' (@geoffreyhinton, @OriolVinyalsML, @JeffDean). Reviewer 38 (NeurIPS 2014): 'This work is incremental and unlikely to have much impact even though it may be technically correct and well executed.'"