Monday, February 24, 2020

Subclass Distillation


This video explains the new Subclass Distillation technique from Google AI! Subclass Distillation is an interesting extension of Knowledge Distillation that tasks the Teacher with inventing subclasses, producing a more information-dense distribution for the Student. This is implemented with a contrastive auxiliary loss during the Teacher's training!

Paper Links:

Subclass Distillation: https://ift.tt/32k7WlA

Distilling the Knowledge in a Neural Network: https://ift.tt/2i90TEN

Self-Training with Noisy Student: https://ift.tt/2Q8GfYV

DistilBERT: https://ift.tt/2qjRBhG

Thanks for watching! Please Subscribe!
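To make the idea concrete, here is a minimal NumPy sketch of the core pieces described above: a teacher that emits logits over invented subclasses, class probabilities recovered by summing subclass probabilities within each class, a distillation loss that matches the student to the teacher's subclass distribution, and a simple contrastive auxiliary loss over the batch. Function names, the temperature values, and the exact contrastive formulation are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def class_probs(subclass_logits, n_classes, n_subclasses):
    """Recover class probabilities by summing the probabilities of
    each class's invented subclasses (assumed layout: subclasses of a
    class are contiguous in the logit vector)."""
    p = softmax(subclass_logits)
    return p.reshape(-1, n_classes, n_subclasses).sum(axis=-1)

def distill_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy of the student's subclass distribution against the
    teacher's (the KL term of distillation, up to the teacher's entropy)."""
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = np.log(softmax(student_logits, temperature) + 1e-12)
    return -(p_teacher * log_p_student).sum(axis=-1).mean()

def contrastive_aux_loss(subclass_logits, temperature=1.0):
    """Illustrative contrastive auxiliary loss for the teacher: each
    example's normalized subclass-logit vector should score highest
    against itself relative to the rest of the batch, which pushes the
    teacher to spread examples across subclasses."""
    z = subclass_logits / np.linalg.norm(subclass_logits, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature           # batch x batch similarities
    log_p = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))  # row log-softmax
    return -np.diag(log_p).mean()
```

Under this sketch, the teacher would be trained on true class labels via `class_probs` plus `contrastive_aux_loss`, and the student would then be trained with `distill_loss` on the teacher's full subclass logits, which carry more information per example than the class labels alone.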
