Thursday, June 18, 2020

Rethinking Pre-training and Self-Training


This video explores an interesting paper from researchers at Google AI, who show that self-training outperforms both supervised and self-supervised (SimCLR) pre-training. The video explains what self-training is and how each of these methods tries to use extra data (labeled or unlabeled) to improve performance on downstream tasks (a minimal sketch of the self-training idea follows the links below). Thanks for watching! Please Subscribe!

Paper Links:
Rethinking Pre-training and Self-training: https://ift.tt/2ULTfFp
OpenImages Dataset: https://ift.tt/2JIQ89h
RetinaNet: https://ift.tt/2hYc33j
Rethinking ImageNet Pre-training: https://ift.tt/2BrqK6B
Image Classification State-of-the-Art: https://ift.tt/2K3gxBD
Self-Training with Noisy Student: https://ift.tt/2warw7S
Rotation Self-Supervised Learning: https://ift.tt/317z20P
POET: https://ift.tt/2xUnFwp
ImageGPT: https://ift.tt/2Yap1hh
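As a rough illustration of the self-training (pseudo-labeling) idea discussed in the video, here is a minimal teacher-student loop. This is a sketch only: it uses scikit-learn logistic regression on synthetic data, and the 0.9 confidence threshold is an assumed illustrative choice, not the paper's actual detection setup (RetinaNet on COCO with strong augmentation).

```python
# Minimal self-training (pseudo-labeling) sketch -- not the paper's exact recipe.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a small labeled set plus a larger unlabeled pool.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_labeled, y_labeled = X[:200], y[:200]
X_unlabeled = X[200:]

# 1. Train a teacher model on the labeled data only.
teacher = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)

# 2. Pseudo-label the unlabeled pool, keeping only confident predictions.
probs = teacher.predict_proba(X_unlabeled)
confidence = probs.max(axis=1)
keep = confidence > 0.9                      # hypothetical confidence threshold
pseudo_labels = probs.argmax(axis=1)[keep]

# 3. Train a student on labeled + pseudo-labeled data combined.
#    (The paper additionally injects strong data augmentation / noise here.)
X_combined = np.vstack([X_labeled, X_unlabeled[keep]])
y_combined = np.concatenate([y_labeled, pseudo_labels])
student = LogisticRegression(max_iter=1000).fit(X_combined, y_combined)

print(f"Kept {keep.sum()} pseudo-labeled examples out of {len(X_unlabeled)}")
```

The key contrast with pre-training is that the extra data contributes labels generated by a task-specific teacher, rather than generic features learned from a different objective.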
