Tuesday, April 14, 2020

Momentum Contrastive Learning


Contrastive self-supervised learning trains representations to distinguish objects from one another. Momentum Contrast (MoCo) is one of the most successful approaches to this: it avoids the memory bottleneck of backpropagating end-to-end through both the query and key encoders, and it avoids stale key encodings by updating the key encoder as a momentum-weighted average of the query encoder while keeping a queue of recently encoded keys as negatives (a minimal code sketch follows the links below). MoCo also plays an important role in CURL, which achieves strong results on DeepMind Control from pixels alone, without physical state inputs.

Thanks for watching and please subscribe! Also subscribe to Machine Learning Street Talk, where Yannic Kilcher, Tim Scarfe and I will be joined by Aravind Srinivas to talk about CURL (in which MoCo plays a key role)!

ML Street Talk: https://www.youtube.com/channel/UCMLtBahI5DMrt0NPvDSoIRQ

Paper Links:
MoCo: https://ift.tt/2pggQ4i
MoCo v2: https://ift.tt/2xtZ81r
SimCLR: https://ift.tt/31TZZTM
CURL: https://ift.tt/2UT4PyG
The Empty Brain: https://ift.tt/1sxGdLp
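
For readers who want to see the mechanics, here is a minimal sketch of one MoCo training step in PyTorch. The names and arguments (moco_step, encoder_q, encoder_k, queue, m, tau) are hypothetical, not the authors' code, and details from the paper such as shuffling BatchNorm and the distributed queue are omitted.

```python
# Minimal sketch of one MoCo step (hypothetical names, assumes PyTorch).
import torch
import torch.nn.functional as F

def moco_step(x_query, x_key, encoder_q, encoder_k, queue, m=0.999, tau=0.07):
    """Queries come from encoder_q (trained by backprop); keys come from
    encoder_k (updated only by momentum); negatives come from a FIFO queue."""
    q = F.normalize(encoder_q(x_query), dim=1)                    # N x C

    with torch.no_grad():
        # Momentum update: theta_k <- m * theta_k + (1 - m) * theta_q
        for p_k, p_q in zip(encoder_k.parameters(), encoder_q.parameters()):
            p_k.data.mul_(m).add_(p_q.data, alpha=1 - m)
        k = F.normalize(encoder_k(x_key), dim=1)                  # N x C (positives)

    # InfoNCE logits: one positive per query, queue entries as negatives
    l_pos = torch.einsum("nc,nc->n", q, k).unsqueeze(1)           # N x 1
    l_neg = torch.einsum("nc,kc->nk", q, queue.clone().detach())  # N x K
    logits = torch.cat([l_pos, l_neg], dim=1) / tau
    labels = torch.zeros(logits.size(0), dtype=torch.long, device=logits.device)
    loss = F.cross_entropy(logits, labels)

    # Enqueue new keys, dequeue the oldest, keeping the queue size fixed
    queue = torch.cat([k, queue], dim=0)[: queue.size(0)]
    return loss, queue
```

Because the key encoder is updated only through the momentum average and the queue decouples the number of negatives from the batch size, gradients flow through the query encoder alone, which is what keeps memory use modest while still providing a large, slowly evolving dictionary of keys.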
