Wednesday, April 8, 2020

Evolving Normalization-Activation Layers


This video explains the latest large-scale AutoML study from researchers at Google and DeepMind. The product of this evolutionary AutoML search is a family of new normalization-activation layers, EvoNorms, that outperform the common practice of Batch Norm followed by ReLU: a ResNet-50 with BN-ReLU achieves 76.1% ImageNet accuracy, whereas the same network with EvoNorm achieves 77.8%. A sketch of one of the discovered layers follows the paper links below. Thanks for watching! Please subscribe!

Paper Links:

Evolving Normalization-Activation Layers: https://ift.tt/34ohoFi
AutoML-Zero: https://ift.tt/2Q4IQCl
Searching for Activation Functions (Swish): https://ift.tt/2gYfK5N
BatchNorm: https://ift.tt/2c7r2m1
StyleGAN2: https://ift.tt/325Ino8
GauGAN (SPADE): https://ift.tt/2CsbLsZ
Evaluation of ReLU: https://ift.tt/2nIWK0K
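To make the "replace BN-ReLU with a single layer" idea concrete, here is a minimal PyTorch sketch of the sample-based EvoNorm-S0 variant as described in the paper, which computes x * sigmoid(v * x) divided by a group-wise standard deviation, followed by a learned affine transform. The group count, epsilon, and class name here are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class EvoNormS0(nn.Module):
    """Sketch of the sample-based EvoNorm-S0 layer (batch-statistics free).

    Intended as a drop-in replacement for a BatchNorm + ReLU pair:
        y = x * sigmoid(v * x) / group_std(x) * gamma + beta
    Group count and eps are illustrative choices, not values from the paper.
    """

    def __init__(self, channels, groups=32, eps=1e-5):
        super().__init__()
        # Assumes `groups` divides `channels`; fall back to per-channel groups otherwise.
        self.groups = groups if channels % groups == 0 else channels
        self.eps = eps
        self.gamma = nn.Parameter(torch.ones(1, channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, channels, 1, 1))

    def group_std(self, x):
        # Standard deviation over each channel group, as in GroupNorm.
        n, c, h, w = x.shape
        grouped = x.view(n, self.groups, c // self.groups, h, w)
        std = torch.sqrt(grouped.var(dim=(2, 3, 4), keepdim=True) + self.eps)
        return std.expand_as(grouped).reshape(n, c, h, w)

    def forward(self, x):
        return x * torch.sigmoid(self.v * x) / self.group_std(x) * self.gamma + self.beta


# Usage: swap a BN-ReLU block for the single EvoNorm-S0 layer.
if __name__ == "__main__":
    layer = EvoNormS0(channels=64)
    out = layer(torch.randn(8, 64, 32, 32))
    print(out.shape)  # torch.Size([8, 64, 32, 32])
```

Because EvoNorm-S0 uses only per-sample group statistics, it avoids the batch-size sensitivity of Batch Norm, which is part of why the paper also evaluates it on tasks like GANs and segmentation (StyleGAN2, GauGAN) where small batches are common.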
