Thursday, April 9, 2020

Evolving Normalization-Activation Layers


Normalization and activation layers have seen a long history of hand-crafted variants with varying degrees of success. This paper proposes an evolutionary search to determine the ultimate, final and best combined normalization-activation layer... in a very specific setting.

https://ift.tt/3aTX6Gr

Abstract: Normalization layers and activation functions are critical components in deep neural networks that frequently co-locate with each other. Instead of designing them separately, we unify them into a single computation graph, and evolve its structure starting from low-level primitives. Our layer search algorithm leads to the discovery of EvoNorms, a set of new normalization-activation layers that go beyond existing design patterns. Several of these layers enjoy the property of being independent from batch statistics. Our experiments show that EvoNorms not only excel on a variety of image classification models including ResNets, MobileNets and EfficientNets, but also transfer well to Mask R-CNN for instance segmentation and BigGAN for image synthesis, outperforming BatchNorm and GroupNorm based layers by a significant margin in many cases.

Authors: Hanxiao Liu, Andrew Brock, Karen Simonyan, Quoc V. Le

Links:
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
BitChute: https://ift.tt/38iX6OV
Minds: https://ift.tt/37igBpB
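
To make the "batch-independent" property concrete, here is a minimal PyTorch-style sketch of one of the discovered sample-based layers, EvoNorm-S0, which computes y = x * sigmoid(v * x) / group_std(x) followed by an affine transform. The class name, default hyperparameters, and NCHW tensor layout are assumptions for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class EvoNormS0(nn.Module):
    """Sketch of a sample-based EvoNorm-S0 layer:
    y = x * sigmoid(v * x) / group_std(x) * gamma + beta.
    Statistics are computed per sample over channel groups,
    so no batch statistics (and no running averages) are needed."""

    def __init__(self, num_channels, groups=32, eps=1e-5):
        super().__init__()
        assert num_channels % groups == 0, "channels must be divisible by groups"
        self.groups = groups
        self.eps = eps
        # Per-channel affine parameters and the learned nonlinearity scale v.
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))
        self.v = nn.Parameter(torch.ones(1, num_channels, 1, 1))

    def group_std(self, x):
        # Standard deviation over each (channel group, H, W) slice of a sample.
        n, c, h, w = x.shape
        xg = x.reshape(n, self.groups, c // self.groups, h, w)
        var = xg.var(dim=(2, 3, 4), keepdim=True, unbiased=False)
        std = torch.sqrt(var + self.eps)
        return std.expand_as(xg).reshape(n, c, h, w)

    def forward(self, x):
        return x * torch.sigmoid(self.v * x) / self.group_std(x) * self.gamma + self.beta


# Usage: drop-in replacement for a BatchNorm + activation pair on NCHW features.
layer = EvoNormS0(num_channels=64)
out = layer(torch.randn(8, 64, 32, 32))
```

Because every statistic is computed within a single example, the layer behaves identically at training and inference time and is insensitive to batch size, which is the main practical advantage over BatchNorm-based designs.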
