A resource of free, step-by-step video how-to guides to get you started with machine learning.
Sunday, July 5, 2020
SpineNet: Learning Scale-Permuted Backbone for Recognition and Localization (Paper Explained)
#machinelearning #ai #google

The high-level architecture of CNNs has not really changed over the years: we tend to build high-resolution, low-dimensional layers first, followed by ever coarser but deeper layers. This paper challenges that decades-old heuristic and uses neural architecture search to find an alternative, called SpineNet, which employs multiple rounds of re-scaling and long-range skip connections.

OUTLINE:
0:00 - Intro & Overview
1:00 - Problem Statement
2:30 - The Problem with Current Architectures
8:20 - Scale-Permuted Networks
11:40 - Neural Architecture Search
14:00 - Up- and Downsampling
19:10 - From ResNet to SpineNet
24:20 - Ablations
27:00 - My Idea: Attention Routing for CNNs
29:55 - More Experiments
34:45 - Conclusion & Comments

Paper: https://ift.tt/2YY641E
Code: https://ift.tt/2NdsUhT

Abstract: Convolutional neural networks typically encode an input image into a series of intermediate features with decreasing resolutions. While this structure is well suited to classification tasks, it does not perform well for tasks requiring simultaneous recognition and localization (e.g., object detection). Encoder-decoder architectures have been proposed to resolve this by applying a decoder network on top of a backbone model designed for classification. In this paper, we argue that the encoder-decoder architecture is ineffective at generating strong multi-scale features because of the scale-decreased backbone. We propose SpineNet, a backbone with scale-permuted intermediate features and cross-scale connections that is learned on an object detection task by neural architecture search. Using similar building blocks, SpineNet models outperform ResNet-FPN models by ~3% AP at various scales while using 10-20% fewer FLOPs. In particular, SpineNet-190 achieves 52.5% AP with a Mask R-CNN detector and 52.1% AP with a RetinaNet detector on COCO for a single model without test-time augmentation, significantly outperforming prior state-of-the-art detectors. SpineNet also transfers to classification tasks, achieving a 5% top-1 accuracy improvement on the challenging iNaturalist fine-grained dataset. Code is at: this https URL.

Authors: Xianzhi Du, Tsung-Yi Lin, Pengchong Jin, Golnaz Ghiasi, Mingxing Tan, Yin Cui, Quoc V. Le, Xiaodan Song

Thumbnail art by Lucas Ferreira

Links:
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ift.tt/3dJpBrR
BitChute: https://ift.tt/38iX6OV
Minds: https://ift.tt/37igBpB
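To make the video's central idea concrete, here is a minimal sketch of the two mechanisms discussed at length: cross-scale resampling (up- and downsampling a parent feature map to a target level) and a scale-permuted block that fuses two parents of different resolutions. The official code linked above is in TensorFlow; this sketch uses PyTorch purely for brevity. The class names `Resample` and `ScalePermutedBlock`, the channel widths, and the particular parent levels are illustrative assumptions, not the learned architecture from the paper.

```python
# A minimal sketch, NOT the authors' implementation: cross-scale resampling
# plus a block that fuses two parent features from arbitrary levels.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Resample(nn.Module):
    """Bring a parent feature map to a target level (resolution) and width."""
    def __init__(self, in_ch, out_ch, in_level, out_level):
        super().__init__()
        self.proj = nn.Conv2d(in_ch, out_ch, kernel_size=1)  # match channels
        self.scale = 2 ** (in_level - out_level)              # >1 => upsample

    def forward(self, x):
        x = self.proj(x)
        if self.scale > 1:                                    # coarser -> finer
            x = F.interpolate(x, scale_factor=self.scale, mode="nearest")
        elif self.scale < 1:                                  # finer -> coarser
            x = F.max_pool2d(x, kernel_size=int(1 / self.scale))
        return x

class ScalePermutedBlock(nn.Module):
    """Fuse two parents (possibly at different scales) and refine the sum."""
    def __init__(self, parents, out_ch, out_level):
        # parents: list of (channels, level) for the two parent feature maps
        super().__init__()
        self.resamplers = nn.ModuleList(
            Resample(c, out_ch, lvl, out_level) for c, lvl in parents
        )
        self.refine = nn.Sequential(
            nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, xs):
        fused = sum(r(x) for r, x in zip(self.resamplers, xs))
        return self.refine(fused)

# Toy usage: fuse a level-3 (1/8 resolution) and a level-5 (1/32 resolution)
# feature into a new level-4 block -- an ordering a conventional
# scale-decreasing backbone would never produce.
p3 = torch.randn(1, 64, 32, 32)   # level-3 parent
p5 = torch.randn(1, 128, 8, 8)    # level-5 parent
block = ScalePermutedBlock(parents=[(64, 3), (128, 5)], out_ch=96, out_level=4)
out = block([p3, p5])             # -> shape (1, 96, 16, 16)
```

In the paper, the ordering of the blocks and the choice of the two parents feeding each block are exactly what the neural architecture search optimizes; here they are fixed by hand only to show the data flow.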