Saturday, July 4, 2020

Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention (Paper Explained)


#ai #attention #transformer #deeplearning

Transformers are famous for two things: their superior performance and their insane requirements of compute and memory. This paper reformulates the attention mechanism in terms of kernel functions and obtains a linear formulation, which reduces these requirements. Surprisingly, this formulation also surfaces an interesting connection between autoregressive transformers and RNNs.

OUTLINE:
0:00 - Intro & Overview
1:35 - Softmax Attention & Transformers
8:40 - Quadratic Complexity of Softmax Attention
9:40 - Generalized Attention Mechanism
13:45 - Kernels
20:40 - Linear Attention
25:20 - Experiments
28:30 - Intuition on Linear Attention
33:55 - Connecting Autoregressive Transformers and RNNs
41:30 - Caveats with the RNN connection
46:00 - More Results & Conclusion

Paper: https://ift.tt/3g9qKtf
Website: https://ift.tt/2YO6lV3
Code: https://ift.tt/38rVYsZ

My Video on Attention: https://youtu.be/iDulhoQ2pro
My Video on BERT: https://youtu.be/-9evrZnBorM

Abstract:
Transformers achieve remarkable performance in several tasks but, due to their quadratic complexity with respect to the input's length, they are prohibitively slow for very long sequences. To address this limitation, we express the self-attention as a linear dot-product of kernel feature maps and make use of the associativity property of matrix products to reduce the complexity from O(N^2) to O(N), where N is the sequence length. We show that this formulation permits an iterative implementation that dramatically accelerates autoregressive transformers and reveals their relationship to recurrent neural networks. Our linear transformers achieve similar performance to vanilla transformers and they are up to 4000x faster on autoregressive prediction of very long sequences.

Authors: Angelos Katharopoulos, Apoorv Vyas, Nikolaos Pappas, François Fleuret

Links:
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ift.tt/3dJpBrR
BitChute: https://ift.tt/38iX6OV
Minds: https://ift.tt/37igBpB
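
To make the key trick concrete, here is a minimal sketch of non-causal linear attention in PyTorch. It is not the authors' optimized implementation (see the Code link above for that); the function names, tensor shapes, and the epsilon value are illustrative. It uses the positive feature map the paper suggests, phi(x) = elu(x) + 1, and exploits associativity: the keys are contracted with the values first, so the N x N attention matrix is never formed.

import torch
import torch.nn.functional as F

def feature_map(x):
    # Positive feature map suggested in the paper: phi(x) = elu(x) + 1.
    return F.elu(x) + 1.0

def linear_attention(Q, K, V, eps=1e-6):
    # Q, K: (batch, N, dim), V: (batch, N, dim_v).
    # Cost is O(N * dim * dim_v) instead of the O(N^2 * dim) of softmax attention.
    Q, K = feature_map(Q), feature_map(K)
    # Associativity of matrix products: contract keys with values first,
    # giving a (batch, dim, dim_v) tensor -- no N x N matrix is ever built.
    KV = torch.einsum("bnd,bnv->bdv", K, V)
    # Normalizer: phi(Q) dotted with the sum of phi(K) over the sequence.
    Z = 1.0 / (torch.einsum("bnd,bd->bn", Q, K.sum(dim=1)) + eps)
    return torch.einsum("bnd,bdv,bn->bnv", Q, KV, Z)

Standard softmax attention would instead compute softmax(Q K^T / sqrt(dim)) V, which materializes the full N x N matrix and gives the quadratic cost discussed at 8:40.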
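
The RNN connection (33:55 in the video) comes from the causal, autoregressive version of the same computation: because the mask only lets each position attend to the past, the two sums above can be carried along as a recurrent state and updated one token at a time. A sketch of that view, again with illustrative names and a single unbatched token per step:

import torch
import torch.nn.functional as F

class LinearAttentionRNNCell:
    # State: S accumulates phi(k_i) v_i^T, z accumulates phi(k_i).
    # Each generation step costs a constant O(dim * dim_v), no matter how
    # many tokens have been produced so far.
    def __init__(self, dim, dim_v, eps=1e-6):
        self.S = torch.zeros(dim, dim_v)
        self.z = torch.zeros(dim)
        self.eps = eps

    def step(self, q, k, v):
        # q, k: (dim,), v: (dim_v,) for the current token.
        phi_q = F.elu(q) + 1.0
        phi_k = F.elu(k) + 1.0
        self.S = self.S + torch.outer(phi_k, v)
        self.z = self.z + phi_k
        # Identical to masked linear attention over all tokens seen so far.
        return (phi_q @ self.S) / (phi_q @ self.z + self.eps)

This constant per-token cost is what the abstract's claim of up to 4000x faster autoregressive prediction refers to; the caveats of the RNN view discussed at 41:30 still apply.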
