A resource of free, step-by-step video how-to guides to get you started with machine learning.
Thursday, July 2, 2020
BERTology Meets Biology: Interpreting Attention in Protein Language Models (Paper Explained)
Proteins are the workhorses of almost all cellular functions and a core component of life. But despite their versatility, all proteins are built as sequences of the same 20 amino acids, and these sequences can be analyzed with tools from NLP. This paper investigates the attention mechanism of a BERT model that has been trained on protein sequence data and discovers that the language model has implicitly learned non-trivial higher-order biological properties of proteins.

OUTLINE:
0:00 - Intro & Overview
1:40 - From DNA to Proteins
5:20 - BERT for Amino Acid Sequences
8:50 - The Structure of Proteins
12:40 - Investigating Biological Properties by Inspecting BERT
17:45 - Amino Acid Substitution
24:55 - Contact Maps
30:15 - Binding Sites
33:45 - Linear Probes
35:25 - Conclusion & Comments

Paper: https://ift.tt/2VGc5y6
Code: https://ift.tt/3idS1N9
My Video on BERT: https://youtu.be/-9evrZnBorM
My Video on Attention: https://youtu.be/iDulhoQ2pro

Abstract:
Transformer architectures have proven to learn useful representations for protein classification and generation tasks. However, these representations present challenges in interpretability. Through the lens of attention, we analyze the inner workings of the Transformer and explore how the model discerns structural and functional properties of proteins. We show that attention (1) captures the folding structure of proteins, connecting amino acids that are far apart in the underlying sequence but spatially close in the three-dimensional structure, (2) targets binding sites, a key functional component of proteins, and (3) focuses on progressively more complex biophysical properties with increasing layer depth. We also present a three-dimensional visualization of the interaction between attention and protein structure. Our findings align with known biological processes and provide a tool to aid discovery in protein engineering and synthetic biology. The code for visualization and analysis is available at this https URL.

Authors: Jesse Vig, Ali Madani, Lav R. Varshney, Caiming Xiong, Richard Socher, Nazneen Fatema Rajani

Links:
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ift.tt/3dJpBrR
BitChute: https://ift.tt/38iX6OV
Minds: https://ift.tt/37igBpB
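To make the kind of analysis discussed in the video more concrete, here is a minimal Python sketch of one of its core steps: run an amino-acid sequence through a BERT-style protein language model, extract the per-head attention maps, and ask how often strongly attended residue pairs coincide with residues that are in contact in the folded structure. This is not the paper's code (their tool is linked above under Code, and they analyze a TAPE-trained BERT); the sketch assumes the publicly available Rostlab/prot_bert checkpoint on Hugging Face, uses a random placeholder contact map instead of real structural data, and uses a simple thresholded fraction rather than the paper's exact alignment metric.

# Minimal sketch (not the paper's code): inspect attention of a protein
# language model and compare it against a (here: dummy) contact map.
# Assumes the "Rostlab/prot_bert" checkpoint, which expects sequences
# of space-separated single-letter amino acids.

import torch
import numpy as np
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("Rostlab/prot_bert", do_lower_case=False)
model = BertModel.from_pretrained("Rostlab/prot_bert", output_attentions=True)
model.eval()

sequence = "M K T A Y I A K Q R Q I S F V K S H F S R Q L E E R L G L I E V Q"  # toy protein
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer,
# each of shape (batch, heads, seq_len, seq_len).
attentions = torch.stack(outputs.attentions).squeeze(1)  # (layers, heads, seq, seq)
# Drop the [CLS]/[SEP] positions so indices line up with residues.
attn = attentions[:, :, 1:-1, 1:-1].numpy()
num_layers, num_heads, L, _ = attn.shape

# Placeholder contact map. In the paper this comes from known 3D structures
# (two residues count as "in contact" if their beta-carbons are within ~8 Å);
# here it is random and only serves to make the loop below runnable.
rng = np.random.default_rng(0)
contact_map = rng.random((L, L)) < 0.05
contact_map = np.triu(contact_map, k=6)          # ignore near-diagonal neighbors
contact_map = contact_map | contact_map.T        # symmetrize

# For each head: of the residue pairs receiving strong attention, what
# fraction are actually in contact? Heads with a high fraction are the
# candidate "contact heads" the paper describes.
threshold = 0.3
for layer in range(num_layers):
    for head in range(num_heads):
        strong = attn[layer, head] > threshold
        if strong.sum() == 0:
            continue
        frac = (strong & contact_map).sum() / strong.sum()
        if frac > 0.25:
            print(f"layer {layer + 1:2d} head {head + 1:2d}: "
                  f"{frac:.0%} of strong attention falls on contacts")

With a real contact map derived from PDB structures in place of the random one, a per-head score like this is what lets you single out the small number of heads that specialize in folding structure, which is what the visualization tool linked above makes explorable in 3D.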