Predicting a Protein’s Stability under a Million Mutations: Abstract & Introduction

by The Mutation Publication, March 12th, 2024

Too Long; Didn't Read

Protein engineering is the discipline of mutating a natural protein sequence to improve properties for industrial and pharmaceutical applications.

This paper is available on arXiv under a CC 4.0 license.

Authors:

(1) Jeffrey Ouyang-Zhang, UT Austin

(2) Daniel J. Diaz, UT Austin

(3) Adam R. Klivans, UT Austin

(4) Philipp Krähenbühl, UT Austin

Abstract

Stabilizing proteins is a foundational step in protein engineering. However, the evolutionary pressure under which all extant proteins arose makes it challenging to identify the few mutations that will improve thermodynamic stability. Deep learning has recently emerged as a powerful tool for identifying promising mutations. Existing approaches, however, are computationally expensive, as the number of model inferences scales with the number of mutations queried. Our main contribution is a simple, parallel decoding algorithm. Mutate Everything predicts the effect of all single and double mutations in one forward pass, and it is versatile enough to predict higher-order mutations with minimal computational overhead. We build Mutate Everything on top of ESM2 and AlphaFold, neither of which was trained to predict thermodynamic stability. Trained on the MegaScale cDNA proteolysis dataset, it achieves state-of-the-art performance on single and higher-order mutations on the S669, ProTherm, and ProteinGym datasets. Our code is available at https://github.com/jozhang97/MutateEverything.


1 Introduction

Protein engineering is the discipline of mutating a natural protein sequence to improve its properties for industrial [5, 78] and pharmaceutical applications [2, 25, 41]. However, evolution simultaneously optimizes several properties of a protein within its native environment, resulting in proteins with marginal thermodynamic stability (∼5-15 kcal/mol) [39] that become non-functional in an industrial setting. Therefore, repurposing a natural protein for biotechnological applications usually begins with identifying non-deleterious mutations that stabilize the structure. With a stabilized structure, downstream engineering goals, which often require exploring destabilizing mutations, become tractable. Historically, this process has been bottlenecked by the need for extensive laboratory characterization of rational designs [33, 79] or directed evolution libraries [4, 12, 21, 22]. The recent explosion in biological data [9, 29, 72, 73] has enabled deep learning frameworks to accelerate the identification of stabilizing mutations.


A successfully stabilized protein often requires several mutations, yet current frameworks do not account for epistatic interactions between multiple mutations. To significantly accelerate protein engineering, it is therefore critical to efficiently navigate the epistatic landscape of higher-order mutations [10, 69]. Due to its combinatorial nature, however, thorough exploration quickly becomes computationally prohibitive.


In this paper, we introduce Mutate Everything to directly predict changes in thermodynamic stability (∆∆G) for all single and higher-order mutations jointly. Mutate Everything is a parallel decoding algorithm that works in conjunction with a sequence model for predicting thermodynamic stability.



Figure 1: Mutate Everything efficiently predicts ∆∆G, the change in thermodynamic stability of folding, for over a million mutations (e.g., all single and double mutations) in a single inference step. This helps identify and prioritize stabilizing mutations (∆∆G < 0) in protein engineering efforts. A mutation is written as {original amino acid}{position}{new amino acid}; e.g., "P1K" mutates from P to K at position 1.
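To make the "over a million mutations" scale concrete, here is a quick back-of-the-envelope count (an illustrative example of ours, not a figure from the paper) for a hypothetical protein of 100 residues: the single and double mutants alone already number well over a million.

```python
from math import comb

L = 100            # illustrative protein length (residues); not a value from the paper
AA = 20            # standard amino acids, so 19 possible substitutions per position

singles = L * (AA - 1)                   # every position mutated to every other residue
doubles = comb(L, 2) * (AA - 1) ** 2     # every pair of positions, 19 x 19 target choices

print(f"single mutants: {singles:,}")    # 1,900
print(f"double mutants: {doubles:,}")    # 1,786,950
print(f"total:          {singles + doubles:,}")  # 1,788,850
```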


Prior models for predicting thermodynamic stability work by producing an embedding, or representation, of the input sequence. Our decoder takes this representation and uses a linear network to create further representations, one for every possible single mutation. These mutation-level representations are then aggregated to form representations for higher-order mutations. We feed these higher-order representations into a lightweight multi-layer perceptron (MLP) head that outputs predicted ∆∆G values. Since the mutation-level representations are computed only once, we are able to scale to millions of (higher-order) mutations, as the aggregation and MLP-head computations are inexpensive. As such, given a fixed computational budget, our model evaluates millions more potential mutations than prior methods.
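The sketch below illustrates this decoding scheme in PyTorch. It is a minimal sketch under our own assumptions, not the authors' implementation (see their repository for that): all module, tensor, and parameter names here (ParallelMutationDecoder, d_backbone, d_mut, mutation_sets) are hypothetical. Per-residue backbone features are projected once into one embedding per (position, target amino acid) pair, mutation sets are aggregated by summing the relevant embeddings, and a small shared MLP maps each aggregate to a scalar ∆∆G.

```python
import torch
import torch.nn as nn


class ParallelMutationDecoder(nn.Module):
    """Sketch: decode per-residue backbone features into ∆∆G for many mutation sets."""

    def __init__(self, d_backbone: int, d_mut: int = 128, n_aa: int = 20):
        super().__init__()
        # One linear projection yields an embedding for every (position, target AA) pair.
        self.to_mutation = nn.Linear(d_backbone, n_aa * d_mut)
        self.d_mut, self.n_aa = d_mut, n_aa
        # Lightweight MLP head shared across all (higher-order) mutations.
        self.head = nn.Sequential(nn.Linear(d_mut, d_mut), nn.ReLU(), nn.Linear(d_mut, 1))

    def forward(self, residue_feats, mutation_sets):
        # residue_feats: (L, d_backbone) features for one protein, computed once by the backbone.
        L = residue_feats.shape[0]
        # (L, n_aa, d_mut): one representation for every possible single mutation.
        single_reprs = self.to_mutation(residue_feats).view(L, self.n_aa, self.d_mut)

        ddg = []
        for mutations in mutation_sets:  # each set is [(position, target_aa_index), ...]
            # Aggregate the single-mutation representations (here: a simple sum).
            agg = torch.stack([single_reprs[pos, aa] for pos, aa in mutations]).sum(dim=0)
            ddg.append(self.head(agg).squeeze(-1))
        return torch.stack(ddg)  # one predicted ∆∆G per queried mutation set


# Example usage with random stand-in features (in place of ESM2/AlphaFold embeddings):
decoder = ParallelMutationDecoder(d_backbone=1280)
feats = torch.randn(100, 1280)
print(decoder(feats, [[(0, 11)], [(0, 11), (4, 2)]]))  # one single and one double mutant
```

The Python loop over mutation sets is kept for readability; in practice the aggregation would be batched so that millions of candidate sets are scored in a single vectorized pass.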


Mutate Everything estimates the effects of all single and double mutations for a single protein in seconds on one GPU. At this speed, the stability of all single and double mutants across all ∼20,000 proteins in the human proteome can be computed within a short time frame. To the best of our knowledge, Mutate Everything is the first tool that makes the computational analysis of double mutants across the entire proteome tractable. Mutate Everything can be used in conjunction with any model that generates representations for protein sequences. In this paper, we use AlphaFold [31] as the backbone architecture and fine-tune it on a dataset of experimentally derived ∆∆G measurements. The result is the first accurate stability-prediction model based on AlphaFold.


We evaluate Mutate Everything on well-established benchmarks for predicting experimentally validated ∆∆G measurements of mutations: ProTherm [48], S669 [54], and ProteinGym [49]. On ProTherm higher-order mutations (PTMul), our model achieves a Spearman correlation of 0.53, compared to 0.50 for the next best method. On S669 single mutations, our model achieves a Spearman correlation of 0.56, outperforming the state of the art at 0.53. On ProteinGym, Mutate Everything improves over state-of-the-art methods from 0.48 to 0.49. Our model also makes finding the most stabilizing double mutations computationally tractable: on the double-mutant subset of the cDNA proteolysis dataset (cDNA2) [72], where only 3% of known mutations are stabilizing, Mutate Everything effectively ranks stabilizing mutations ahead of destabilizing ones.
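For readers unfamiliar with the metric, the Spearman scores above measure rank agreement between predicted and experimentally measured ∆∆G values. A minimal sketch of how such a score is computed, using SciPy and made-up toy numbers (not data from the paper):

```python
from scipy.stats import spearmanr

# Toy example: experimental vs. predicted ∆∆G (kcal/mol) for five mutations.
ddg_experimental = [-0.8, 0.3, 1.5, 2.1, -0.2]
ddg_predicted    = [-0.5, 0.6, 1.1, 2.4,  0.1]

rho, p_value = spearmanr(ddg_experimental, ddg_predicted)
print(f"Spearman rho = {rho:.2f}")  # 1.00 here, since the two rankings agree exactly
```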