
CodonBERT: Large Language Models for mRNA Design and Optimization


Bibliographic Details
Published in: bioRxiv, 2023-11
Main Authors: Li, Sizhen, Moayedpour, Saeed, Li, Ruijiang, Bailey, Michael, Riahi, Saleh, Kogler-Anele, Lorenzo, Miladi, Milad, Miner, Jacob, Zheng, Dinghai, Wang, Jun, Balsubramani, Akshay, Tran, Khang, Zacharia, Minnie, Wu, Monica, Gu, Xiaobo, Ryan, Clinton, Asquith, Carla, Skaleski, Joseph, Boeglin, Lianne, Chivukula, Sudha, Dias, Anusha, Ulloa Montoya, Fernando, Agarwal, Vikram, Bar-Joseph, Ziv, Jager, Sven
Format: Article
Language: English
Description
Summary: mRNA-based vaccines and therapeutics are gaining popularity and usage across a wide range of conditions. One of the critical issues when designing such mRNAs is sequence optimization. Even small proteins or peptides can be encoded by an enormously large number of mRNAs. The actual mRNA sequence can have a large impact on several properties, including expression, stability, and immunogenicity. To enable the selection of an optimal sequence, we developed CodonBERT, a large language model (LLM) for mRNAs. Unlike prior models, CodonBERT uses codons as inputs, which enables it to learn better representations. CodonBERT was trained on more than 10 million mRNA sequences from a diverse set of organisms. The resulting model captures important biological concepts. CodonBERT can also be extended to perform prediction tasks for various mRNA properties. CodonBERT outperforms previous mRNA prediction methods, including on a new flu vaccine dataset.

Competing Interest Statement: The authors have declared no competing interest.

Footnotes:
* Revise the author list and correct some citations.
* https://github.com/Sanofi-Public/CodonBERT
DOI: 10.1101/2023.09.09.556981