Fine-Tuning a Transformer Model for METTL3 Lead Optimization
Transformers are machine learning models originally developed to translate between natural languages. Recently, a transformer model was trained on medicinal chemistry knowledge in the form of matched molecular pairs extracted from nearly a million bioactive compounds in the ChEMBL database. Here, we customize (i.e., fine-tune) the pretrained model to enhance the affinity and/or metabolic stability of a series of inhibitors of methyltransferase-like protein 3 (METTL3). We first fine-tune the transformer model on a data set of about 500 METTL3 inhibitors with known binding affinities and validate it by retrospective analysis. We then fine-tune the original transformer model to simultaneously optimize binding affinity and metabolic stability in a prospective application. Two of the five METTL3 inhibitors predicted by the multiobjective-optimized model show low-nanomolar potency and higher metabolic stability than the lead compound of the chemical series used for fine-tuning.
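To make the fine-tuning step concrete, the sketch below shows what adapting a pretrained SMILES-to-SMILES transformer on a small matched-molecular-pair data set could look like in practice. It is a minimal illustration using the Hugging Face transformers library, not the pipeline used in this work: the checkpoint name chembl-mmp-transformer, the example SMILES pairs, and all hyperparameters are hypothetical placeholders.

```python
# Minimal sketch: fine-tune a pretrained SMILES-to-SMILES seq2seq transformer
# on matched molecular pairs (starting inhibitor -> improved analog).
# "chembl-mmp-transformer" is a HYPOTHETICAL checkpoint name; the pairs below
# are toy examples standing in for ~500 METTL3 inhibitors with measured data.
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)
from datasets import Dataset

pairs = [
    {"src": "CC(=O)Nc1ccc(O)cc1", "tgt": "CC(=O)Nc1ccc(OC)cc1"},
    {"src": "O=C(O)c1ccccc1",     "tgt": "O=C(N)c1ccccc1"},
    # ... matched molecular pairs from the METTL3 series
]

tokenizer = AutoTokenizer.from_pretrained("chembl-mmp-transformer")  # hypothetical
model = AutoModelForSeq2SeqLM.from_pretrained("chembl-mmp-transformer")

def tokenize(batch):
    # Encode the source SMILES; use the target SMILES token IDs as labels.
    enc = tokenizer(batch["src"], truncation=True, max_length=128)
    enc["labels"] = tokenizer(batch["tgt"], truncation=True, max_length=128)["input_ids"]
    return enc

ds = Dataset.from_list(pairs).map(tokenize, batched=True,
                                  remove_columns=["src", "tgt"])

args = Seq2SeqTrainingArguments(
    output_dir="mettl3-finetuned",
    num_train_epochs=20,        # small data set: more epochs, monitor overfitting
    learning_rate=1e-5,         # low rate preserves the pretrained chemistry prior
    per_device_train_batch_size=16,
)

# The collator pads inputs and labels per batch for seq2seq training.
collator = DataCollatorForSeq2Seq(tokenizer, model=model)
Seq2SeqTrainer(model=model, args=args, train_dataset=ds,
               data_collator=collator).train()
```

The same recipe extends to the multiobjective setting by changing only the training pairs: instead of pairs selected for improved affinity alone, one fine-tunes on pairs in which the target molecule improves on both binding affinity and metabolic stability.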