Fast and Fourier features for transfer learning of interatomic potentials

P Novelli and G Meanti and PJ Buigues and L Rosasco and M Parrinello and M Pontil and L Bonati, NPJ COMPUTATIONAL MATERIALS, 11, 293 (2025).

DOI: 10.1038/s41524-025-01779-z

Training machine learning interatomic potentials that are both computationally and data-efficient is a key challenge for enabling their routine use in atomistic simulations. To this end, we introduce franken, a scalable and lightweight transfer learning framework that extracts atomic descriptors from pre-trained graph neural networks and transfers them to new systems using random Fourier features, an efficient and scalable approximation of kernel methods. It also provides a closed-form fine-tuning strategy for general-purpose potentials such as MACE-MP0, enabling fast and accurate adaptation to new systems or levels of quantum mechanical theory with minimal hyperparameter tuning. On a benchmark dataset of 27 transition metals, franken outperforms optimized kernel-based methods in both training time and accuracy, reducing model training from tens of hours to minutes on a single GPU. We further demonstrate the framework's strong data efficiency by training stable and accurate potentials for bulk water and the Pt(111)/water interface using just tens of training structures. Our open-source implementation (https://franken.readthedocs.io) offers a fast and practical solution for training potentials and deploying them for molecular dynamics simulations across diverse systems.
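The abstract's key ingredient, random Fourier features, approximates an RBF kernel with an explicit finite-dimensional feature map, so a kernel regression problem reduces to ordinary ridge regression with a closed-form solution. The following is a minimal generic sketch of that technique in NumPy on toy data; it does not use the franken API, and all names (`rff_features`, the toy inputs) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features=256, gamma=1.0, rng=rng):
    """Map X of shape (n, d) to random Fourier features whose inner
    products approximate the RBF kernel exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Frequencies sampled from the kernel's spectral density N(0, 2*gamma*I)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy regression standing in for "descriptors -> energies":
X = rng.normal(size=(200, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

Z = rff_features(X)  # explicit features, shape (200, 256)
lam = 1e-3
# Closed-form ridge solution in the random feature space
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
pred = Z @ w
```

Because the feature map is explicit, training cost scales with the number of features rather than quadratically with the number of atomic environments, which is the source of the speedups the abstract reports.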
