Hybrid variable spiking graph neural networks for energy-efficient scientific machine learning
I Jain and S Garg and S Shriyam and S Chakraborty, JOURNAL OF THE MECHANICS AND PHYSICS OF SOLIDS, 200, 106152 (2025).
DOI: 10.1016/j.jmps.2025.106152
Graph-based representations of samples in computational mechanics-related datasets can prove instrumental when dealing with problems such as irregular domains or molecular structures of materials. To effectively analyze and process such datasets, deep learning offers Graph Neural Networks (GNNs), which employ techniques like message-passing within their architecture. The issue, however, is that as individual graphs scale and/or the GNN architecture becomes increasingly complex, the growing energy budget of the overall deep learning model makes it unsustainable and restricts its use in settings such as edge computing. To overcome this, we propose Variable Spiking Graph Neural Networks (VS-GNNs) and their hybrid variants, collectively termed VS-GNN architectures, which utilize Variable Spiking Neurons (VSNs) to promote sparse communication and hence reduce the overall energy budget. VSNs, while promoting sparse event-driven computations, also perform well in regression tasks, which are often encountered in computational mechanics applications and are the main target of this paper. Three examples dealing with the prediction of mechanical properties of materials from their microscale/mesoscale structures are presented to test the performance of the proposed VS-GNN architectures in regression tasks. We compare the performance of VS-GNN architectures with that of vanilla GNNs, GNNs utilizing leaky integrate-and-fire neurons, and GNNs utilizing recurrent leaky integrate-and-fire neurons. The results show that VS-GNN architectures perform well in regression tasks, all while promoting sparse communication and, hence, energy efficiency.
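To make the underlying idea concrete, the sketch below shows one way a sparse spiking activation could be combined with a simple message-passing layer in PyTorch. This is an illustrative assumption, not the VS-GNN of the paper: the `VariableSpikingActivation`, `SpikingGraphLayer`, and the learnable leak/threshold parameters are hypothetical stand-ins for the Variable Spiking Neurons described above, and the hard threshold would normally require a surrogate gradient during training.

```python
import torch
import torch.nn as nn


class VariableSpikingActivation(nn.Module):
    """Hypothetical variable-spiking activation (not the authors' VSN):
    a leaky integrator that emits a graded output only when its membrane
    potential crosses a threshold, so most activations stay exactly zero."""
    def __init__(self, features):
        super().__init__()
        self.leak = nn.Parameter(torch.full((features,), 0.5))       # assumed learnable leakage
        self.threshold = nn.Parameter(torch.full((features,), 1.0))  # assumed firing threshold

    def forward(self, x):
        # Single-step membrane update (illustrative): potential = sigmoid(leak) * input.
        potential = torch.sigmoid(self.leak) * x
        # Hard threshold; a surrogate gradient would be needed to train it, omitted here.
        fire = (potential >= self.threshold).float()
        # Graded spike: the potential is passed on only where the neuron fires.
        return fire * potential


class SpikingGraphLayer(nn.Module):
    """One message-passing step: mean-aggregate neighbour features,
    apply a linear transform, then the sparse spiking activation."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.act = VariableSpikingActivation(out_features)

    def forward(self, x, adj):
        # adj: dense (N, N) adjacency with self-loops; normalise by node degree.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = adj @ x / deg  # mean aggregation over neighbours
        return self.act(self.linear(h))


if __name__ == "__main__":
    # Tiny usage example: a 4-node chain graph, 8 input features, scalar regression output.
    adj = torch.eye(4) + torch.tensor(
        [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=torch.float
    )
    x = torch.randn(4, 8)
    layer = SpikingGraphLayer(8, 16)
    readout = nn.Linear(16, 1)
    h = layer(x, adj)
    y_pred = readout(h.mean(dim=0))  # graph-level regression prediction
    print(y_pred.shape, "fraction of zero activations:", (h == 0).float().mean().item())
```

The sparsity reported at the end is what a spiking formulation trades on: zero activations need not be communicated or accumulated downstream, which is the source of the energy savings the abstract refers to.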