The discovery and optimization of materials is a key bottleneck in developing and deploying better energy storage and conversion technologies. Finding or designing novel materials that meet the full array of physicochemical requirements for battery electrolytes is a challenging multivariate optimization problem. Conventionally, electrolyte discovery has been carried out experimentally or with ab initio thermodynamic calculations; these methods are time- and resource-intensive, making them intractable in large, high-dimensional chemical spaces. Foundation Models (FMs) offer a solution to both the exploration and evaluation problems. These models use self-supervised pre-training to leverage unlabeled datasets and learn data representations that can then be applied to a wide range of downstream tasks. Prior attempts to train FMs for molecular property prediction show promise; however, equivariant geometric models trained with supervised machine learning remain more accurate and state-of-the-art. Our project will train a foundation model on the largest available chemical dataset (49B molecules, ~50x larger than prior attempts) using wafer-scale computing to achieve accuracy comparable to quantum mechanical computational methods. We will then fine-tune the model on the PubChemQC dataset for tasks relevant to electrolyte design. In addition, our research aims to explore how emergent behavior observed in large-scale training of FMs can lead to improved electrolyte compositions. Furthermore, we will develop neural scaling laws for compute-optimal training of molecular FMs on distributed GPU and wafer-scale clusters, enabling researchers to make reasonable estimates of the hyperparameters for future molecular FMs.
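To illustrate what a compute-optimal scaling law provides in practice, the sketch below assumes a Chinchilla-style functional form L(N, D) = E + A/N^α + B/D^β and searches for the parameter/data split that minimizes predicted loss under a fixed FLOP budget C ≈ 6·N·D. The coefficients are hypothetical placeholders for demonstration only, not values fitted to any molecular-FM data.

```python
import numpy as np

# Assumed Chinchilla-style scaling form with illustrative (not fitted)
# coefficients: L(N, D) = E + A / N**alpha + B / D**beta.
E, A, B, alpha, beta = 1.69, 406.4, 410.7, 0.34, 0.28

def loss(N, D):
    """Predicted pre-training loss for N parameters and D training tokens."""
    return E + A / N**alpha + B / D**beta

def compute_optimal(C):
    """Grid-search the (N, D) split minimizing loss under C ~ 6*N*D FLOPs."""
    Ns = np.logspace(6, 12, 2000)   # candidate model sizes (1M to 1T params)
    Ds = C / (6.0 * Ns)             # token counts implied by the budget
    i = int(np.argmin(loss(Ns, Ds)))
    return Ns[i], Ds[i]

N_opt, D_opt = compute_optimal(1e21)  # optimal split for a 1e21-FLOP budget
```

In a real study, the coefficients would be fitted (e.g. with least squares) to losses measured across a sweep of model and dataset sizes; the fitted law then extrapolates hyperparameter choices for larger training runs.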
2023
Foundation Models at Quantum Mechanical Accuracy For Battery Electrolyte Design using Wafer-Scale Computing
Other Researchers
Venkat Viswanathan (Aerospace Engineering)
Vikram Gavini (Mechanical Engineering)