Research articles using hippynn

hippynn implements a variety of methods from the research literature. Some of the earlier research was performed with an older, internal implementation of HIP-NN written in Theano; however, the corresponding capabilities are available in hippynn.

One of the main components of hippynn is its implementation of HIP-NN, the Hierarchically Interacting Particle Neural Network, which was introduced in Lubbers et al. [LSB18] for modeling molecular energies and forces from atomistic configuration data. HIP-NN was also used to help validate potential energy surfaces in Suwa et al. [SSL+19] and Smith et al. [SNM+21], and was later extended to a more flexible functional form, HIP-NN with Tensor Sensitivities (HIP-NN-TS), in Chigaev et al. [CSA+23]. Fedik et al. [FLL+25] critically examined the performance of this improved functional form for transition states and transition path sampling. Matin et al. [MAS+24] demonstrated a method for improving the agreement of potentials with experiment by incorporating experimental structural data. Burrill et al. [BLT+25] showed how a linear combination of semi-empirical and machine learning models can be more powerful than either model alone. Shinkle et al. [SPB+24] demonstrated that HIP-NN can model free energies for coarse-grained models using force matching, and that these many-body models provide improved transferability between thermodynamic states. A more advanced form of the interaction layer, which directly captures high-order many-body information, was developed in Allen et al. [ASBL25]. Knowledge distillation techniques, in which teacher models are used to improve the performance of student models, have been examined using hippynn and found effective in Matin et al. [MAS+25] and Matin et al. [MSP+25]. In Zhang et al. [ZCI+24], a physics-informed constraint on molecular atomization was introduced to improve the performance of models far from equilibrium. A potential for uranium mononitride was investigated in Alzate-Vargas et al. [AVSL+24].
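
For orientation, the sketch below shows how an energy-and-force HIP-NN model of the kind used in these studies can be assembled with hippynn's graph interface. It follows the pattern of the library's example scripts; the hyperparameter values and the db_name keys are illustrative assumptions and must be matched to your own dataset.

```python
from hippynn.graphs import inputs, networks, targets, physics

# Illustrative hyperparameters; suitable values depend on the dataset and application.
network_params = {
    "possible_species": [0, 1, 6, 7, 8],  # atomic numbers, 0 = padding
    "n_features": 20,
    "n_sensitivities": 20,
    "dist_soft_min": 1.6,
    "dist_soft_max": 10.0,
    "dist_hard_max": 12.5,
    "n_interaction_layers": 2,
    "n_atom_layers": 3,
}

species = inputs.SpeciesNode(db_name="Z")      # atomic numbers from the database
positions = inputs.PositionsNode(db_name="R")  # atomic coordinates
network = networks.Hipnn("hipnn_model", (species, positions), module_kwargs=network_params)
energy = targets.HEnergyNode("HEnergy", network, db_name="T")
# Forces come from differentiating the predicted energy with respect to positions.
forces = physics.GradientNode("forces", (energy, positions), sign=-1)
```

A HIP-NN-TS model can be built in the same way by swapping the network class; recent versions of hippynn provide tensor-sensitivity variants such as networks.HipnnVec and networks.HipnnQuad.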

HIP-NN is also useful for modeling properties beyond energies and forces. It was adapted to learn atomic charges in Nebgen et al. [NLS+18] and to infer charge models from dipole information in Sifain et al. [SLN+18]. Bond order regression to predict two-body quantities was explored in Magedov et al. [MKM+21]. The atom-wise (charge) and two-body (bond) regressions were combined to build Hückel-type quantum Hamiltonians in Zubatiuk et al. [ZNL+21]. This was extended to semi-empirical Hamiltonians in Zhou et al. [ZLB+22] by combining the facilities of hippynn with another pytorch code, PYSEQM, developed by Zhou et al. [ZNL+20], which provides quantum calculations that are differentiable in pytorch. In Li et al. [LKK+25], a lighter-weight version of this framework based on classical equilibrated charges was employed, with a shadow dynamics technique to avoid the cost and numerical difficulties associated with self-consistency conditions.
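
As a rough illustration of how such property regressions attach to the same network, the sketch below adds per-atom charges and a charge-derived dipole to a HIP-NN graph. The node names and db_name keys follow the pattern of hippynn's example scripts but are assumptions here, and the availability and signature of DipoleNode should be checked against your hippynn version.

```python
from hippynn.graphs import inputs, networks, targets, physics

# Same kind of hyperparameter dictionary as in the energy/force sketch above.
network_params = dict(possible_species=[0, 1, 6, 7, 8], n_features=20, n_sensitivities=20,
                      dist_soft_min=1.6, dist_soft_max=10.0, dist_hard_max=12.5,
                      n_interaction_layers=2, n_atom_layers=3)

species = inputs.SpeciesNode(db_name="Z")
positions = inputs.PositionsNode(db_name="R")
network = networks.Hipnn("hipnn_model", (species, positions), module_kwargs=network_params)

# Per-atom charge predictions, as in Nebgen et al. [NLS+18].
charges = targets.HChargeNode("HCharge", network, db_name="charges")
# A molecular dipole assembled from the predicted charges and positions allows
# training against dipole data alone, as in Sifain et al. [SLN+18].
dipole = physics.DipoleNode("dipole", (charges, positions), db_name="dipole")
```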

Another avenue of work has been modeling excited-state dynamics with HIP-NN. In Sifain et al. [SLM+21], a localization layer was used to predict both the energy and the location of singlet-triplet excitations in organic materials. In Habib et al. [HLTN23], HIP-NN was used in a dynamical setting to learn the dynamics of plasmons in silver nanoparticles; in this mode, the predictions of the model provide the inputs for the next time step, and training takes place by backpropagating through multiple steps of prediction. Li et al. [LLT+24] used the framework to predict several excited-state properties: energies, transition dipoles, and non-adiabatic coupling vectors for several excited states of a molecular system.

[ASBL25]

Alice EA Allen, Emily Shinkle, Roxana Bujack, and Nicholas Lubbers. Optimal invariant bases for atomistic machine learning. arXiv preprint arXiv:2503.23515, 2025.

[AVSL+24]

Lorena Alzate-Vargas, Kashi N Subedi, Nicholas Lubbers, Michael WD Cooper, Roxanne M Tutchton, Tammie Gibson, and Richard A Messerly. Machine learning interatomic potential for modeling uranium mononitride. arXiv preprint arXiv:2411.14608, 2024.

[BLT+25]

Daniel J. Burrill, Chang Liu, Michael G. Taylor, Marc J. Cawkwell, Danny Perez, Enrique R. Batista, Nicholas Lubbers, and Ping Yang. MLTB: enhancing transferability and extensibility of density functional tight-binding theory with many-body interaction corrections. Journal of Chemical Theory and Computation, 21(3):1089–1097, 2025.

[CSA+23]

Michael Chigaev, Justin S Smith, Steven Anaya, Benjamin Nebgen, Matthew Bettencourt, Kipton Barros, and Nicholas Lubbers. Lightweight and effective tensor sensitivity for atomistic neural networks. The Journal of Chemical Physics, 2023.

[FLL+25]

Nikita Fedik, Wei Li, Ying Wai Li, Nicholas Lubbers, Benjamin Nebgen, and Sergei Tretiak. Challenges and opportunities for machine learning potentials in transition path sampling: alanine dipeptide and azobenzene studies. Digital Discovery, 2025.

[HLTN23]

Adela Habib, Nicholas Lubbers, Sergei Tretiak, and Benjamin Nebgen. Machine learning models capture plasmon dynamics in Ag nanoparticles. The Journal of Physical Chemistry A, 127(17):3768–3778, 2023.

[LKK+25]

Cheng-Han Li, Mehmet Cagri Kaymak, Maksim Kulichenko, Nicholas Lubbers, Benjamin T Nebgen, Sergei Tretiak, Joshua Finkelstein, Daniel P Tabor, and Anders MN Niklasson. Shadow molecular dynamics with a machine learned flexible charge potential. Journal of Chemical Theory and Computation, 2025.

[LLT+24]

Xinyang Li, Nicholas Lubbers, Sergei Tretiak, Kipton Barros, and Yu Zhang. Machine learning framework for modeling exciton polaritons in molecular materials. Journal of Chemical Theory and Computation, 20(2):891–901, 2024.

[LSB18]

Nicholas Lubbers, Justin S Smith, and Kipton Barros. Hierarchical modeling of molecular energies using a deep neural network. The Journal of Chemical Physics, 2018.

[MKM+21]

Sergey Magedov, Christopher Koh, Walter Malone, Nicholas Lubbers, and Benjamin Nebgen. Bond order predictions using deep neural networks. Journal of Applied Physics, 2021.

[MAS+25]

Sakib Matin, Alice Allen, Emily Shinkle, Aleksandra Pachalieva, Galen T Craven, Benjamin Nebgen, Justin Smith, Richard Messerly, Ying Wai Li, Sergei Tretiak, and others. Teacher-student training improves accuracy and efficiency of machine learning inter-atomic potentials. arXiv preprint arXiv:2502.05379, 2025.

[MAS+24]

Sakib Matin, Alice EA Allen, Justin Smith, Nicholas Lubbers, Ryan B Jadrich, Richard Messerly, Benjamin Nebgen, Ying Wai Li, Sergei Tretiak, and Kipton Barros. Machine learning potentials with the iterative Boltzmann inversion: training to experiment. Journal of Chemical Theory and Computation, 20(3):1274–1281, 2024.

[MSP+25]

Sakib Matin, Emily Shinkle, Yulia Pimonova, Galen T Craven, Ying Wai Li, Kipton Barros, and Nicholas Lubbers. Ensemble knowledge distillation for machine learning interatomic potentials. arXiv preprint arXiv:2503.14293, 2025.

[NLS+18]

Benjamin Nebgen, Nicholas Lubbers, Justin S Smith, Andrew E Sifain, Andrey Lokhov, Olexandr Isayev, Adrian E Roitberg, Kipton Barros, and Sergei Tretiak. Transferable dynamic molecular charge assignment using deep neural networks. Journal of Chemical Theory and Computation, 14(9):4687–4698, 2018.

[SPB+24]

Emily Shinkle, Aleksandra Pachalieva, Riti Bahl, Sakib Matin, Brendan Gifford, Galen T Craven, and Nicholas Lubbers. Thermodynamic transferability in coarse-grained force fields using graph neural networks. arXiv preprint arXiv:2406.12112, 2024.

[SLN+18]

Andrew E Sifain, Nicholas Lubbers, Benjamin T Nebgen, Justin S Smith, Andrey Y Lokhov, Olexandr Isayev, Adrian E Roitberg, Kipton Barros, and Sergei Tretiak. Discovering a transferable charge assignment model using machine learning. The Journal of Physical Chemistry Letters, 9(16):4495–4501, 2018.

[SLM+21]

Andrew E Sifain, Levi Lystrom, Richard A Messerly, Justin S Smith, Benjamin Nebgen, Kipton Barros, Sergei Tretiak, Nicholas Lubbers, and Brendan J Gifford. Predicting phosphorescence energies and inferring wavefunction localization with machine learning. Chemical Science, 12(30):10207–10217, 2021.

[SNM+21]

Justin S Smith, Benjamin Nebgen, Nithin Mathew, Jie Chen, Nicholas Lubbers, Leonid Burakovsky, Sergei Tretiak, Hai Ah Nam, Timothy Germann, Saryu Fensin, and others. Automated discovery of a robust interatomic potential for aluminum. Nature Communications, 12(1):1257, 2021.

[SSL+19]

Hidemaro Suwa, Justin S Smith, Nicholas Lubbers, Cristian D Batista, Gia-Wei Chern, and Kipton Barros. Machine learning for molecular dynamics with strongly correlated electrons. Physical Review B, 99(16):161107, 2019.

[ZCI+24]

Shuhao Zhang, Michael Chigaev, Olexandr Isayev, Richard Messerly, and Nicholas Lubbers. Including physics-informed atomization constraints in neural networks for reactive chemistry. ChemRxiv preprint, 2024.

[ZLB+22]

Guoqing Zhou, Nicholas Lubbers, Kipton Barros, Sergei Tretiak, and Benjamin Nebgen. Deep learning of dynamically responsive chemical Hamiltonians with semiempirical quantum mechanics. Proceedings of the National Academy of Sciences, 119(27):e2120333119, 2022.

[ZNL+20]

Guoqing Zhou, Ben Nebgen, Nicholas Lubbers, Walter Malone, Anders MN Niklasson, and Sergei Tretiak. Graphics processing unit-accelerated semiempirical Born-Oppenheimer molecular dynamics using PyTorch. Journal of Chemical Theory and Computation, 16(8):4951–4962, 2020.

[ZNL+21]

Tetiana Zubatiuk, Benjamin Nebgen, Nicholas Lubbers, Justin S Smith, Roman Zubatyuk, Guoqing Zhou, Christopher Koh, Kipton Barros, Olexandr Isayev, and Sergei Tretiak. Machine learned Hückel theory: interfacing physics and deep neural networks. The Journal of Chemical Physics, 2021.