Research articles using hippynn

hippynn implements a variety of methods from the research literature. Some of the earlier research was performed with an older, internal implementation of HIP-NN based on Theano; however, the corresponding capabilities are available in hippynn.

One of the main components of hippynn is its implementation of HIP-NN, the Hierarchical Interacting Particle Neural Network, which was introduced in Lubbers et al. [LSB18] for modeling molecular energies and forces from atomistic configuration data. HIP-NN was also used to help validate results for potential energy surfaces in Suwa et al. [SSL+19] and Smith et al. [SNM+21], and was later extended to a more flexible functional form, HIP-NN with Tensor Sensitivities (HIP-NN-TS), in Chigaev et al. [CSA+23]. Matin et al. [MAS+24] demonstrated a method for improving the agreement of potentials with experiment by incorporating experimental structural data. Burrill et al. [BLT+25] showed how a linear combination of semi-empirical and machine learning models can be more powerful than either model alone. Shinkle et al. [SPB+24] demonstrated that HIP-NN can model free energies for coarse-grained models using force matching, and that these many-body models provide improved transferability between thermodynamic states. A more advanced form of the interaction layer, which directly captures high-order many-body information (HIP-HOP-NN), was developed in Allen et al. [ASBL26]. Knowledge distillation techniques, in which initial teacher models are used to improve the performance of student models, were examined using hippynn and found effective in Matin et al. [MAS+25]. This approach was extended to data without forces by using an ensemble of teachers in Matin et al. [MSP+25], and to the distillation of learned free energies for coarse-graining in Olowookere et al. [OMP+26]. In Zhang et al. [ZCI+24], a physics-informed constraint on the atomization of molecules was introduced to improve the performance of models far from equilibrium.

hippynn has also been applied directly in research studies. Fedik et al. [FLL+25] critically examined the performance of the HIP-NN-TS functional form for transition states and transition path sampling. A potential for uranium nitride was investigated in . In Allen et al. [ALM+25], hippynn was used as part of an active learning loop to collect high-quality coupled-cluster data, including forces.

HIP-NN is also useful for modeling properties other than energies and forces. It was adapted to learn charges in Nebgen et al. [NLS+18] and to learn charge predictions from dipole information in Sifain et al. [SLN+18]. Bond order regression to predict two-body quantities was explored in Magedov et al. [MKM+21]. The atom-wise (charge) and two-body (bond) regressions were combined to build Hückel-type quantum Hamiltonians in Zubatiuk et al. [ZNL+21]. This was extended to semi-empirical Hamiltonians in Zhou et al. [ZLB+22] by combining the facilities of hippynn with another PyTorch code, PYSEQM, developed by Zhou et al. [ZNL+20], which provides quantum calculations that are differentiable by PyTorch. In Li et al. [LKK+25], a lighter-weight version of this framework was built using classical equilibrated charges, employing a shadow dynamics technique to avoid the cost and numerical difficulties associated with self-consistency conditions.

Another avenue of work has been modeling excited-state dynamics with HIP-NN. In Sifain et al. [SLM+21], a localization layer was used to predict both the energy and the location of singlet-triplet excitations in organic materials. In Habib et al. [HLTN23], HIP-NN was used in a dynamical setting to learn the dynamics of plasmons in nanoparticles. In this mode, the predictions of the model at one time step provide the inputs for the next, and training takes place by backpropagating through multiple steps of prediction. Li et al. [LLT+24] used the framework to predict several excited-state properties: energy, transition dipole, and non-adiabatic coupling vectors were predicted for several excited states in a molecular system.

[ALM+25]

Alice E. A. Allen, Rui Li, Sakib Matin, Xing Zhang, Benjamin Nebgen, Nicholas Lubbers, Justin S. Smith, Richard Messerly, Sergei Tretiak, Garnet Kin-Lic Chan, and Kipton Barros. Reactive chemistry at unrestricted coupled cluster level: high-throughput calculations for training machine learning potentials. 2025. URL: https://arxiv.org/abs/2509.10872, arXiv:2509.10872.

[ASBL26]

Alice E. A. Allen, Emily Shinkle, Roxana Bujack, and Nicholas Lubbers. Optimal invariant sets for atomistic machine learning. npj Computational Materials, 12(1):75, 2026. URL: https://doi.org/10.1038/s41524-025-01948-0, doi:10.1038/s41524-025-01948-0.

[BLT+25]

Daniel J. Burrill, Chang Liu, Michael G. Taylor, Marc J. Cawkwell, Danny Perez, Enrique R. Batista, Nicholas Lubbers, and Ping Yang. MLTB: enhancing transferability and extensibility of density functional tight-binding theory with many-body interaction corrections. Journal of Chemical Theory and Computation, 21(3):1089–1097, 02 2025.

[CSA+23]

Michael Chigaev, Justin S Smith, Steven Anaya, Benjamin Nebgen, Matthew Bettencourt, Kipton Barros, and Nicholas Lubbers. Lightweight and effective tensor sensitivity for atomistic neural networks. The Journal of Chemical Physics, 2023.

[FLL+25]

Nikita Fedik, Wei Li, Ying Wai Li, Nicholas Lubbers, Benjamin Nebgen, and Sergei Tretiak. Challenges and opportunities for machine learning potentials in transition path sampling: alanine dipeptide and azobenzene studies. Digital Discovery, 2025.

[HLTN23]

Adela Habib, Nicholas Lubbers, Sergei Tretiak, and Benjamin Nebgen. Machine learning models capture plasmon dynamics in Ag nanoparticles. The Journal of Physical Chemistry A, 127(17):3768–3778, 2023.

[LKK+25]

Cheng-Han Li, Mehmet Cagri Kaymak, Maksim Kulichenko, Nicholas Lubbers, Benjamin T. Nebgen, Sergei Tretiak, Joshua Finkelstein, Daniel P. Tabor, and Anders M. N. Niklasson. Shadow molecular dynamics with a machine learned flexible charge potential. Journal of Chemical Theory and Computation, 21(7):3658–3675, 04 2025. URL: https://doi.org/10.1021/acs.jctc.5c00062, doi:10.1021/acs.jctc.5c00062.

[LLT+24]

Xinyang Li, Nicholas Lubbers, Sergei Tretiak, Kipton Barros, and Yu Zhang. Machine learning framework for modeling exciton polaritons in molecular materials. Journal of Chemical Theory and Computation, 20(2):891–901, 2024.

[LSB18]

Nicholas Lubbers, Justin S Smith, and Kipton Barros. Hierarchical modeling of molecular energies using a deep neural network. The Journal of Chemical Physics, 2018.

[MKM+21]

Sergey Magedov, Christopher Koh, Walter Malone, Nicholas Lubbers, and Benjamin Nebgen. Bond order predictions using deep neural networks. Journal of Applied Physics, 2021.

[MAS+25]

Sakib Matin, Alice E. A. Allen, Emily Shinkle, Aleksandra Pachalieva, Galen T. Craven, Benjamin Nebgen, Justin S. Smith, Richard Messerly, Ying Wai Li, Sergei Tretiak, Kipton Barros, and Nicholas Lubbers. Teacher-student training improves the accuracy and efficiency of machine learning interatomic potentials. Digital Discovery, 4:2502–2511, 2025. URL: http://dx.doi.org/10.1039/D5DD00085H, doi:10.1039/D5DD00085H.

[MAS+24]

Sakib Matin, Alice EA Allen, Justin Smith, Nicholas Lubbers, Ryan B Jadrich, Richard Messerly, Benjamin Nebgen, Ying Wai Li, Sergei Tretiak, and Kipton Barros. Machine learning potentials with the iterative boltzmann inversion: training to experiment. Journal of Chemical Theory and Computation, 20(3):1274–1281, 2024.

[MSP+25]

Sakib Matin, Emily Shinkle, Yulia Pimonova, Galen T. Craven, Aleksandra Pachalieva, Ying Wai Li, Kipton Barros, and Nicholas Lubbers. Ensemble knowledge distillation for machine learning interatomic potentials. 2025. URL: https://arxiv.org/abs/2503.14293, arXiv:2503.14293.

[NLS+18]

Benjamin Nebgen, Nicholas Lubbers, Justin S Smith, Andrew E Sifain, Andrey Lokhov, Olexandr Isayev, Adrian E Roitberg, Kipton Barros, and Sergei Tretiak. Transferable dynamic molecular charge assignment using deep neural networks. Journal of Chemical Theory and Computation, 14(9):4687–4698, 2018.

[OMP+26]

Feranmi V. Olowookere, Sakib Matin, Aleksandra Pachalieva, Nicholas Lubbers, and Emily Shinkle. Knowledge distillation of noisy force labels for improved coarse-grained force fields. 2026. URL: https://arxiv.org/abs/2510.26650, arXiv:2510.26650.

[SPB+24]

Emily Shinkle, Aleksandra Pachalieva, Riti Bahl, Sakib Matin, Brendan Gifford, Galen T. Craven, and Nicholas Lubbers. Thermodynamic transferability in coarse-grained force fields using graph neural networks. Journal of Chemical Theory and Computation, 20(23):10524–10539, 12 2024. URL: https://doi.org/10.1021/acs.jctc.4c00788, doi:10.1021/acs.jctc.4c00788.

[SLN+18]

Andrew E Sifain, Nicholas Lubbers, Benjamin T Nebgen, Justin S Smith, Andrey Y Lokhov, Olexandr Isayev, Adrian E Roitberg, Kipton Barros, and Sergei Tretiak. Discovering a transferable charge assignment model using machine learning. The Journal of Physical Chemistry Letters, 9(16):4495–4501, 2018.

[SLM+21]

Andrew E Sifain, Levi Lystrom, Richard A Messerly, Justin S Smith, Benjamin Nebgen, Kipton Barros, Sergei Tretiak, Nicholas Lubbers, and Brendan J Gifford. Predicting phosphorescence energies and inferring wavefunction localization with machine learning. Chemical Science, 12(30):10207–10217, 2021.

[SNM+21]

Justin S Smith, Benjamin Nebgen, Nithin Mathew, Jie Chen, Nicholas Lubbers, Leonid Burakovsky, Sergei Tretiak, Hai Ah Nam, Timothy Germann, Saryu Fensin, and others. Automated discovery of a robust interatomic potential for aluminum. Nature communications, 12(1):1257, 2021.

[SSL+19]

Hidemaro Suwa, Justin S Smith, Nicholas Lubbers, Cristian D Batista, Gia-Wei Chern, and Kipton Barros. Machine learning for molecular dynamics with strongly correlated electrons. Physical Review B, 99(16):161107, 2019.

[ZCI+24]

Shuhao Zhang, Michael Chigaev, Olexandr Isayev, Richard Messerly, and Nicholas Lubbers. Including physics-informed atomization constraints in neural networks for reactive chemistry. ChemRxiv preprint, 2024.

[ZLB+22]

Guoqing Zhou, Nicholas Lubbers, Kipton Barros, Sergei Tretiak, and Benjamin Nebgen. Deep learning of dynamically responsive chemical hamiltonians with semiempirical quantum mechanics. Proceedings of the National Academy of Sciences, 119(27):e2120333119, 2022.

[ZNL+20]

Guoqing Zhou, Ben Nebgen, Nicholas Lubbers, Walter Malone, Anders MN Niklasson, and Sergei Tretiak. Graphics processing unit-accelerated semiempirical Born-Oppenheimer molecular dynamics using PyTorch. Journal of Chemical Theory and Computation, 16(8):4951–4962, 2020.

[ZNL+21]

Tetiana Zubatiuk, Benjamin Nebgen, Nicholas Lubbers, Justin S Smith, Roman Zubatyuk, Guoqing Zhou, Christopher Koh, Kipton Barros, Olexandr Isayev, and Sergei Tretiak. Machine learned Hückel theory: interfacing physics and deep neural networks. The Journal of Chemical Physics, 2021.