Volume 7, Issue 2
Evaluation of Phase Networks in Transformer-Based Neural Network Quantum States

Lizhong Fu, Honghui Shang & Jinlong Yang

Commun. Comput. Chem., 7 (2025), pp. 120-126.

Published online: 2025-06

[An open-access article; the PDF is free to any online user.]

  • Abstract

Neural network quantum states are a powerful approach for solving the electronic structure of strongly correlated molecular and material systems. For a neural network ansatz to be accurate, it must effectively learn the phase of the complex wave function. In this work, we evaluate several network structures as the phase network of a Transformer-based neural network quantum state implementation. We compare the accuracy of ground-state energies, the number of parameters, and the computational time across several small molecules. Furthermore, we propose a phase network setup that combines cross attention and multilayer perceptron structures, whose number of parameters remains constant across different systems. Such an architecture may help reduce computational costs and enable transfer learning to larger quantum chemical systems.
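
To make the proposed architecture concrete, the following minimal sketch (PyTorch-style Python) shows one way a phase network combining cross attention and a multilayer perceptron could be organized so that its parameter count does not grow with system size. The class name, layer sizes, and pooling choice are illustrative assumptions and not the implementation reported in the paper: a fixed set of learned query vectors cross-attends to the Transformer backbone's hidden states, and an MLP maps the pooled result to a single phase angle.

import torch
import torch.nn as nn

class CrossAttentionPhaseHead(nn.Module):
    """Hypothetical phase head: learned queries cross-attend to the backbone's
    hidden states, and an MLP maps the pooled result to one phase angle.
    The parameter count is set by d_model, n_heads, and n_queries, not by the
    number of orbitals in the molecule."""

    def __init__(self, d_model: int = 64, n_heads: int = 4, n_queries: int = 8):
        super().__init__()
        # Learned queries, independent of system size: shape (n_queries, d_model).
        self.queries = nn.Parameter(0.02 * torch.randn(n_queries, d_model))
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # MLP maps the pooled attention output to a single scalar phase.
        self.mlp = nn.Sequential(
            nn.Linear(d_model, d_model),
            nn.GELU(),
            nn.Linear(d_model, 1),
        )

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, n_sites, d_model) from the Transformer backbone.
        batch = hidden_states.shape[0]
        q = self.queries.unsqueeze(0).expand(batch, -1, -1)
        attn_out, _ = self.cross_attn(q, hidden_states, hidden_states)
        # Pool over the query dimension and predict one phase (radians) per sample.
        return self.mlp(attn_out.mean(dim=1)).squeeze(-1)

# Usage sketch: combine with a separately computed log-amplitude to form
# log psi(x) = log_amplitude(x) + 1j * phase_head(hidden_states(x)).

Because the number of learned queries is fixed, adding orbitals only lengthens the key/value sequence seen by the cross attention, not the parameter list, which is the property that would allow the same phase head to be reused or transferred across systems of different size.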

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex

@Article{CiCC-7-120,
  author  = {Fu, Lizhong and Shang, Honghui and Yang, Jinlong},
  title   = {Evaluation of Phase Networks in Transformer-Based Neural Network Quantum States},
  journal = {Communications in Computational Chemistry},
  year    = {2025},
  volume  = {7},
  number  = {2},
  pages   = {120--126},
  issn    = {2617-8575},
  doi     = {https://doi.org/10.4208/cicc.2025.92.01},
  url     = {http://global-sci.org/intro/article_detail/cicc/24181.html}
}

  • RIS

TY  - JOUR
T1  - Evaluation of Phase Networks in Transformer-Based Neural Network Quantum States
AU  - Fu, Lizhong
AU  - Shang, Honghui
AU  - Yang, Jinlong
JO  - Communications in Computational Chemistry
VL  - 7
IS  - 2
SP  - 120
EP  - 126
PY  - 2025
DA  - 2025/06
SN  - 2617-8575
DO  - https://doi.org/10.4208/cicc.2025.92.01
UR  - https://global-sci.org/intro/article_detail/cicc/24181.html
KW  - neural network quantum states
KW  - phase network
KW  - electronic structure calculation
ER  -

  • TXT

Fu, Lizhong, Shang, Honghui and Yang, Jinlong. (2025). Evaluation of Phase Networks in Transformer-Based Neural Network Quantum States. Communications in Computational Chemistry. 7 (2). 120-126. doi:10.4208/cicc.2025.92.01