Yangxin Wu
Undergraduate Student
Harbin Institute of Technology, Shenzhen
About Me

I'm currently a junior at Harbin Institute of Technology, Shenzhen, working with Xiucheng Li.

I work on machine learning algorithms, in particular generative modeling, as well as their connections to mathematics and science ("AI for science").

  1. 🔭 Currently focused on Neural PDE Solvers.
  2. 🌱 Learning advanced and novel generative model architectures.
  3. 🤔 Interested in molecule generation and reinforcement learning.
  4. 🛠️ Tech Stack: Python, PyTorch.

Download CV
Education
  • Harbin Institute of Technology, Shenzhen
    B.E. in Computer Science
    Sep. 2023 - Jul. 2027
News
2026
Visiting the AI for Scientific Simulation and Discovery Lab at Westlake University as an intern, advised by Prof. Tailin Wu!
Feb 01
2025
Our paper "Boundary-Value PDEs Meet Higher-Order Differential Topology-aware GNNs" is selected as Spotlght Poster in NeurIPS'2025.
Sep 18
Selected Publications (view all)
Boundary-Value PDEs Meet Higher-Order Differential Topology-aware GNNs

Yunfeng Liao, Yangxin Wu, Xiucheng Li

Neural Information Processing Systems (NeurIPS) 2025 Spotlight

Recent advances in graph neural network (GNN)-based neural operators have demonstrated significant progress in solving partial differential equations (PDEs) by effectively representing computational meshes. However, most existing approaches overlook the intrinsic physical and topological meaning of higher-order elements in the mesh, which are closely tied to differential forms. In this paper, we propose a higher-order GNN framework that incorporates higher-order interactions based on discrete and finite element exterior calculus. The time-independent boundary value problems (BVPs) in electromagnetism are instantiated to illustrate the proposed framework. It can be easily generalized to other PDEs that admit differential form formulations. Moreover, the novel physics-informed loss terms, integrated form estimators, and theoretical support are derived correspondingly. Experiments show that our proposed method outperforms the existing neural operators by large margins on BVPs in electromagnetism. Our code is available at https://github.com/Supradax/Higher-Order-Differential-Topology-aware-GNN.
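The sketch below is a minimal, hypothetical PyTorch illustration of the general idea of higher-order message passing described in the abstract: features live on mesh elements of different dimensions (nodes as 0-cells, edges as 1-cells) and are exchanged along boundary/coboundary incidence, in the spirit of discrete exterior calculus. It is not the paper's implementation (see the GitHub link above for the actual code); the class name IncidenceMessagePassing and the toy triangle mesh are assumptions made purely for illustration.

```python
# Hypothetical minimal sketch of higher-order message passing between mesh
# elements of adjacent dimensions (k-cells and (k+1)-cells), NOT the paper's
# actual implementation.

import torch
import torch.nn as nn


class IncidenceMessagePassing(nn.Module):
    """Exchange features between k-cells and (k+1)-cells via an incidence matrix."""

    def __init__(self, dim_low: int, dim_high: int, hidden: int):
        super().__init__()
        self.up = nn.Linear(dim_low, hidden)     # messages from k-cells to (k+1)-cells
        self.down = nn.Linear(dim_high, hidden)  # messages from (k+1)-cells to k-cells

    def forward(self, x_low, x_high, incidence):
        # incidence: (n_high, n_low) 0-1 matrix; entry (i, j) = 1 iff the j-th
        # k-cell lies on the boundary of the i-th (k+1)-cell.
        msg_up = incidence @ self.up(x_low)           # aggregate over boundary cells
        msg_down = incidence.t() @ self.down(x_high)  # aggregate over coboundary cells
        return msg_down, msg_up                       # updates for (x_low, x_high)


# Toy usage: a single triangle with 3 nodes (0-cells) and 3 edges (1-cells).
node_feats = torch.randn(3, 8)
edge_feats = torch.randn(3, 8)
# Node-edge incidence: each row is an edge, marking its two boundary nodes.
B1 = torch.tensor([[1., 1., 0.],
                   [0., 1., 1.],
                   [1., 0., 1.]])
layer = IncidenceMessagePassing(dim_low=8, dim_high=8, hidden=8)
node_update, edge_update = layer(node_feats, edge_feats, B1)
node_feats = node_feats + node_update
edge_feats = edge_feats + edge_update
```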

All publications