Physics-Preserved Learning for Fluid Dynamics: A Momentum-Conserving Neural Framework

Authors

  • Samarth Patel, Material and Process QC Engineer, 3DTechnologies Group, Eurofins Lancaster Laboratories (Author)

DOI:

https://doi.org/10.63282/3050-9416.ICAIDSCT26-139

Keywords:

Fluid Simulation, Momentum Conservation, Neural Networks, Particle-Based Methods, Physics-Informed Learning

Abstract

Accurate fluid simulations are crucial for engineering and scientific applications, yet traditional numerical methods often demand substantial computational resources. This paper presents a novel neural framework that enforces fundamental physical laws, specifically the conservation of momentum, in particle-based fluid simulations. Unlike conventional data-driven models that approximate physics with soft constraints, our approach embeds symmetry-aware convolutional layers to ensure strict adherence to physical principles. Through a hierarchical architecture and optimized resampling techniques, our method demonstrates superior accuracy and generalization across diverse fluid scenarios. Empirical evaluations confirm its robustness, outperforming existing learning-based solvers in both stability and computational efficiency.
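
As a reading aid only, and not the paper's architecture, the short Python sketch below illustrates the principle the abstract refers to: if every inter-particle force contribution is antisymmetric under exchange of the two particles, Newton's third law holds pair by pair and total linear momentum is conserved by construction rather than through a soft penalty. All names here (momentum_conserving_step, soft_repulsion_force) are hypothetical placeholders, and the explicit pairwise loop stands in for the paper's symmetry-aware convolutional layers.

import numpy as np

def soft_repulsion_force(r_ij, k=1.0, h=0.2):
    # Hypothetical pairwise force: an odd function of the separation
    # vector r_ij (points from particle i to particle j), zero beyond h.
    d = np.linalg.norm(r_ij)
    if d < 1e-12 or d > h:
        return np.zeros_like(r_ij)
    return -k * (h - d) * r_ij / d  # pushes i away from j

def momentum_conserving_step(positions, velocities, masses, dt, pair_force):
    # One explicit Euler step in which forces come only from antisymmetric
    # pairwise terms: the force on i from j equals minus the force on j
    # from i, so the total momentum sum(m_i * v_i) is unchanged up to
    # floating-point round-off.
    n = len(positions)
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            f_ij = pair_force(positions[j] - positions[i])  # force on i due to j
            forces[i] += f_ij
            forces[j] -= f_ij                               # equal and opposite
    velocities = velocities + dt * forces / masses[:, None]
    positions = positions + dt * velocities
    return positions, velocities

# Quick check: total momentum before and after one step should match.
rng = np.random.default_rng(0)
pos = rng.random((64, 3))
vel = rng.standard_normal((64, 3))
m = np.ones(64)
p0 = (m[:, None] * vel).sum(axis=0)
pos, vel = momentum_conserving_step(pos, vel, m, dt=1e-3,
                                    pair_force=soft_repulsion_force)
p1 = (m[:, None] * vel).sum(axis=0)
print(np.allclose(p0, p1))  # True: momentum conserved by construction

A learned model could replace soft_repulsion_force with a network that outputs an odd function of the separation vector; the conservation argument is unaffected because it relies only on the antisymmetry of the pairwise terms, not on how they are parameterized.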

Published

2026-02-17

How to Cite

1. Patel S. Physics-Preserved Learning for Fluid Dynamics: A Momentum-Conserving Neural Framework. IJAIBDCMS [Internet]. 2026 Feb. 17 [cited 2026 Feb. 17];:320-5. Available from: https://ijaibdcms.org/index.php/ijaibdcms/article/view/429