Physics-Informed with Power-Enhanced Residual Network: Results, Acknowledgments & References

by The Interpolation Publication, February 28th, 2024

Too Long; Didn't Read

Discover the power of Power-Enhancing Residual Networks for superior interpolation in 2D/3D domains with physics-informed solutions, also available on Arxiv.

This paper is available on arxiv under CC 4.0 license.


(1) Amir Noorizadegan, Department of Civil Engineering, National Taiwan University;

(2) D.L. Young, Core Tech System Co. Ltd, Moldex3D, Department of Civil Engineering, National Taiwan University & [email protected];

(3) Y.C. Hon, Department of Mathematics, City University of Hong Kong;

(4) C.S. Chen, Department of Civil Engineering, National Taiwan University & [email protected].

Abstract & Introduction

Neural Networks

PINN for Solving Inverse Burgers’ Equation

Residual Network

Numerical Results

Results, Acknowledgments & References

6 Conclusion

Throughout this study, we conducted a series of experiments to assess how different neural network architectures, including Plain NN and SQR-SkipResNet, perform when interpolating both smooth and non-smooth functions. Our findings consistently showed that SQR-SkipResNet outperforms the other architectures in accuracy. This advantage was especially evident for non-smooth functions, where SQR-SkipResNet achieved higher accuracy, although it may take slightly longer to converge. We also applied our approach to real-world examples, such as interpolating the surface of a volcano and the Stanford bunny. In both cases, SQR-SkipResNet exhibited better accuracy, convergence, and computational time than Plain NN.
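The squared-skip idea discussed above can be sketched in a few lines. The residual form below (adding the element-wise square of the previous hidden state to each layer's output) is an assumption made for illustration, not the authors' implementation; the layer sizes and tanh activation are likewise illustrative.

```python
import numpy as np


def plain_forward(x, weights, biases):
    # Plain NN: each hidden layer is a plain affine map followed by tanh.
    z = x
    for W, b in zip(weights[:-1], biases[:-1]):
        z = np.tanh(z @ W + b)
    return z @ weights[-1] + biases[-1]


def sqr_skip_forward(x, weights, biases):
    # Hypothetical SQR-SkipResNet sketch: each hidden layer adds the
    # element-wise SQUARE of the previous hidden state back to its output
    # (an assumed form of the "power-enhanced" skip connection).
    z = np.tanh(x @ weights[0] + biases[0])
    for W, b in zip(weights[1:-1], biases[1:-1]):
        z = np.tanh(z @ W + b) + z ** 2
    return z @ weights[-1] + biases[-1]
```

Both networks share the same parameter count; only the skip term differs, so any accuracy gap in an interpolation test isolates the effect of the squared skip connection.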

Furthermore, while a deeper network can at times reduce accuracy for both Plain NN and SQR-SkipResNet, we observed that this outcome is problem-dependent. For instance, for the complicated geometry of the Stanford bunny and its smooth underlying function, deeper networks yielded enhanced accuracy, faster convergence, and improved CPU efficiency. Whether or not deeper networks are suitable, the proposed method demonstrated superior performance. Since the effectiveness of network depth varies with the problem, our approach offers a more favorable architecture choice for networks of different depths.

Additionally, when applied to solving the inverse Burgers’ equation with a physics-informed neural network, our proposed architecture showed significant accuracy and stability improvements across different numbers of hidden layers, unlike Plain NN. Prospective studies might explore further optimizations, extensions, and applications of the SQR-SkipResNet framework across diverse domains, particularly for a broad range of inverse problems coupled with PINN methodologies.
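In the inverse Burgers' setting, the PINN penalizes the PDE residual r = u_t + λ1·u·u_x − λ2·u_xx while treating λ1 and λ2 as trainable unknowns. The sketch below evaluates that residual for a candidate surrogate u(x, t); central finite differences stand in for the automatic differentiation a real PINN would use, and the function names and step size are illustrative assumptions.

```python
import numpy as np


def burgers_residual(u, x, t, lam1, lam2, eps=1e-4):
    # PDE residual of the inverse Burgers' equation:
    #   r = u_t + lam1 * u * u_x - lam2 * u_xx,
    # where lam1, lam2 are the unknown coefficients a PINN would learn.
    # Derivatives of the surrogate u(x, t) are approximated by central
    # finite differences (a dependency-free stand-in for autodiff).
    u_t = (u(x, t + eps) - u(x, t - eps)) / (2 * eps)
    u_x = (u(x + eps, t) - u(x - eps, t)) / (2 * eps)
    u_xx = (u(x + eps, t) - 2 * u(x, t) + u(x - eps, t)) / eps ** 2
    return u_t + lam1 * u(x, t) * u_x - lam2 * u_xx
```

During training, the mean squared residual over collocation points is added to the data-misfit loss, and λ1, λ2 are updated alongside the network weights; a constant or linear u makes the residual easy to check by hand.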


Acknowledgments

The authors gratefully acknowledge the financial support of the Ministry of Science and Technology (MOST) of Taiwan under grant numbers 111-2811-E-002-062, 109-2221-E-002-006-MY3, and 111-2221-E-002-054-MY3.


References

[1] M. Raissi, P. Perdikaris, and G.E. Karniadakis, Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. Journal of Computational Physics, 378: p. 686-707, 2019.

[2] K. He, X. Zhang, S. Ren, and J. Sun, Deep residual learning for image recognition. arXiv preprint arXiv:1512.03385, 2015.

[3] K. He, X. Zhang, S. Ren, and J. Sun, Identity mappings in deep residual networks. arXiv preprint arXiv:1603.05027, 2016.

[4] H. Li, Z. Xu, G. Taylor, C. Studer, and T. Goldstein, Visualizing the loss landscape of neural nets. arXiv preprint arXiv:1712.09913, 2018.

[5] A. Veit, M. Wilber, and S. Belongie, Residual networks behave like ensembles of relatively shallow networks. In Proceedings of the 30th International Conference on Neural Information Processing Systems, Curran Associates Inc., Barcelona, Spain, p. 550-558, 2016.

[6] S. Jastrzębski, D. Arpit, N. Ballas, V. Verma, T. Che, and Y. Bengio, Residual connections encourage iterative inference. arXiv preprint arXiv:1710.04773, 2017.

[7] L. Lu, M. Dao, P. Kumar, U. Ramamurty, G.E. Karniadakis, and S. Suresh, Extraction of mechanical properties of materials through deep learning from instrumented indentation. Proceedings of the National Academy of Sciences, 117(13): p. 7052-7062, 2020.

[8] S. Wang, Y. Teng, and P. Perdikaris, Understanding and mitigating gradient flow pathologies in physics-informed neural networks. SIAM Journal on Scientific Computing, 43(5): p. A3055-A3081, 2021.

[9] R. Franke, Scattered data interpolation: tests of some methods. Mathematics of Computation, 38: p. 181-200, 1982.

[10] C.-S. Chen, A. Noorizadegan, C.S. Chen, and D.L. Young, On the selection of a better radial basis function and its shape parameter in interpolation problems. Applied Mathematics and Computation, 442, 12771, 2023.

[11] denizunlusu / Getty Images (n.d.), Mt Eden, or Maungawhau, is a significant Māori site. Retrieved from

[12] M. Bozzini and M. Rossini, Testing methods for 3D scattered data interpolation. Multivariate Approximation and Interpolation with Applications, 20, 2002.