Abstract
In this paper, we propose a Hermite neural network method for solving the Blasius equation, a nonlinear ordinary differential equation defined on a semi-infinite interval. The Hermite functions are adapted to the semi-infinite domain through a variable transformation. To the best of our knowledge, this is the first use of Hermite functions in a neural network for solving the Blasius equation, which gives the proposed method an advantage over existing network-based approaches and makes it an efficient solver for this class of differential equations. The network parameters are updated with the backpropagation algorithm, which avoids issues such as overflow and entrapment in local minima that commonly affect other optimization methods. To validate the proposed method, the results are compared with those of other methods and are presented in both graphical and tabular form.
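For reference, the Blasius boundary-layer problem referred to above is commonly written in the following normalized form; the constant multiplying $f f''$ depends on the scaling convention, so the exact form treated in this paper may differ:
\[
f'''(\eta) + \tfrac{1}{2}\, f(\eta)\, f''(\eta) = 0, \qquad \eta \in [0,\infty),
\]
\[
f(0) = 0, \qquad f'(0) = 0, \qquad f'(\eta) \to 1 \ \text{as}\ \eta \to \infty,
\]
where $f$ is the dimensionless stream function and $\eta$ is the similarity variable of the flat-plate boundary-layer flow.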