Paper Title

Two-Layer Neural Networks for Partial Differential Equations: Optimization and Generalization Theory

Authors

Tao Luo, Haizhao Yang

Abstract

The problem of solving partial differential equations (PDEs) can be formulated into a least-squares minimization problem, where neural networks are used to parametrize PDE solutions. A global minimizer corresponds to a neural network that solves the given PDE. In this paper, we show that the gradient descent method can identify a global minimizer of the least-squares optimization for solving second-order linear PDEs with two-layer neural networks under the assumption of over-parametrization. We also analyze the generalization error of the least-squares optimization for second-order linear PDEs and two-layer neural networks, when the right-hand-side function of the PDE is in a Barron-type space and the least-squares optimization is regularized with a Barron-type norm, without the over-parametrization assumption.
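The least-squares formulation described in the abstract can be illustrated with a minimal sketch: a two-layer network parametrizes a candidate solution, the loss is the squared PDE residual at collocation points plus a boundary penalty, and plain gradient descent drives the loss down. The concrete problem (the 1-D Poisson equation -u'' = f on [0, 1] with zero boundary values), the tanh activation, the collocation grid, and the finite-difference gradients are all illustrative choices for this sketch, not the paper's actual setting, which covers general second-order linear PDEs and analyzes the over-parametrized and Barron-regularized regimes.

```python
# Minimal sketch of the least-squares formulation (assumed toy problem:
# -u''(x) = f(x) on [0,1], u(0) = u(1) = 0, f(x) = pi^2 sin(pi x),
# so the exact solution is sin(pi x); not the paper's general setting).
import math, random

random.seed(0)
m = 8  # hidden width of the two-layer network
# flattened parameters: outer weights a, inner weights w, biases b
params = [random.uniform(-1, 1) for _ in range(3 * m)]

def u(x, p):
    """Two-layer network u(x) = sum_k a_k * tanh(w_k * x + b_k)."""
    a, w, b = p[:m], p[m:2 * m], p[2 * m:]
    return sum(a[k] * math.tanh(w[k] * x + b[k]) for k in range(m))

def u_xx(x, p):
    """Second derivative of u, via tanh''(z) = -2 tanh(z) (1 - tanh(z)^2)."""
    a, w, b = p[:m], p[m:2 * m], p[2 * m:]
    s = 0.0
    for k in range(m):
        t = math.tanh(w[k] * x + b[k])
        s += a[k] * w[k] ** 2 * (-2.0 * t * (1.0 - t * t))
    return s

def f(x):
    return math.pi ** 2 * math.sin(math.pi * x)

xs = [i / 20 for i in range(1, 20)]  # interior collocation points

def loss(p):
    """Mean squared PDE residual plus a boundary penalty."""
    interior = sum((-u_xx(x, p) - f(x)) ** 2 for x in xs) / len(xs)
    boundary = u(0.0, p) ** 2 + u(1.0, p) ** 2
    return interior + 10.0 * boundary

# Plain gradient descent with central finite-difference gradients.
lr, eps = 5e-4, 1e-6
loss0 = loss(params)
for _ in range(200):
    grad = []
    for i in range(len(params)):
        params[i] += eps
        hi = loss(params)
        params[i] -= 2 * eps
        lo = loss(params)
        params[i] += eps
        grad.append((hi - lo) / (2 * eps))
    params = [p - lr * g for p, g in zip(params, grad)]

print("loss before:", loss0, "after:", loss(params))
```

A global minimizer of this loss would be a network whose residual vanishes at every collocation point, i.e. an (approximate) solution of the PDE; the paper's contribution is to show when gradient descent actually reaches such a minimizer and how well the minimizer generalizes.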
