Paper Title

Near-optimal control of dynamical systems with neural ordinary differential equations

Paper Authors

Lucas Böttcher, Thomas Asikis

Abstract

Optimal control problems naturally arise in many scientific applications where one wishes to steer a dynamical system from a certain initial state $\mathbf{x}_0$ to a desired target state $\mathbf{x}^*$ in finite time $T$. Recent advances in deep learning and neural network-based optimization have contributed to the development of methods that can help solve control problems involving high-dimensional dynamical systems. In particular, the framework of neural ordinary differential equations (neural ODEs) provides an efficient means to iteratively approximate continuous-time control functions associated with analytically intractable and computationally demanding control tasks. Although neural ODE controllers have shown great potential in solving complex control problems, the understanding of the effects of hyperparameters such as network structure and optimizers on learning performance is still very limited. Our work aims at addressing some of these knowledge gaps to conduct efficient hyperparameter optimization. To this end, we first analyze how truncated and non-truncated backpropagation through time affect runtime performance and the ability of neural networks to learn optimal control functions. Using analytical and numerical methods, we then study the role of parameter initializations, optimizers, and neural-network architecture. Finally, we connect our results to the ability of neural ODE controllers to implicitly regularize control energy.
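To make the setup described in the abstract concrete, the sketch below illustrates the general idea of a neural ODE controller: a small feed-forward network $u_\theta(t, \mathbf{x})$ is trained to steer a dynamical system from $\mathbf{x}_0$ to a target $\mathbf{x}^*$ at time $T$ by backpropagating a terminal loss through a discretized trajectory (non-truncated backpropagation through time). This is a minimal illustration only, not the authors' implementation; the linear dynamics matrix `A`, the network width, the Euler discretization, and all hyperparameters are assumptions chosen for the example.

```python
# Minimal sketch (illustrative assumptions throughout, not the paper's code):
# train a control network u_theta(t, x) so that the controlled ODE
# dx/dt = A x + u_theta(t, x) reaches x* at time T, starting from x0.
import torch
import torch.nn as nn

torch.manual_seed(0)

dim, T, n_steps = 2, 1.0, 100                  # state dimension, horizon, Euler steps
dt = T / n_steps
A = torch.tensor([[0.0, 1.0], [-1.0, 0.0]])    # assumed linear drift (example choice)
x0 = torch.tensor([1.0, 0.0])                  # initial state x_0
x_target = torch.tensor([0.0, 0.0])            # desired target state x^*

# Control network: maps (t, x) to a control input of the same dimension as x.
controller = nn.Sequential(
    nn.Linear(dim + 1, 32), nn.ELU(),
    nn.Linear(32, dim),
)
optimizer = torch.optim.Adam(controller.parameters(), lr=1e-2)

for epoch in range(500):
    x = x0.clone()
    for k in range(n_steps):
        t = torch.tensor([k * dt])
        u = controller(torch.cat([t, x]))      # control evaluated along the trajectory
        x = x + dt * (A @ x + u)               # explicit Euler step through the ODE
    loss = ((x - x_target) ** 2).sum()         # terminal-state loss only
    optimizer.zero_grad()
    loss.backward()                            # non-truncated BPTT through all steps
    optimizer.step()

print(f"final distance to target: {loss.item():.2e}")
```

Truncated backpropagation through time, as contrasted with the non-truncated variant in the abstract, would periodically detach the state (e.g., `x = x.detach()` every few steps), trading gradient fidelity for lower runtime and memory cost. The control energy $E = \int_0^T \|\mathbf{u}(t)\|^2 \, \mathrm{d}t$ is not penalized anywhere in this loss, which is the point of the abstract's final remark: any regularization of it by the trained controller is implicit.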
