Paper Title
Learning Dynamics Models with Stable Invariant Sets
Paper Authors
Paper Abstract
Invariance and stability are essential notions in the study of dynamical systems, and thus it is of great interest to learn a dynamics model with a stable invariant set. However, existing methods can only handle the stability of an equilibrium. In this paper, we propose a method to ensure that a dynamics model has a stable invariant set of general classes such as limit cycles and line attractors. We start from the approach of Manek and Kolter (2019), who use a learnable Lyapunov function to make a model stable with regard to an equilibrium. We generalize it to general sets by introducing projection onto them. To resolve the difficulty of analytically specifying the invariant set to be made stable, we propose defining such a set as a primitive shape (e.g., a sphere) in a latent space and learning the transformation between the original and latent spaces. This enables us to compute the projection easily, while maintaining the model's flexibility by using various invertible neural networks for the transformation. We present experimental results that show the validity of the proposed method and its usefulness for long-term prediction.
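To make the idea behind the abstract concrete, below is a minimal NumPy sketch (not the authors' code) of the Lyapunov-based correction of Manek and Kolter (2019) that the paper generalizes from equilibria to sets. All names (f_hat, V, grad_V, f_stable, alpha), the toy linear nominal model, and the use of a circle-distance function as the Lyapunov candidate are illustrative assumptions; the paper's actual construction defines the invariant set as a primitive shape in a learned latent space and uses projections with invertible networks.

```python
import numpy as np

alpha = 1.0  # assumed decay rate on V (illustrative)

def f_hat(x):
    """Nominal (unconstrained) dynamics model; a toy linear system here."""
    A = np.array([[0.0, 1.0], [-1.0, 0.5]])
    return A @ x

def V(x):
    """Toy Lyapunov candidate: squared distance to the unit circle.
    The identity map below stands in for the learned invertible
    transformation to latent space described in the abstract."""
    z = x  # placeholder for z = Phi(x)
    return 0.5 * (np.linalg.norm(z) - 1.0) ** 2

def grad_V(x, eps=1e-6):
    """Finite-difference gradient of V (for illustration only)."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (V(x + e) - V(x - e)) / (2 * eps)
    return g

def f_stable(x):
    """Correct f_hat so that dV/dt <= -alpha * V along trajectories,
    following the Manek & Kolter construction the abstract refers to."""
    g = grad_V(x)
    violation = max(0.0, g @ f_hat(x) + alpha * V(x))
    denom = g @ g
    if denom < 1e-12:  # gradient vanishes on the invariant set itself
        return f_hat(x)
    return f_hat(x) - (violation / denom) * g

# Euler rollout: the state is attracted to the unit circle, which acts as
# a stable invariant set (a limit cycle) for the corrected dynamics.
x = np.array([2.0, 0.0])
for _ in range(2000):
    x = x + 0.01 * f_stable(x)
print(np.linalg.norm(x))  # ~1.0
```

In the proposed method, the hand-written V and the identity "latent map" above would be replaced by a learned invertible transformation, so that a simple primitive shape (here, a circle) in latent space can represent a flexible invariant set in the original space, and the projection onto it remains easy to compute.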