Paper Title
Constraints on Hebbian and STDP learned weights of a spiking neuron
Paper Authors
Paper Abstract
We analyse mathematically the constraints on weights resulting from Hebbian and STDP learning rules applied to a spiking neuron with weight normalisation. In the case of pure Hebbian learning, we find that the normalised weights equal the promotion probabilities of weights up to correction terms that depend on the learning rate and are usually small. A similar relation can be derived for STDP algorithms, where the normalised weight values reflect a difference between the promotion and demotion probabilities of the weight. These relations are practically useful in that they allow checking for convergence of Hebbian and STDP algorithms. Another application is novelty detection. We demonstrate this using the MNIST dataset.
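As a schematic illustration of the relations stated above (the notation is assumed here, since the abstract does not fix any symbols): write $w_i$ for the $i$-th normalised weight, $p_i^{+}$ and $p_i^{-}$ for its promotion and demotion probabilities, and $\eta$ for the learning rate. The claims then take roughly the form

% Sketch only: the proportionality constant c and the O(\eta) correction
% terms are illustrative assumptions, not the paper's exact expressions.
\begin{align}
  w_i &\approx p_i^{+} + O(\eta)                        && \text{(pure Hebbian learning)} \\
  w_i &\approx c\,\bigl(p_i^{+} - p_i^{-}\bigr) + O(\eta) && \text{(STDP)}
\end{align}

Under this reading, a persistent gap between the observed promotion/demotion frequencies and the current normalised weights would indicate that learning has not yet converged, which is the convergence check the abstract refers to.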