Paper Title

Interpretable Time-series Classification on Few-shot Samples

Paper Authors

Wensi Tang, Lu Liu, Guodong Long

Abstract


Recent few-shot learning works focus on training a model with prior meta-knowledge to quickly adapt to new tasks with unseen classes and samples. However, conventional time-series classification algorithms fail to tackle the few-shot scenario. Existing few-shot learning methods were proposed to tackle image or text data, and most of them are neural-based models that lack interpretability. This paper proposes an interpretable neural-based framework, namely \textit{Dual Prototypical Shapelet Networks (DPSN)}, for few-shot time-series classification, which not only trains a neural-network-based model but also interprets the model at dual granularity: 1) a global overview using representative time-series samples, and 2) local highlights using discriminative shapelets. In particular, the generated dual prototypical shapelets consist of representative samples that can broadly demonstrate the overall shapes of all samples in a class, and discriminative partial-length shapelets that can be used to distinguish between classes. We have derived 18 few-shot TSC datasets from public benchmark datasets and evaluated the proposed method by comparing it with baselines. The DPSN framework outperforms state-of-the-art time-series classification methods, especially when trained with limited amounts of data. Several case studies are provided to demonstrate the interpretability of our model.
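To make the shapelet idea in the abstract concrete, here is a minimal sketch of how a discriminative shapelet classifies a series: the shapelet is slid over the series, the minimum subsequence distance is taken as the matching score, and the class whose shapelet matches best wins. This is the standard shapelet-matching formulation, not the authors' DPSN implementation; all names and the toy data below are illustrative.

```python
import numpy as np

def shapelet_distance(series: np.ndarray, shapelet: np.ndarray) -> float:
    """Minimum Euclidean distance between the shapelet and every
    equal-length subsequence of the series (sliding-window match)."""
    L = len(shapelet)
    return min(
        float(np.linalg.norm(series[i:i + L] - shapelet))
        for i in range(len(series) - L + 1)
    )

def classify_by_shapelets(series: np.ndarray, class_shapelets: dict) -> str:
    """Assign the class whose discriminative shapelet matches best
    (smallest sliding-window distance)."""
    return min(
        class_shapelets,
        key=lambda c: shapelet_distance(series, class_shapelets[c]),
    )

# Toy data: one class contains a bump, the other is flat.
spike = np.array([0., 0., 1., 2., 1., 0., 0.])
flat = np.array([0., 0., 0., 0., 0., 0., 0.])
shapelets = {
    "spike": np.array([1., 2., 1.]),  # hypothetical discriminative shapelet
    "flat": np.array([0., 0., 0.]),
}
print(classify_by_shapelets(spike, shapelets))  # -> spike
print(classify_by_shapelets(flat, shapelets))   # -> flat
```

Because each prediction is tied to a short, plottable subsequence, this scheme is interpretable in the sense the abstract describes: one can show exactly which part of the series matched the winning shapelet.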
