Paper Title
TempCLR: Reconstructing Hands via Time-Coherent Contrastive Learning
Paper Authors
Paper Abstract
We introduce TempCLR, a new time-coherent contrastive learning approach for the structured regression task of 3D hand reconstruction. Unlike previous time-contrastive methods for hand pose estimation, our framework considers temporal consistency in its augmentation scheme and accounts for the differences between hand poses along the temporal direction. Our data-driven method leverages unlabelled videos and a standard CNN, without relying on synthetic data, pseudo-labels, or specialized architectures. Our approach improves the performance of fully-supervised hand reconstruction methods by 15.9% and 7.6% in PA-V2V on the HO-3D and FreiHAND datasets, respectively, thus establishing new state-of-the-art performance. Finally, we demonstrate quantitatively and qualitatively that our approach produces smoother hand reconstructions through time and is more robust to heavy occlusions than the previous state-of-the-art. Our code and models will be available at https://eth-ait.github.io/tempclr.