Paper Title

CGAP2: Context and gap aware predictive pose framework for early detection of gestures

Authors

Nishant Bhattacharya and Suresh Sundaram

Abstract

With a growing interest in the operation of autonomous vehicles, there is an equally growing need for efficient anticipatory gesture recognition systems for human-vehicle interaction. Existing gesture recognition algorithms have been primarily restricted to historical data. In this paper, we propose a novel context and gap aware pose prediction framework (CGAP2), which predicts future pose data for anticipatory recognition of gestures in an online fashion. CGAP2 implements an encoder-decoder architecture paired with a pose prediction module to anticipate future frames, followed by a shallow classifier. The CGAP2 pose prediction module uses 3D convolutional layers and depends on the number of pose frames supplied, the time difference between each pose frame, and the number of predicted pose frames. The performance of CGAP2 is evaluated on the Human3.6M dataset with the MPJPE metric. For pose prediction 15 frames in advance, an error of 79.0 mm is achieved. The pose prediction module consists of only 26M parameters and can run at 50 FPS on an NVIDIA RTX Titan. Furthermore, the ablation study indicates that supplying more context information to the pose prediction module can be detrimental to anticipatory recognition. CGAP2 has a 1-second time advantage over other gesture recognition systems, which can be crucial for autonomous vehicles.
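The abstract reports accuracy with the MPJPE metric. As a reference point, the sketch below computes MPJPE as it is conventionally defined for 3D pose benchmarks such as Human3.6M: the per-joint Euclidean distance between predicted and ground-truth positions, averaged over joints and frames. The 17-joint layout and array shapes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mpjpe(pred, target):
    """Mean Per-Joint Position Error in the input units (here: millimetres).

    pred, target: arrays of shape (num_frames, num_joints, 3) holding 3D
    joint positions. The error is the per-joint Euclidean distance,
    averaged over all joints and frames.
    """
    return float(np.linalg.norm(pred - target, axis=-1).mean())

# Illustrative call with random data: 15 predicted frames, 17 joints.
pred = 100.0 * np.random.randn(15, 17, 3)
target = 100.0 * np.random.randn(15, 17, 3)
print(f"MPJPE: {mpjpe(pred, target):.1f} mm")
```

The paper does not spell out the layer configuration here, so the following PyTorch sketch is only a hypothetical illustration of the interface the abstract describes: a 3D-convolutional module that takes a fixed number of context pose frames, sampled with a fixed time gap, and regresses a fixed number of future pose frames. The layer sizes, the 17-joint count, and the class name PosePredictor are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class PosePredictor(nn.Module):
    """Hypothetical 3D-convolutional pose prediction module (not the paper's
    exact architecture). It mirrors only the interface in the abstract: a
    fixed number of context pose frames, sampled with a fixed time gap, is
    mapped to a fixed number of future pose frames."""

    def __init__(self, num_joints=17, context_frames=8, frame_gap=2,
                 predicted_frames=15):
        super().__init__()
        self.num_joints = num_joints
        self.predicted_frames = predicted_frames
        # frame_gap only records how the context frames are assumed to be
        # sampled from the stream; it does not change the tensor shapes.
        self.frame_gap = frame_gap
        # Treat the pose sequence as a (channels=3, time, joints, 1) volume so
        # 3D convolutions mix temporal and per-joint information.
        self.conv = nn.Sequential(
            nn.Conv3d(3, 64, kernel_size=(3, 3, 1), padding=(1, 1, 0)),
            nn.ReLU(inplace=True),
            nn.Conv3d(64, 64, kernel_size=(3, 3, 1), padding=(1, 1, 0)),
            nn.ReLU(inplace=True),
        )
        self.head = nn.Linear(64 * context_frames * num_joints,
                              predicted_frames * num_joints * 3)

    def forward(self, poses):
        # poses: (batch, context_frames, num_joints, 3)
        x = poses.permute(0, 3, 1, 2).unsqueeze(-1)  # (B, 3, T, J, 1)
        x = self.conv(x).flatten(1)
        out = self.head(x)
        return out.view(-1, self.predicted_frames, self.num_joints, 3)

# Usage: 8 context frames in, 15 predicted frames out, batch of 2.
model = PosePredictor()
future = model(torch.randn(2, 8, 17, 3))
print(future.shape)  # torch.Size([2, 15, 17, 3])
```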
