Paper Title
DexGraspNet: A Large-Scale Robotic Dexterous Grasp Dataset for General Objects Based on Simulation
Paper Authors
Paper Abstract
Robotic dexterous grasping is the first step toward human-like dexterous object manipulation and is therefore a crucial robotic technology. However, dexterous grasping is much less explored than grasping with parallel grippers, partly due to the lack of a large-scale dataset. In this work, we present a large-scale robotic dexterous grasp dataset, DexGraspNet, generated by our proposed highly efficient synthesis method, which can be applied to any dexterous hand. Our method leverages a deeply accelerated differentiable force closure estimator and can thus efficiently and robustly synthesize stable and diverse grasps at scale. We choose ShadowHand and generate 1.32 million grasps for 5355 objects, covering more than 133 object categories and containing more than 200 diverse grasps for each object instance, with all grasps validated by the Isaac Gym simulator. Compared to the previous dataset from Liu et al., generated by GraspIt!, our dataset contains not only more objects and grasps but also higher diversity and quality. Through cross-dataset experiments, we show that several dexterous grasp synthesis algorithms trained on our dataset significantly outperform the same algorithms trained on the previous one. To access our data and code, including code for human and Allegro grasp synthesis, please visit our project page: https://pku-epic.github.io/DexGraspNet/.
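
The synthesis method relies on a differentiable force closure estimator: an energy that scores how close a set of hand-object contacts is to force closure, so grasp poses can be optimized by gradient descent on the GPU. Below is a minimal PyTorch sketch of one common formulation of such an energy, ||Gc||, where G is the grasp map built from contact points and c stacks the contact normals. The function and tensor names are illustrative assumptions, not the authors' released implementation, and the estimator actually used in DexGraspNet may differ and is combined with additional terms (e.g. penetration and joint-limit penalties).

    import torch

    def skew(v: torch.Tensor) -> torch.Tensor:
        """Batch skew-symmetric matrices for 3-vectors: (..., 3) -> (..., 3, 3)."""
        zero = torch.zeros_like(v[..., 0])
        return torch.stack([
            torch.stack([zero, -v[..., 2], v[..., 1]], dim=-1),
            torch.stack([v[..., 2], zero, -v[..., 0]], dim=-1),
            torch.stack([-v[..., 1], v[..., 0], zero], dim=-1),
        ], dim=-2)

    def force_closure_energy(contact_points: torch.Tensor,
                             contact_normals: torch.Tensor) -> torch.Tensor:
        """
        contact_points:  (B, N, 3) contact positions on the object surface.
        contact_normals: (B, N, 3) unit inward surface normals at those contacts.
        Returns a (B,) energy that is small when the contact normals nearly
        cancel to zero net wrench, i.e. the grasp is close to force closure.
        """
        B, N, _ = contact_points.shape
        eye = torch.eye(3, device=contact_points.device).expand(B, N, 3, 3)
        # Grasp map G: identity blocks (force part) stacked over [x]_x blocks
        # (torque part), reshaped to (B, 6, 3N).
        G = torch.cat([eye, skew(contact_points)], dim=-2)   # (B, N, 6, 3)
        G = G.permute(0, 2, 1, 3).reshape(B, 6, 3 * N)       # (B, 6, 3N)
        c = contact_normals.reshape(B, 3 * N, 1)             # (B, 3N, 1)
        wrench = torch.bmm(G, c).squeeze(-1)                 # (B, 6)
        return wrench.norm(dim=-1)

Because this energy is differentiable with respect to the contact points and normals (and hence the hand pose that determines them), large batches of grasps can be optimized in parallel on the GPU, which is what makes million-scale synthesis practical.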