Paper Title
Joint Person Objectness and Repulsion for Person Search
Paper Authors
Paper Abstract
Person search aims to retrieve a probe person from unconstrained scene images, and can be treated as the combination of person detection and person matching. However, existing methods based on the Detection-Matching framework ignore person objectness and repulsion (OR), both of which are beneficial for reducing the effect of distractor images. In this paper, we propose an OR similarity that jointly considers objectness and repulsion information. Besides the traditional visual similarity term, the OR similarity contains an objectness term and a repulsion term. The objectness term reduces the similarity of distractor images that do not contain a person and boosts person search performance by improving the ranking of positive samples. Because the probe person has a different person ID from its \emph{neighbors}, a gallery image that has a higher similarity with the \emph{neighbors of the probe} should have a lower similarity with the probe person. Based on this repulsion constraint, the repulsion term is proposed to reduce the similarity of distractor images that are not most similar to the probe person. Taking Faster R-CNN as the person detector, the OR similarity is evaluated on the PRW and CUHK-SYSU datasets under the Detection-Matching framework with six description models. Extensive experiments demonstrate that the proposed OR similarity effectively reduces the similarity of distractor samples and further boosts person search performance, e.g., improving the mAP from 92.32% to 93.23% on the CUHK-SYSU dataset and from 50.91% to 52.30% on the PRW dataset.
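The abstract names three ingredients of the OR similarity (a visual similarity term, an objectness term, and a repulsion term against the probe's neighbors) but does not give their exact formulation. The following minimal Python sketch shows one plausible way to combine them; the function names, the weighting scheme, and the parameter beta are assumptions for illustration, not the paper's actual definition.

import numpy as np

def l2norm(x):
    # Normalize feature vectors to unit length along the last axis.
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-12)

def or_similarity(probe_feat, gallery_feats, gallery_objectness,
                  neighbor_feats=None, beta=0.1):
    # probe_feat:         (d,)   descriptor of the probe person
    # gallery_feats:      (n, d) descriptors of detected gallery boxes
    # gallery_objectness: (n,)   detector objectness score of each gallery box
    # neighbor_feats:     (m, d) descriptors of the probe's neighbors
    #                     (persons co-detected with the probe, different IDs)
    # NOTE: the weighting below is an assumed combination, not the paper's formula.
    probe = l2norm(probe_feat)
    gallery = l2norm(gallery_feats)
    visual = gallery @ probe                 # visual similarity term (cosine)
    score = gallery_objectness * visual      # objectness term suppresses non-person boxes
    if neighbor_feats is not None and len(neighbor_feats) > 0:
        neighbors = l2norm(neighbor_feats)
        # Highest similarity of each gallery box to any neighbor of the probe;
        # boxes that resemble a neighbor are pushed away from the probe.
        repulsion = (gallery @ neighbors.T).max(axis=1)
        score = score - beta * repulsion     # repulsion term
    return score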