Visible-Infrared Person Re-Identification Based on Dual Attention Mechanism
Fund Project: The National Natural Science Foundation of China; the Ministry of Education Humanities and Social Science Project; the Natural Science Foundation of Jiangxi Province

    Abstract:

    In recent years, visible-infrared person re-identification has attracted the attention of many researchers; its goal is to match person images of the same identity across different modalities. Because of the large gap between visible and infrared images, visible-infrared person re-identification is a very challenging image retrieval problem. Existing research focuses on mitigating the modality gap by designing network structures that extract shared features or generate intermediate modalities, but such approaches are susceptible to interference from regions outside the person. To address this problem, focus on person information, and further reduce the gap between the two modalities, a dual attention network structure is proposed for visible-infrared person re-identification. On the one hand, the dual attention mechanism mines person spatial information at different scales and enhances the channel interaction capability of local features. On the other hand, global and local branches are used to learn multi-granularity feature information, so that features of different granularities complement each other to form more discriminative representations. Experimental results on two public datasets show that the proposed method achieves a significant improvement over the baseline and performs well on both the RegDB and SYSU-MM01 datasets.
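
    No implementation details are given on this page; as a rough illustration only, the PyTorch sketch below shows one common way a dual (spatial plus channel) attention module and a global-plus-local multi-granularity head can be wired together. All class names, the 2048-channel backbone output, and the choice of four horizontal body parts are assumptions for illustration, not the authors' design.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed form)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                      # global channel statistics
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.gate(x)                           # reweight channels

class SpatialAttention(nn.Module):
    """Spatial attention over pooled channel maps (assumed form)."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)                 # (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)                # (B, 1, H, W)
        mask = self.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                                   # reweight locations

class DualAttentionHead(nn.Module):
    """Global branch plus horizontally partitioned local branch, both fed
    by dual-attention features; hyperparameters are illustrative."""
    def __init__(self, channels=2048, num_parts=4):
        super().__init__()
        self.spatial = SpatialAttention()
        self.channel = ChannelAttention(channels)
        self.num_parts = num_parts
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        self.part_pool = nn.AdaptiveAvgPool2d((num_parts, 1))

    def forward(self, feat):                              # feat: (B, C, H, W)
        feat = self.channel(self.spatial(feat))           # dual attention
        g = self.global_pool(feat).flatten(1)             # coarse-grained feature
        p = self.part_pool(feat).flatten(2)               # (B, C, num_parts)
        parts = [p[..., i] for i in range(self.num_parts)]  # fine-grained features
        return g, parts
```

    In such a setup, the global vector and the part vectors would typically each receive their own identity and cross-modality losses so that coarse and fine granularities complement one another; that training detail is likewise an assumption here.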

History
  • Received: May 10, 2023
  • Revised: June 21, 2023
  • Accepted: June 27, 2023
  • Online: March 25, 2024