Zero-shot sketch-based image retrieval via adaptive relation-aware metric learning

Yang Liu*, Yuhao Dang, Xinbo Gao, Jungong Han, Ling Shao

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Retrieving natural images from query sketches under the zero-shot scenario is known as zero-shot sketch-based image retrieval (ZS-SBIR). Most of the best-performing methods adapt the triplet loss to learn projections that map natural images and sketches into a shared latent embedding space. However, they neglect the modality gap between hand-drawn sketches and photos and treat all incorrect classes as equally distant from the anchor, which limits their performance in real use cases. To this end, we put forward a simple and effective model that adopts relation-aware metric learning to suppress the modality gap between sketches and photos. We also propose an adaptive margin, computed for each anchor in the embedding space, to improve the clustering ability of metric learning. Extensive experiments on the Sketchy and TU-Berlin datasets show that our proposed model outperforms state-of-the-art competitors.
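The abstract describes a triplet objective whose margin adapts to each anchor. As an illustration only, the minimal PyTorch sketch below shows one plausible way an anchor-dependent margin could enter a standard triplet loss; the function name, the `base_margin` and `scale` parameters, and the specific adaptive rule are assumptions for demonstration, not the authors' actual formulation.

```python
import torch
import torch.nn.functional as F

def adaptive_margin_triplet_loss(anchor, positive, negative,
                                 base_margin=0.2, scale=0.1):
    """Triplet loss with a hypothetical anchor-dependent margin (sketch only)."""
    # Euclidean distances in the shared embedding space
    d_pos = F.pairwise_distance(anchor, positive)   # sketch anchor -> same-class photo
    d_neg = F.pairwise_distance(anchor, negative)   # sketch anchor -> different-class photo

    # Assumed adaptive rule: the margin grows with the anchor's current
    # positive distance, so harder anchors are separated with a larger margin.
    margin = base_margin + scale * d_pos.detach()

    # Standard hinge (triplet) formulation
    loss = F.relu(d_pos - d_neg + margin)
    return loss.mean()

# Usage with dummy embeddings standing in for sketch/photo encoder outputs
sketch_emb = torch.randn(8, 128)   # anchors: sketch embeddings
photo_pos  = torch.randn(8, 128)   # photos of the same class
photo_neg  = torch.randn(8, 128)   # photos of a different class
print(adaptive_margin_triplet_loss(sketch_emb, photo_pos, photo_neg))
```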

Original language: English
Article number: 110452
Journal: Pattern Recognition
Volume: 152
Early online date: 01 Apr 2024
DOIs
Publication status: Published - 31 Aug 2024

Keywords

  • Adaptive
  • Margin
  • Metric learning
  • Relation-aware
  • ZS-SBIR

