Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-Tailed Classification

Liuyu Xiang, Guiguang Ding*, Jungong Han

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding (Non-Journal item)

194 Citations (Scopus)
70 Downloads (Pure)

Abstract

In real-world scenarios, data tends to exhibit a long-tailed distribution, which increases the difficulty of training deep networks. In this paper, we propose a novel self-paced knowledge distillation framework, termed Learning From Multiple Experts (LFME). Our method is inspired by the observation that networks trained on less imbalanced subsets of the distribution often yield better performance than their jointly-trained counterparts. We refer to these models as ‘Experts’, and the proposed LFME framework aggregates the knowledge from multiple ‘Experts’ to learn a unified student model. Specifically, the proposed framework involves two levels of adaptive learning schedules: Self-paced Expert Selection and Curriculum Instance Selection, so that the knowledge is adaptively transferred to the ‘Student’. We conduct extensive experiments and demonstrate that our method achieves superior performance compared to state-of-the-art methods. We also show that our method can be easily plugged into state-of-the-art long-tailed classification algorithms for further improvements.
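The core idea of the abstract can be illustrated with a short sketch. The PyTorch snippet below shows, in minimal form, how several frozen ‘Expert’ classifiers could be distilled into a single student via a weighted knowledge-distillation loss. The network architectures, temperature, loss weighting, and the fixed `expert_weights` used here are illustrative assumptions only; in LFME itself the expert and instance contributions are set adaptively by the self-paced and curriculum schedules, which are not reproduced here.

```python
# Minimal sketch of multi-expert knowledge distillation for a long-tailed
# classifier (PyTorch). Architectures, temperature, and weights are
# illustrative assumptions, not the paper's exact recipe.
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_loss(student_logits, expert_logits, T=2.0):
    """KL divergence between temperature-softened student and expert outputs."""
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_expert = F.softmax(expert_logits / T, dim=1)
    return F.kl_div(log_p_student, p_expert, reduction="batchmean") * (T * T)

def multi_expert_loss(student, experts, expert_weights, x, y, T=2.0, alpha=0.5):
    """Cross-entropy on ground truth plus a weighted sum of per-expert
    distillation terms. In LFME the weights would follow a self-paced
    schedule; here they are passed in as fixed values."""
    s_logits = student(x)
    ce = F.cross_entropy(s_logits, y)
    kd = 0.0
    for w, expert in zip(expert_weights, experts):
        with torch.no_grad():  # experts are frozen teachers
            e_logits = expert(x)
        kd = kd + w * distillation_loss(s_logits, e_logits, T)
    return (1 - alpha) * ce + alpha * kd

# Toy usage: three 'experts' (e.g. trained on many-/medium-/few-shot subsets)
# and one student, all small MLPs over 32-dim features with 10 classes.
if __name__ == "__main__":
    make_net = lambda: nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
    experts = [make_net().eval() for _ in range(3)]
    student = make_net()
    x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
    loss = multi_expert_loss(student, experts, expert_weights=[1.0, 1.0, 1.0], x=x, y=y)
    loss.backward()
    print(float(loss))
```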

Original language: English
Title of host publication: Computer Vision – ECCV 2020 - 16th European Conference, Proceedings
Editors: Andrea Vedaldi, Horst Bischof, Thomas Brox, Jan-Michael Frahm
Publisher: Springer Nature
Pages: 247-263
Number of pages: 17
ISBN (Print): 9783030585570
DOIs
Publication status: Published - 29 Oct 2020
Event: 16th European Conference on Computer Vision, ECCV 2020 - Glasgow, United Kingdom of Great Britain and Northern Ireland
Duration: 23 Aug 2020 - 28 Aug 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12350 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 16th European Conference on Computer Vision, ECCV 2020
Country/Territory: United Kingdom of Great Britain and Northern Ireland
City: Glasgow
Period: 23 Aug 2020 - 28 Aug 2020
