# Complementary Relation Contrastive Distillation (CRCD)

## Introduction

- This repo is a partial implementation of the following paper:

  "Complementary Relation Contrastive Distillation" (CRCD). arXiv

- I am sorry that the source code in this repository is not the official implementation, which relies on some internal code of the company's self-developed deep learning library. However, I reimplemented the most critical parts of the work in PyTorch, so it should be very easy to plug into the CRD repo.

- I provide an example of using the CRCD loss in the CRD repo (train_student.py). Note that this training code has not been verified and may contain bugs.

## Key components

- Relation contrastive loss

  The CRD-style implementation is here.
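As a rough illustration only, here is a minimal sketch of a relation contrastive loss in plain PyTorch: it contrasts the student's intra-batch relation vectors against the teacher's with an InfoNCE-style objective. The anchor sampling and critic used in the actual paper differ; the function name, temperature, and relation definition below are assumptions.

```python
import torch
import torch.nn.functional as F

def relation_contrastive_loss(f_s, f_t, temperature=0.1):
    """Simplified relation contrastive loss (illustrative sketch).

    Each sample's relation vector is its cosine similarity to every
    sample in the batch; the student relation for anchor i is pulled
    towards the teacher relation for the same anchor and pushed away
    from the teacher relations of all other anchors.
    """
    f_s = F.normalize(f_s, dim=1)                # B x D student features
    f_t = F.normalize(f_t, dim=1).detach()       # B x D teacher features (frozen)
    r_s = F.normalize(f_s @ f_s.t(), dim=1)      # B x B student relations
    r_t = F.normalize(f_t @ f_t.t(), dim=1)      # B x B teacher relations
    logits = r_s @ r_t.t() / temperature         # relation-level similarities
    labels = torch.arange(f_s.size(0), device=f_s.device)
    return F.cross_entropy(logits, labels)

# toy usage with matching feature dimensions
f_s = torch.randn(8, 64, requires_grad=True)
f_t = torch.randn(8, 64)
loss = relation_contrastive_loss(f_s, f_t)
loss.backward()
```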

- Computation of the gradient element

  The gradient element is computed in `def get_grad_loss()` in `loops.py` using the PyTorch API `torch.autograd.grad()`. The gradient relation can then be estimated, and the CRCD loss utilizing gradient elements is obtained easily.
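A minimal sketch of how such a gradient element can be obtained with `torch.autograd.grad()`; the helper name and toy variables below are hypothetical stand-ins for the repo's `get_grad_loss()`. Passing `create_graph=True` keeps the backward graph, which is what allows a distillation loss defined on the gradients themselves to be backpropagated.

```python
import torch
import torch.nn.functional as F

def gradient_element(task_loss, features):
    """Gradient of a scalar task loss w.r.t. intermediate features.

    create_graph=True retains the backward graph so that a
    distillation loss computed on this gradient can itself
    receive gradients during training.
    """
    (grad,) = torch.autograd.grad(task_loss, features, create_graph=True)
    return grad

# toy usage: gradient element of a classification loss
feat = torch.randn(8, 64, requires_grad=True)   # e.g. penultimate features
logits = feat @ torch.randn(64, 10)
task_loss = F.cross_entropy(logits, torch.randint(0, 10, (8,)))
g = gradient_element(task_loss, feat)           # same shape as feat
```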

- The very effective trick used in CRCD

  Adjusting the distillation loss weight dynamically during training is very effective. We supply some strategy examples in the function `def adjust_mimic_loss_weight()` in `loops.py`. In these strategies, the contribution of the distillation loss to the total loss is reduced according to a certain rule as training progresses.

  In our CIFAR-100 experiments with 250 training epochs, we adopted the stepwise strategy: before the 240th epoch the loss weight stays at 1; after the 240th epoch it is set to 0 for the last 10 epochs. This means the student is fine-tuned for another 10 epochs with the minimum learning rate.
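A minimal sketch of the stepwise schedule described above; the signature and the surrounding variable names are assumptions and may differ from the repo's actual `adjust_mimic_loss_weight()`.

```python
def adjust_mimic_loss_weight(epoch, cutoff_epoch=240):
    """Stepwise schedule: full distillation weight until the cutoff
    epoch, then 0 so the student is fine-tuned on the task loss alone
    for the remaining epochs."""
    return 1.0 if epoch < cutoff_epoch else 0.0

# inside the training loop (hypothetical names):
# w = adjust_mimic_loss_weight(epoch)
# loss = loss_cls + w * loss_crcd
```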
