Knowledge Distillation

This repo contains PyTorch implementations of knowledge distillation methods.

ManifoldKD

NeurIPS 2022 paper Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation

This paper utilizes patch-level information and proposes a fine-grained manifold distillation method for transformer-based networks. More details can be found at ManifoldKD.
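The core idea of manifold distillation is to match relational structure between patch features rather than the features themselves. A minimal sketch of this idea, assuming normalized patch embeddings and a simple pairwise-similarity (relation) matrix match; the function name and the MSE objective here are illustrative simplifications, not the paper's actual fine-grained decomposition:

```python
import numpy as np

def manifold_loss(student_patches, teacher_patches):
    # Patch features have shape (num_patches, dim).
    # L2-normalize each patch embedding, then compare the pairwise
    # patch-similarity (relation) matrices of student and teacher
    # with a mean-squared error. This is a simplified sketch of
    # patch-level manifold matching, not the paper's exact loss.
    s = student_patches / np.linalg.norm(student_patches, axis=1, keepdims=True)
    t = teacher_patches / np.linalg.norm(teacher_patches, axis=1, keepdims=True)
    rel_s = s @ s.T  # (num_patches, num_patches) student relation matrix
    rel_t = t @ t.T  # teacher relation matrix
    return float(np.mean((rel_s - rel_t) ** 2))
```

Matching relation matrices lets the student mimic how the teacher's patches relate to each other, even when the two models have different embedding dimensions.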

VanillaKD

VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale.

More details can be found at VanillaKD.
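Vanilla knowledge distillation trains the student to match the teacher's temperature-softened output distribution. A minimal sketch of the standard KD objective (Hinton-style KL divergence with the usual T² scaling), written in plain Python for illustration; the repo's actual implementation is in PyTorch and this function name is hypothetical:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across T.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

In practice this distillation term is combined with the ordinary cross-entropy on ground-truth labels via a weighting coefficient.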