Generative Distribution Distillation (GenDD)

December 16, 2025 · View on GitHub

In this paper, we formulate knowledge distillation (KD) as a conditional generative problem and propose the Generative Distribution Distillation (GenDD) framework. A naive GenDD baseline encounters two major challenges: the curse of high-dimensional optimization and the lack of semantic supervision from labels. To address these issues, we introduce a Split Tokenization strategy, achieving stable and effective unsupervised KD. Additionally, we develop the Distribution Contraction technique to integrate label supervision into the reconstruction objective. Our theoretical analysis demonstrates that GenDD with Distribution Contraction serves as a gradient-level surrogate for multi-task learning, realizing efficient supervised training without an explicit classification loss on multi-step sampled image representations. To evaluate the effectiveness of our method, we conduct experiments on balanced, imbalanced, and unlabeled data. Experimental results show that GenDD performs competitively in the unsupervised setting, surpassing the KL baseline by 16.29% on the ImageNet validation set. With label supervision, our ResNet-50 achieves 82.36% top-1 accuracy on ImageNet with 600 epochs of training, establishing a new state of the art.
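To give an intuition for the Distribution Contraction idea (folding label supervision into the reconstruction target), the sketch below pulls each teacher feature toward its class centroid. This is only an illustrative guess at one possible form of such a contraction; the interpolation factor `lam` and the centroid construction are our assumptions, not the paper's exact formulation.

```python
import numpy as np

def contract_to_class_mean(feats, labels, class_means, lam=0.5):
    """Interpolate each feature toward its class centroid.

    lam=1.0 keeps the original teacher features; lam=0.0 collapses every
    sample onto its class mean. Intermediate values shrink within-class
    variance while preserving class identity (a hypothetical sketch of
    "Distribution Contraction", not the paper's exact recipe).
    """
    mu = class_means[labels]          # (N, D): per-sample class centroid
    return mu + lam * (feats - mu)    # contracted reconstruction targets

# Toy example: six 2-D "teacher features" from two classes.
rng = np.random.default_rng(0)
feats = rng.normal(size=(6, 2))
labels = np.array([0, 0, 0, 1, 1, 1])
means = np.stack([feats[labels == c].mean(axis=0) for c in (0, 1)])
targets = contract_to_class_mean(feats, labels, means, lam=0.5)
```

With `lam=0.5`, each sample's distance to its class mean is exactly halved, so the contracted targets remain distinct per sample but are more tightly clustered by label.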

Environment

See `piplist.txt` for the package list.

Experimental Results

Pretrained models will be available soon.

Supervised KD on ImageNet

| Method | Model | Top-1 Acc (%) | Linear probing | link | log |
| --- | --- | --- | --- | --- | --- |
| KD | ResNet34-ResNet18 | 71.24 | - | - | - |
| IKL-KD | ResNet34-ResNet18 | 71.91 | - | - | - |
| GenDD | ResNet34-ResNet18 | 72.38 | 72.40 log | model | log |
| KD | ResNet50-MVNet | 71.44 | - | - | - |
| IKL-KD | ResNet50-MVNet | 73.19 | - | - | - |
| GenDD | ResNet50-MVNet | 73.78 | 73.76 log | model | log |
| KD | BEiT-L-ResNet50 (A2 300e) | 80.89 | - | - | - |
| DKD | BEiT-L-ResNet50 (A2 300e) | 80.77 | - | - | - |
| GenDD | BEiT-B-ResNet50 (A2 300e) | 82.01 | 81.96 log | model | log |
| GenDD | BEiT-L-ResNet50 (A2 300e) | 81.83 | 81.79 log | - | log |
| KD | BEiT-L-ResNet50 (A1 600e) | 81.68 | - | - | - |
| DKD | BEiT-L-ResNet50 (A1 600e) | 81.83 | - | - | - |
| GenDD | BEiT-B-ResNet50 (A1 600e) | 82.31 | 82.15 log | - | log |
| GenDD | BEiT-L-ResNet50 (A1 600e) | 82.34 | 82.36 log | model | log |

Unsupervised KD on CC3M

We train models on CC3M without labels and evaluate the trained models on the ImageNet validation set.

| Method | Model | Top-1 Acc (%) | link | log |
| --- | --- | --- | --- | --- |
| KL | ResNet50-MVNet | 51.60 | - | - |
| GenDD | ResNet34-ResNet18 | 66.90 | - | - |
| GenDD | ResNet50-MVNet | 67.89 | - | - |

Training and Evaluation

Before evaluation, please specify the path of the trained models. A linear classifier can achieve similar results to the diffusion head.
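Since a linear classifier on frozen student features reportedly matches the diffusion head, here is a minimal linear-probing sketch using closed-form ridge regression on one-hot labels. This is a lightweight stand-in for illustration only; the released evaluation scripts presumably train the linear head with SGD on ImageNet features instead.

```python
import numpy as np

def linear_probe(train_x, train_y, test_x, num_classes, reg=1e-3):
    """Fit a linear classifier on frozen features via ridge regression.

    Solves (X^T X + reg*I) W = X^T Y for one-hot targets Y, then predicts
    by argmax over the C output scores. `reg` is an assumed default.
    """
    onehot = np.eye(num_classes)[train_y]           # (N, C) one-hot targets
    d = train_x.shape[1]
    w = np.linalg.solve(train_x.T @ train_x + reg * np.eye(d),
                        train_x.T @ onehot)         # (D, C) weight matrix
    return (test_x @ w).argmax(axis=1)              # predicted class ids
```

On well-separated features (the regime where distillation has done its job), this closed-form probe already classifies accurately, which is why it can serve as a cheap proxy for the learned head.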

For CIFAR,

```shell
cd GenDD_cifar
bash sh/fetch_pretrained_teachers.sh
bash sh/train_res110_res32_gendd.sh
bash sh/evaluate.sh
```

For supervised KD on ImageNet,

```shell
cd GenDD_imagenet
bash sh/imagenet_train_res34res18_gendd.sh
bash sh/imagenet_eval_res34res18_gendd.sh
bash sh/imagenet_train_res34res18_gendd_linear.sh
```

For unsupervised KD on CC3M,

```shell
cd GenDD_imagenet
bash sh/cc3m_train_res34res18_unsupervised_gendd.sh
bash sh/cc3m_eval_res34res18_unsupervised_gendd.sh
```

Contact

If you have any questions, feel free to contact us by email (jiequancui@gmail.com) or through GitHub issues. Enjoy!