This is a collection of introductions to Knowledge Distillation (KD) and overviews of its current research landscape.
If you come across a remarkable paper in this area (a seminal work, a thorough survey, or a highly cited paper), feel free to mention it in an issue.
If you find this collection helpful, please give it a star 👍!
- An Expert's Roadmap to Reinforcement Learning [post]
- Policy Gradient Algorithms [post]
- Deterministic Policy Gradient Algorithms [paper]
- Continuous Control with Deep Reinforcement Learning [paper]
- Knowledge Distillation Review: A Look Back at 20 Papers [post]
- Knowledge Distillation: A Powerful Tool for Model Compression (A Thorough Summary) [post]
- Knowledge Distillation: A Survey [paper]
- Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks [paper]
- Distilling the Knowledge in a Neural Network [paper] (the loss it introduces is sketched after this list)
- Deep Mutual Learning [paper]
- On the Efficacy of Knowledge Distillation [paper]
- Self-training with Noisy Student improves ImageNet classification [paper]
- Training deep neural networks in generations: A more tolerant teacher educates better students [paper]
- Distillation-Based Training for Multi-Exit Architectures [paper]
- Knowledge Extraction with No Observable Data [paper] [code]
- MEAL: Multi-Model Ensemble via Adversarial Learning [paper] [code]
- Feature-map-level Online Adversarial Knowledge Distillation [paper]
- Data-Free Learning of Student Networks [paper]
- KDGAN: Knowledge Distillation with Generative Adversarial Networks [paper]
- Training Shallow and Thin Networks for Acceleration via Knowledge Distillation with Conditional Adversarial Networks [paper]
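
Since the list above starts from Hinton et al.'s "Distilling the Knowledge in a Neural Network", here is a minimal sketch of the temperature-scaled distillation loss that paper introduces, assuming PyTorch. The function name `distillation_loss` and the defaults `T=4.0` and `alpha=0.9` are illustrative choices, not values fixed by the paper.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style KD loss: soft-target KL term plus hard-label cross-entropy.

    T (temperature) and alpha (soft/hard mixing weight) are hypothetical
    defaults for illustration only.
    """
    # Soften both output distributions with temperature T.
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    # Scale the KL term by T^2 so its gradient magnitude stays comparable
    # to the hard-label term across temperatures, as noted in the paper.
    soft_loss = F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
    # Standard cross-entropy against the ground-truth hard labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

At inference time the teacher is discarded and only the student runs, which is what makes distillation useful for model compression.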