Releases · IntelLabs/distiller
Intermediate release
Tagging the 'master' branch before performing a few API-breaking changes.
v0.3.0
PyTorch 0.4 support and new features
- PyTorch 0.4 support
- An implementation of Baidu's RNN pruning paper from ICLR 2017
Narang, Sharan & Diamos, Gregory & Sengupta, Shubho & Elsen, Erich. (2017).
Exploring Sparsity in Recurrent Neural Networks. (https://arxiv.org/abs/1704.05119) - Add a word language model pruning example using AGP and Baidu RNN pruning
- Quantization-aware training (4-bit quantization)
- New models: pre-activation ResNet for ImageNet and CIFAR, and AlexNet with batch-norm
- New quantization documentation content
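For context on the AGP-based pruning example mentioned above: AGP (automated gradual pruning, Zhu & Gupta, 2017) ramps a layer's sparsity from an initial to a final value along a cubic schedule. The sketch below shows only that schedule formula; the function and parameter names are illustrative and are not Distiller's API.

```python
def agp_sparsity(step, initial_sparsity, final_sparsity, start_step, end_step):
    """Cubic sparsity ramp from the AGP paper (Zhu & Gupta, 2017).

    Sparsity grows from `initial_sparsity` at `start_step` to
    `final_sparsity` at `end_step`, then stays constant.
    """
    if step <= start_step:
        return initial_sparsity
    if step >= end_step:
        return final_sparsity
    progress = (step - start_step) / (end_step - start_step)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - progress) ** 3


# Example: ramp from 0% to 80% sparsity over pruning steps 0..100.
for t in (0, 25, 50, 75, 100):
    print(t, round(agp_sparsity(t, 0.0, 0.80, 0, 100), 3))
```

Because most of the sparsity is introduced early while the weights can still recover, this schedule typically reaches high sparsity with little accuracy loss; the Baidu RNN pruning work cited above uses a related, gradually increasing threshold for recurrent layers.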
Initial version
We're tagging this version, which uses PyTorch 0.3, as we want to move the 'master' branch to support PyTorch 0.4 and its API changes.