Lucas Liebenwein*, Cenk Baykal*, Harry Lang, Dan Feldman, Daniela Rus
*Equal contribution

Implementation of provable filter pruning using sensitivity, as introduced in Provable Filter Pruning for Efficient Neural Networks. The algorithm relies on a notion of sensitivity (the product of the data and the weight) to provably quantify the error introduced by pruning.
The algorithm relies on a novel notion of filter sensitivity as a saliency score for the weight parameters of the network to estimate their relative importance. Filter sensitivity generalizes the weight sensitivity introduced in SiPP, accounting for the fact that a filter contains multiple weights and is applied in multiple places.
For illustrative purposes, note that in the simple case of a linear layer the sensitivity of a single weight w_ij in layer l can be defined as the maximum relative contribution of the weight to the corresponding output neuron over a small set of points x \in S:

s_ij = max_{x \in S} w_ij x_j / (sum_k w_ik x_k)

The weight hereby represents the edge connecting neuron j in layer l-1 to neuron i in layer l. This notion can then be generalized to convolutional layers, neurons, and filters, among others, as shown in the paper.
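As a concrete illustration of this definition, here is a minimal numerical sketch (not the repository's implementation; the function name is ours) that computes the empirical sensitivity of every weight of a linear layer over a small point set S, assuming non-negative inputs and weights as in the simple exposition above:

```python
import numpy as np

def weight_sensitivity(W, X, eps=1e-12):
    """Empirical sensitivity s_ij = max_{x in S} w_ij x_j / (sum_k w_ik x_k).

    W: (out_features, in_features) weight matrix of the linear layer.
    X: (|S|, in_features) small set of input points S (assumed non-negative).
    """
    sens = np.zeros_like(W)
    for x in X:
        contrib = W * x                              # w_ij * x_j for every weight
        total = contrib.sum(axis=1, keepdims=True)   # pre-activation of each neuron i
        sens = np.maximum(sens, contrib / (total + eps))  # running max over points in S
    return sens
```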
In the paper, we show how pruning filters according to (empirical) sensitivity enables us to provably quantify the trade-off between the error and sparsity of the resulting pruned neural network.
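To connect the definition above to filter pruning, the sketch below (illustrative only, not the repository's implementation; both function names are hypothetical) aggregates per-weight sensitivities into a per-filter score and then keeps filters via sensitivity-proportional sampling, the high-level mechanism the paper analyzes; the exact sample sizes and sampling probabilities follow from the theoretical bounds derived in the paper.

```python
import numpy as np

def filter_sensitivity(weight_sens):
    """Aggregate per-weight sensitivities into one score per filter.

    weight_sens: array of shape (num_filters, ...) holding the empirical
    sensitivities of all weights belonging to each filter. Summing over a
    filter's weights is one simple way to account for a filter containing
    many weights; the paper derives the exact aggregation.
    """
    return weight_sens.reshape(weight_sens.shape[0], -1).sum(axis=1)

def sample_filters(filter_sens, num_samples, rng=None):
    """Keep filters via sensitivity-proportional sampling with replacement;
    the kept set is the union of sampled filter indices."""
    rng = np.random.default_rng() if rng is None else rng
    probs = filter_sens / filter_sens.sum()
    sampled = rng.choice(filter_sens.size, size=num_samples, p=probs)
    return np.unique(sampled)
```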
Check out the main README.md and the respective packages for more information on the code base.
The experiment configurations are located in paper/pfp/param. To reproduce the experiments for a specific configuration, run:
python -m experiment.main paper/pfp/param/cifar/resnet20.yaml
Please cite the following paper when using our work.
Provable Filter Pruning for Efficient Neural Networks
@inproceedings{liebenwein2020provable,
title={Provable Filter Pruning for Efficient Neural Networks},
author={Lucas Liebenwein and Cenk Baykal and Harry Lang and Dan Feldman and Daniela Rus},
booktitle={International Conference on Learning Representations},
year={2020},
url={https://openreview.net/forum?id=BJxkOlSYDH}
}