
Distiller Summary Reports


You can use the sample compression application to generate various model summary reports.
Summaries can be generated for all supported models, whether they are dense or sparse.
The following examples use a dense ResNet18 model to demonstrate some of the summary reports.
This feature is loosely inspired by Keras's print_summary.


Show feature-map sizes and compute per layer

This assumes direct convolution and uses MACs (multiply-accumulate operations) as the compute metric. Only convolution and linear (fully-connected) layers are counted.

$ python3 compress_classifier.py -a=resnet18 ../../../data.imagenet --summary=compute
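
For reference, the MAC count of a direct convolution follows from the output feature-map size, the kernel volume, and the number of output channels. The sketch below illustrates this arithmetic with plain PyTorch modules; it is an illustration of the metric, not Distiller's actual counting code, and the example layer shape is an assumption based on ResNet18's first convolution.

```python
# A minimal sketch (not Distiller's implementation) of counting MACs for a
# direct 2D convolution and for a fully-connected layer.
import torch.nn as nn

def conv2d_macs(conv: nn.Conv2d, out_h: int, out_w: int) -> int:
    """MACs = output positions x kernel volume (per group) x output channels."""
    kernel_ops = (conv.in_channels // conv.groups) * conv.kernel_size[0] * conv.kernel_size[1]
    return out_h * out_w * conv.out_channels * kernel_ops

def linear_macs(fc: nn.Linear) -> int:
    """A fully-connected layer performs in_features x out_features MACs per sample."""
    return fc.in_features * fc.out_features

# Example: ResNet18's first convolution (7x7, stride 2) on a 224x224 input
conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False)
print(conv2d_macs(conv1, out_h=112, out_w=112))  # 118,013,952 MACs
```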

Show the sparsity of each layer's weight tensors

$ python3 compress_classifier.py -a=resnet18 ../../../data.imagenet --summary=sparsity
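
The sparsity of a weight tensor is simply the fraction of its elements that are exactly zero. The following sketch, written in plain PyTorch rather than Distiller's reporting code, shows how such a per-layer measurement could be computed.

```python
# A minimal sketch of per-tensor weight sparsity (plain PyTorch, for illustration).
import torch
import torchvision.models as models

model = models.resnet18()
for name, param in model.named_parameters():
    if param.dim() > 1:  # weight tensors of convolution and linear layers
        zeros = torch.sum(param == 0).item()
        sparsity = 100.0 * zeros / param.numel()
        print(f"{name}: {sparsity:.2f}% zeros ({param.numel()} elements)")
```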

Generate a PNG image of the network graph

$ python3 compress_classifier.py -a=resnet18 ../../../data.imagenet --summary=png

You can also include the parameters in the graph: $ python3 compress_classifier.py -a=resnet18 ../../../data.imagenet --summary=png_w_params
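
As a rough illustration of what rendering a network graph involves, the sketch below uses the third-party torchviz package together with Graphviz; this is an assumption for demonstration purposes only and is not the mechanism the sample application uses.

```python
# A minimal, illustrative sketch of rendering a model graph to PNG with torchviz
# (a third-party package; not Distiller's graph-drawing code).
import torch
import torchvision.models as models
from torchviz import make_dot

model = models.resnet18()
dummy_input = torch.randn(1, 3, 224, 224)
dot = make_dot(model(dummy_input), params=dict(model.named_parameters()))
dot.render("resnet18", format="png")  # writes resnet18.png
```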

Export to ONNX

$ python3 compress_classifier.py -a=resnet18 ../../../data.imagenet --summary=onnx
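
ONNX export of a PyTorch model is built on torch.onnx.export. The minimal sketch below shows a standalone export of a dense ResNet18; the exact options the sample application passes to the exporter are not reproduced here.

```python
# A minimal sketch of exporting ResNet18 to ONNX with PyTorch's built-in exporter.
import torch
import torchvision.models as models

model = models.resnet18()
model.eval()
dummy_input = torch.randn(1, 3, 224, 224)  # ImageNet-sized input
torch.onnx.export(model, dummy_input, "resnet18.onnx")
```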

List all of the modules in the model

$ python3 compress_classifier.py -a=resnet18 ../../../data.imagenet --summary=modules
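
A module listing can also be produced directly with PyTorch's named_modules(). The sketch below is a plain-PyTorch approximation of such a report, not the sample application's exact output format.

```python
# A minimal sketch of listing a model's modules by name and type (plain PyTorch).
import torchvision.models as models

model = models.resnet18()
for name, module in model.named_modules():
    print(name or "(root)", "->", module.__class__.__name__)
```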