
Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation

This repo supports and documents our paper.

If you are reading this, sorry, the repo is kind of a mess. Deadlines are hard. I will be updating it to improve reproducibility. Feel free to shoot me an email with questions at [email protected].

We would like to thank -, whose implementations we used for many of the distillation methods, and -, whose CCA and CKA implementations we used.
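For orientation, here is a minimal PyTorch sketch of the two ingredients named above: the classic soft-target distillation loss (Hinton et al., 2015) and linear CKA similarity (Kornblith et al., 2019). This is an illustrative reimplementation, not the code used in the paper; the function names and the defaults `T` and `alpha` are my own assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target knowledge distillation loss (Hinton et al., 2015).

    NOTE: illustrative sketch; temperature T and mixing weight alpha
    are assumed defaults, not the paper's settings.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

def linear_cka(X, Y):
    """Linear CKA similarity between activation matrices (Kornblith et al., 2019).

    X: (n, p1) and Y: (n, p2) activations for the same n examples.
    Returns a scalar in [0, 1]; 1 means the representations match
    up to an orthogonal transform and isotropic scaling.
    """
    X = X - X.mean(dim=0, keepdim=True)  # center each feature
    Y = Y - Y.mean(dim=0, keepdim=True)
    num = (Y.T @ X).pow(2).sum()  # ||Y^T X||_F^2
    den = torch.linalg.norm(X.T @ X) * torch.linalg.norm(Y.T @ Y)
    return num / den
```

In the paper's setting, the distillation teacher would presumably be the original unpruned network and the student its pruned counterpart, with CCA/CKA used to compare their internal representations.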
