# Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation

This repo supports and documents our paper.

If you are reading this, sorry that the repo is kind of a mess. Deadlines are hard. I will be updating it to improve reproducibility. Feel free to shoot me an email with questions at [email protected].

We would like to thank -, whose implementations of many of the distillation methods we used, and -, whose CCA and CKA implementations we used.
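For readers unfamiliar with CKA (Centered Kernel Alignment), the representation-similarity measure mentioned above: the following is a minimal sketch of the standard linear variant, not the specific implementation used in this repo. The function name and NumPy-based setup are our own illustration.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between two representation matrices of shape
    (n_samples, n_features). Returns a value in [0, 1], where 1
    means the two representations are identical up to rotation
    and isotropic scaling."""
    # Center each feature dimension across samples.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # Cross-similarity, normalized by each matrix's self-similarity.
    numerator = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    denominator = (np.linalg.norm(X.T @ X, ord="fro")
                   * np.linalg.norm(Y.T @ Y, ord="fro"))
    return numerator / denominator
```

A measure like this is useful here because it lets one compare the internal representations of a pruned or distilled student network against its dense teacher, layer by layer.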