This is the code repository of the following paper, which performs data-augmentation-driven model compression to maintain out-of-distribution generalization and preserve robustness against universal adversarial patch attacks.
"Preserving real-world robustness of neural networks under sparsity constraints"
Jasmin Viktoria Gritsch, Robert Legenstein, Ozan Özdenizci
European Conference on Machine Learning and Knowledge Discovery in Databases (ECML-PKDD), 2024.
If you use this code or models in your research and find it helpful, please cite the following paper:
@inproceedings{gritsch2024preserving,
title={Preserving real-world robustness of neural networks under sparsity constraints},
author={Jasmin Viktoria Gritsch and Robert Legenstein and Ozan {\"O}zdenizci},
booktitle={European Conference on Machine Learning and Knowledge Discovery in Databases (ECML-PKDD)},
year={2024}
}
The authors of this work are affiliated with Graz University of Technology, Institute of Theoretical Computer Science, and Silicon Austria Labs, TU Graz - SAL Dependable Embedded Systems Lab, Graz, Austria. This work has been supported by the "University SAL Labs" initiative of Silicon Austria Labs (SAL) and its Austrian partner universities for applied fundamental research on electronic-based systems.
Parts of this code repository are based on the following works: