This repository contains code for reproducing the results reported in our paper on how asymmetric non-contrastive SSL methods like BYOL perform implicit variance regularization. The code is heavily based on solo-learn, a library of self-supervised learning methods, so most of solo-learn's functionality is available here as well.
First, clone the repo. If DALI support is not needed, the requirements can be installed in a virtual environment with:
```bash
pip3 install -r requirements.txt
```
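For example, using Python's built-in venv module (the environment name `.venv` is just a placeholder):

```bash
# Create and activate a virtual environment, then install the pinned requirements.
python3 -m venv .venv
source .venv/bin/activate
pip3 install -r requirements.txt
```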
If you want to use DALI, you need to install it manually following NVIDIA's installation guide.
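For reference, a typical DALI install from NVIDIA's package index looks like the command below; the exact package name depends on your CUDA version, so treat `nvidia-dali-cuda110` as an assumption and defer to the official guide:

```bash
# Example following the general pattern from NVIDIA's DALI installation guide;
# the package name (here nvidia-dali-cuda110) must match your CUDA version.
pip install --extra-index-url https://developer.download.nvidia.com/compute/redist --upgrade nvidia-dali-cuda110
```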
For pretraining the backbone, follow one of the many bash commands in `run_experiments.sh`; most of the hyperparameters are set in the yaml files under `scripts/pretrain/`, following the exact same structure as solo-learn. The common syntax is something like:
```bash
python3 main_pretrain.py \
    # path to training script folder
    --config-path scripts/pretrain/cifar/ \
    # training config name
    --config-name isoloss.yaml
    # add new arguments (e.g. those not defined in the yaml files)
    # by doing ++new_argument=VALUE
    # PyTorch Lightning's arguments can be added here as well.
```
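For instance, values can be overridden from the command line with Hydra-style `++` arguments; the specific keys below (`max_epochs`, `optimizer.lr`) are only illustrative assumptions, so check the yaml files in `scripts/pretrain/` for the actual names:

```bash
# Hypothetical example of command-line overrides; max_epochs and optimizer.lr are
# assumed key names -- verify them against the yaml config before running.
python3 main_pretrain.py \
    --config-path scripts/pretrain/cifar/ \
    --config-name isoloss.yaml \
    ++max_epochs=1000 \
    ++optimizer.lr=0.02
```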
For DirectCopy, just run DirectPred with the `dp_alpha` parameter set to 1 and the `eps_iso` parameter set to 0.25, and similarly for the IsoLoss version.
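As a sketch (the config name `directpred.yaml` and the `method_kwargs.*` parameter paths below are assumptions, not verified names; check the yaml files under `scripts/pretrain/` for the real keys):

```bash
# Hypothetical invocation: DirectPred config with DirectCopy-style settings.
# directpred.yaml and method_kwargs.dp_alpha / method_kwargs.eps_iso are assumed names.
python3 main_pretrain.py \
    --config-path scripts/pretrain/cifar/ \
    --config-name directpred.yaml \
    ++method_kwargs.dp_alpha=1 \
    ++method_kwargs.eps_iso=0.25
```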
In the paper, we report online readout accuracy, which is logged by wandb and can be extracted from the logs as needed.