Mixup-CIFAR10

By Hongyi Zhang, Moustapha Cisse, Yann Dauphin, and David Lopez-Paz.

Facebook AI Research

Introduction

Mixup is a generic and straightforward data augmentation principle. In essence, mixup trains a neural network on convex combinations of pairs of examples and their labels. By doing so, mixup regularizes the neural network to favor simple linear behavior in-between training examples.
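
In code, this amounts to sampling a mixing coefficient lambda from a Beta(alpha, alpha) distribution and interpolating a batch of inputs (and, implicitly, their labels) with a randomly permuted copy of the same batch. The following PyTorch sketch illustrates the principle; the function name, signature, and default alpha here are illustrative and not necessarily identical to the code in train.py:

    import numpy as np
    import torch

    def mixup_data(x, y, alpha=1.0):
        # Sample the mixing coefficient lambda from Beta(alpha, alpha);
        # alpha = 0 disables mixing and recovers plain ERM.
        lam = np.random.beta(alpha, alpha) if alpha > 0 else 1.0
        # Pair each example with a random partner from the same minibatch.
        index = torch.randperm(x.size(0))
        mixed_x = lam * x + (1.0 - lam) * x[index]
        # Return both sets of labels so the loss can be interpolated
        # with the same lambda.
        return mixed_x, y, y[index], lam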

This repository contains the implementation used for the results in our paper (https://arxiv.org/abs/1710.09412).

Citation

If you use this method or this code in your paper, please cite:

@article{zhang2018mixup,
  title={mixup: Beyond Empirical Risk Minimization},
  author={Hongyi Zhang and Moustapha Cisse and Yann N. Dauphin and David Lopez-Paz},
  journal={International Conference on Learning Representations},
  year={2018},
  url={https://openreview.net/forum?id=r1Ddp1-Rb},
}

Requirements and Installation

  • A computer running macOS or Linux
  • For training new models, you'll also need an NVIDIA GPU and NCCL
  • Python version 3.6
  • A PyTorch installation

Training

Use python train.py to train a new model. Here is an example invocation:

$ CUDA_VISIBLE_DEVICES=0 python train.py --lr=0.1 --seed=20170922 --decay=1e-4
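
Because a mixed label is a convex combination of two one-hot targets, the training loss reduces to the same convex combination of two ordinary cross-entropy terms. A minimal sketch of one training step under that formulation, assuming mixup_data as defined above and assumed names net, optimizer, and train_loader:

    criterion = torch.nn.CrossEntropyLoss()

    for inputs, targets in train_loader:
        inputs, targets_a, targets_b, lam = mixup_data(inputs, targets)
        outputs = net(inputs)
        # Loss for the mixed label lam*y_a + (1-lam)*y_b: the same
        # convex combination of two cross-entropy terms.
        loss = (lam * criterion(outputs, targets_a)
                + (1.0 - lam) * criterion(outputs, targets_b))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()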

License

This project is CC-BY-NC-licensed.

Acknowledgement

The CIFAR-10 reimplementation of mixup is adapted from the pytorch-cifar repository by kuangliu.
