Understanding Contrastive Learning via Distributionally Robust Optimization

This is the PyTorch implementation for our NeurIPS 2023 paper:

Junkang Wu, Jiawei Chen, Jiancan Wu, Wentao Shi, Xiang Wang, and Xiangnan He. 2023. Understanding Contrastive Learning via Distributionally Robust Optimization. arXiv link

Pseudo Code

The implementation requires only a small modification to the standard InfoNCE code.

import math
import torch

# pos     : exp of inner products for positive examples
# neg     : exp of inner products for negative examples
# N       : number of negative examples
# t       : temperature scaling
# mu      : center position of the Gaussian weighting
# sigma   : height scale of the Gaussian weighting

# InfoNCE
standard_loss = -torch.log(pos.sum() / (pos.sum() + neg.sum()))

# ADNCE: reweight negatives with a Gaussian density centered at mu,
# normalize the weights to mean 1, and detach them so no gradient
# flows through the weighting itself
weight = 1 / (sigma * math.sqrt(2 * math.pi)) * torch.exp(-0.5 * ((neg - mu) / sigma) ** 2)
weight = weight / weight.mean()
adjusted_loss = -torch.log(pos.sum() / (pos.sum() + (neg * weight.detach()).sum()))
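For reference, here is a minimal self-contained sketch of how this modification slots into a full batched contrastive loss over two augmented views. The helper name adnce_loss, the normalized-embedding setup, and the default values of t, mu, and sigma are illustrative assumptions for this sketch rather than fixed choices from the repository; see the experiment code below for the exact implementations.

import math
import torch
import torch.nn.functional as F

def adnce_loss(z1, z2, t=0.5, mu=0.5, sigma=1.0):
    # Hypothetical helper, not part of the repo code.
    # z1, z2: (B, d) embeddings of two augmented views of the same B samples
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    sim = torch.exp(z1 @ z2.T / t)                 # (B, B) exp of scaled similarities
    pos = sim.diagonal()                           # (B,)  positives: matching pairs
    mask = ~torch.eye(len(z1), dtype=torch.bool, device=z1.device)
    neg = sim[mask].view(len(z1), -1)              # (B, B-1) negatives: all other pairs

    # ADNCE reweighting: Gaussian over the negative scores, mean-normalized, detached
    weight = 1 / (sigma * math.sqrt(2 * math.pi)) * torch.exp(-0.5 * ((neg - mu) / sigma) ** 2)
    weight = weight / weight.mean()

    loss = -torch.log(pos / (pos + (neg * weight.detach()).sum(dim=1)))
    return loss.mean()

# Toy usage with random embeddings
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(adnce_loss(z1, z2).item())

As in the pseudo code above, the weights are detached: gradients flow through the reweighted negative scores themselves, not through the weighting function.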

Image Experiments

The code can be found in the image directory.

Sentence Experiments

The code can be found in the sentence directory.

Graph Experiments

The code can be found in the graph directory.

Citation

If you find this repo useful for your research, please consider citing our paper:

@inproceedings{wu2023ADNCE,
  author = {Junkang Wu and Jiawei Chen and Jiancan Wu and Wentao Shi and Xiang Wang and Xiangnan He},
  title = {Understanding Contrastive Learning via Distributionally Robust Optimization},
  booktitle = {NeurIPS},
  year = {2023}
}

For any clarifications, comments, or suggestions, please create an issue or contact me ([email protected]).
