Training code for lambda-ECLIPSE #2

Open
j-min opened this issue Apr 5, 2024 · 1 comment

Comments

j-min commented Apr 5, 2024

Hi, thanks for sharing the inference code for lambda-ECLIPSE; nice work!

Do you plan to release the training code for lambda-ECLIPSE? Such resource-efficient training would be very useful for many low-resource groups. I'll be looking forward to trying this.

Member

Maitreyapatel commented Apr 7, 2024

Hello @j-min, thanks for showing interest in our work.

TL;DR: We are building an end-to-end project that can help us utilize the true potential of CLIP models for T2I. But public release may take some time.

Long answer:

Current progress/hype in T2I is largely due to three works: SDXL-Turbo (inference-time efficiency), DALL-E 3 (SOTA), and Playground v2.5 (highly aesthetic). The majority of such works use clever tricks (like synthetic captions, DPO, etc.) which we haven't yet explored in ECLIPSE/UnCLIP settings. Hence, the true potential is still a mystery to us!!

Due to the effort involved in building the end-to-end system, we are projecting a partial release in May. We are also looking for help/collaborations to speed up the process and explore its use cases in unknown territories. If you are interested, please reach out to me at: [email protected]

Alternative: The training is pretty straightforward and uses a mix of contrastive learning and projection loss, as described in the paper.
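For anyone who wants to attempt a reimplementation before the official release, here is a minimal sketch of what such a combined objective could look like in PyTorch. This is not the authors' code: the function name `eclipse_loss`, the embedding dimensions, and the weighting factor `lambda_c` are all my own assumptions; only the general recipe (MSE-style projection loss plus a CLIP-style symmetric contrastive term) comes from the comment above and the paper's description.

```python
import torch
import torch.nn.functional as F


def eclipse_loss(pred_img_emb, target_img_emb, text_emb,
                 temperature=0.07, lambda_c=0.2):
    """Hypothetical sketch of a combined projection + contrastive objective.

    pred_img_emb:   (B, D) embeddings predicted by the prior from text
    target_img_emb: (B, D) ground-truth CLIP image embeddings
    text_emb:       (B, D) CLIP text embeddings for the same batch
    """
    # Projection loss: regress the predicted embedding onto the
    # target CLIP image embedding (simple mean-squared error).
    proj_loss = F.mse_loss(pred_img_emb, target_img_emb)

    # Contrastive loss: CLIP-style symmetric InfoNCE between the
    # predicted image embeddings and their paired text embeddings.
    pred = F.normalize(pred_img_emb, dim=-1)
    txt = F.normalize(text_emb, dim=-1)
    logits = pred @ txt.t() / temperature
    labels = torch.arange(logits.size(0), device=logits.device)
    contrastive = (F.cross_entropy(logits, labels)
                   + F.cross_entropy(logits.t(), labels)) / 2

    return proj_loss + lambda_c * contrastive
```

In a training loop this would be computed on batches of (caption, image) pairs after running a frozen CLIP encoder over both modalities; the relative weight `lambda_c` and the temperature would need tuning against the values reported in the paper.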
