
Properly integrate OpenAI's batch api into all generation scripts #170

Open
Kipok opened this issue Oct 7, 2024 · 0 comments


Kipok commented Oct 7, 2024

We already support the batch API directly in the model, but it needs to be exposed in all relevant scripts. The main API-design problem is how to wait for the results so that we can easily chain scripts together while keeping the non-batch and batch APIs on a consistent interface. One option: write the request id into the jsonl file, add checks in all relevant places, and fail if the batch is not done yet.
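A minimal sketch of the marker idea, assuming the first line of the output jsonl is used as a sentinel (`batch_request_id` is a hypothetical field name, and `write_batch_marker`/`read_batch_marker`/`ensure_batch_done` are illustrative helpers, not existing functions in the repo). The completion check relies on the OpenAI Python SDK's `client.batches.retrieve()`, whose returned object has a `status` field that becomes `"completed"` when results are ready:

```python
import json

# Hypothetical field name marking a still-pending batch request.
BATCH_MARKER_KEY = "batch_request_id"


def write_batch_marker(jsonl_path, batch_id):
    """Record a pending batch id as the sole line of the output jsonl."""
    with open(jsonl_path, "w") as f:
        f.write(json.dumps({BATCH_MARKER_KEY: batch_id}) + "\n")


def read_batch_marker(jsonl_path):
    """Return the pending batch id, or None if the file holds real results."""
    with open(jsonl_path) as f:
        first_line = f.readline()
    if not first_line:
        return None
    record = json.loads(first_line)
    return record.get(BATCH_MARKER_KEY)


def ensure_batch_done(jsonl_path, client):
    """Fail fast if the file still points at an unfinished batch.

    `client` is an `openai.OpenAI` instance. Downstream scripts would call
    this before reading the jsonl, so chained non-batch and batch runs look
    identical once the batch has completed.
    """
    batch_id = read_batch_marker(jsonl_path)
    if batch_id is None:
        return  # regular (non-batch) results, nothing to check
    batch = client.batches.retrieve(batch_id)
    if batch.status != "completed":
        raise RuntimeError(
            f"Batch {batch_id} is still '{batch.status}'; rerun once it finishes."
        )
```

Under this scheme the generation script writes the marker immediately after submitting the batch, and every consumer script calls `ensure_batch_done` up front, so chaining fails loudly instead of reading half-finished output.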

Here is an initial attempt based on the prior code version, but it needs to be upgraded to work with the latest main: #79
