
Added option for specifying model config directly, for programmatic use #245

Closed

Conversation

shaltielshmid
Contributor

Removed in #187.

The parameter was originally added to allow passing the model config programmatically, without having to save it to a file first.
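
For context, here is a minimal sketch of what that enables. Everything in it (`ModelConfig`, `load_model`, and its parameters) is an illustrative stand-in, not this project's actual API:

```python
# Illustrative only: `ModelConfig` and `load_model` are hypothetical
# stand-ins, not this repo's real API.
from dataclasses import dataclass

@dataclass
class ModelConfig:
    model_name: str
    batch_size: int = 8

def load_model(model_config_path=None, model_config=None):
    """Accept either a path to a config file or an in-memory config object."""
    if model_config is not None:
        # Programmatic path: use the object directly, no file round-trip.
        return model_config
    if model_config_path is not None:
        import yaml  # requires pyyaml
        with open(model_config_path) as f:
            return ModelConfig(**yaml.safe_load(f))
    raise ValueError("Provide either model_config_path or model_config")

# Previously the config had to be written to disk first; with the added
# parameter it can be constructed in code and passed directly:
config = ModelConfig(model_name="my-org/my-model", batch_size=16)
model = load_model(model_config=config)
```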

@clefourrier
Member

Hi! Thanks for the PR - I think something like this will be interesting to add once #236 is merged

@shaltielshmid force-pushed the add-model-config-in-param branch from 80b2677 to 2340a4f on August 25, 2024, 16:23
@clefourrier
Member

Hi! Have you taken a look at the new system, merged in #269? Are there still things you want to add now?

@shaltielshmid
Contributor Author

Hi! The new system looks great!
I haven't had a chance to delve into it yet, but I believe some of the changes I made here are still relevant, although they will probably need to be reworked to match the current main branch.

It seems I didn't list the features in the main PR message; the ones I added here that are still relevant are:

1] Support for an OpenAI-compatible server - for example, if we launch a model with vLLM, we can just specify a different config and it will use the OpenAI interface instead of the TGI interface (a sketch follows this list).

2] Retry with backoff - if a single request fails (e.g., times out), we retry a few times instead of stopping the whole run (also sketched below).
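
A rough sketch of point 1 (not code from this PR; the port and model name are placeholders, and it assumes a model is already being served via `vllm serve`):

```python
from openai import OpenAI

# vLLM exposes an OpenAI-compatible endpoint, so the standard OpenAI
# client works against it by pointing base_url at the local server.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's default serving address
    api_key="EMPTY",                      # vLLM ignores the key by default
)

response = client.chat.completions.create(
    model="my-org/my-model",  # placeholder: the model vLLM was launched with
    messages=[{"role": "user", "content": "2+2="}],
    max_tokens=8,
)
print(response.choices[0].message.content)
```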
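And a minimal, self-contained sketch of point 2; the retry count, delays, and the `flaky_request` stand-in are made up for illustration:

```python
import random
import time

def with_backoff(fn, max_retries=3, base_delay=1.0):
    """Call fn(); on failure, sleep 1s, 2s, 4s, ... and retry."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception as err:
            if attempt == max_retries:
                raise  # out of retries: surface the last error
            delay = base_delay * (2 ** attempt)
            print(f"Request failed ({err!r}); retrying in {delay:.0f}s")
            time.sleep(delay)

def flaky_request():
    # Stand-in for a single inference request that sometimes times out.
    if random.random() < 0.5:
        raise TimeoutError("simulated timeout")
    return "ok"

print(with_backoff(flaky_request))
```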

If you think this is important, I'm hoping to have some time next week or the week after to update it to work with the latest main branch.

@clefourrier
Member

Hm, I think OpenAI compatibility was added for the LLM-as-a-judge, so if needed some code could be taken from that! Yep, sounds good!

@clefourrier
Member

Thanks a lot for your contrib! :)

@clefourrier
Member

Hi @shaltielshmid! Going to close this one, as I think it will be easier to start from scratch given the codebase changes - but feel free to reopen if you disagree.
