Add system prompt for OpenAI representation model #2146
Comments
Thank you for this feature request. If we consider adding system prompts as a parameter for these models, shouldn't we then also consider adding the same for all other LLMs? From a user experience perspective, if one model has that parameter, I would expect all others to also have that same functionality.
Hi @MaartenGr, thank you for the feedback! Yes, it would be better to add the same for all other LLMs that support system prompts. If you'd like, I can add them.
@Leo-LiHao If it doesn't take too much time, that would be great! I think it would be a nice user experience to have it available for all LLM-based representation models.
@MaartenGr I checked the implemented LLM-based representation models:
Let me know if I missed anything or if you'd like to discuss further steps.
Sounds good!
Agreed, it should be clear to users why a system prompt isn't available for one model while it is available for the other LLMs.
Then it might be worthwhile to switch over to chat completion when we have a system prompt, or to use chat completion by default. What do you think?
@MaartenGr I migrated Cohere's generate to the chat API (#2145). However, I'm not familiar with LangChain and LlamaCPP. I'd appreciate it if someone could lend a hand :)
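A minimal sketch of the fallback discussed above. The function and return values are hypothetical, not BERTopic's actual API; the idea is simply to prefer the chat endpoint whenever a system prompt is supplied, since completion-style endpoints have no notion of a system role:

```python
def choose_endpoint(system_prompt=None, chat=False):
    """Hypothetical helper: pick which API to call.

    Prefer chat completion whenever a system prompt is supplied,
    because the plain completion endpoint cannot carry a system role.
    """
    if chat or system_prompt is not None:
        return "chat.completions"
    return "completions"
```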
Feature request
Currently, the system prompt of the OpenAI representation model is fixed. This feature would let developers customize the system prompt.
Motivation
Give developers the flexibility to set their own system prompt for OpenAI models.
Your contribution
#2145 Added system prompt for OpenAI models
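As a sketch of the idea behind the change (the default prompt text and helper name here are illustrative, not BERTopic's actual implementation): a user-supplied system prompt is threaded into the chat messages, falling back to a fixed default when none is given.

```python
# Placeholder default; the real default prompt lives inside BERTopic.
DEFAULT_SYSTEM_PROMPT = "You are a helpful assistant."

def build_chat_messages(user_prompt, system_prompt=None):
    """Hypothetical helper: assemble chat-completion messages,
    using the caller's system prompt when provided and a fixed
    default otherwise."""
    return [
        {"role": "system", "content": system_prompt or DEFAULT_SYSTEM_PROMPT},
        {"role": "user", "content": user_prompt},
    ]
```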