
Add system prompt for OpenAI representation model #2146

Open
Leo-LiHao opened this issue Sep 12, 2024 · 6 comments

@Leo-LiHao

Feature request

Currently, the system prompt of the OpenAI representation model is fixed. This feature would allow developers to customize the system prompt.

Motivation

Give developers the flexibility to set their own system prompt for OpenAI models.

Your contribution

#2145 Added system prompt for OpenAI models
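For illustration, a minimal sketch of how the proposed parameter could be used (the `system_prompt` argument is the addition from #2145 and is not part of the current release; the client, model, and prompt values are placeholders):

```python
import openai
from bertopic import BERTopic
from bertopic.representation import OpenAI

client = openai.OpenAI(api_key="sk-...")

# `system_prompt` is the proposed addition; today the system prompt is fixed internally.
representation_model = OpenAI(
    client,
    model="gpt-4o-mini",
    chat=True,
    system_prompt="You are a labelling assistant. Return a short, human-readable topic label only.",
)

topic_model = BERTopic(representation_model=representation_model)
```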

@MaartenGr
Owner

Thank you for this feature request. If we consider adding system prompts as a parameter for these models, shouldn't we then also consider adding the same for all other LLMs? From a user experience perspective, if one model has that parameter, I would expect all others to also have that same functionality.

@Leo-LiHao
Author

Hi @MaartenGr, thank you for the feedback! Yes, it would be better to add the same for all other LLMs that support system prompts. If you'd like, I can add them.

@MaartenGr
Owner

@Leo-LiHao If it doesn't take too much time, that would be great! I think it would be a nice user experience to have it available for all LLM-based representation models.

@Leo-LiHao
Author

Leo-LiHao commented Sep 15, 2024

@MaartenGr I checked the implemented LLM-based representation models:

  • Cohere: We're using the legacy generate API, which doesn't support system prompts. The newer chat API does support them, so we could consider migrating if needed.
  • LangChain: The system prompt is set when creating the chain, so the feature is already supported implicitly. We could update the documentation to include an example that uses a system prompt when configuring the chain, which could clarify this for users.
  • LlamaCPP: From what I understand, we used this library for text generation rather than chat, so system prompts aren’t supported.

Let me know if I missed anything or if you'd like to discuss further steps.

@MaartenGr
Owner

> Cohere: We're using the legacy generate API, which doesn't support system prompts. The newer chat API does support them, so we could consider migrating if needed.

Sounds good!

> LangChain: The system prompt is set when creating the chain, so the feature is already supported implicitly. We could update the documentation to include an example that uses a system prompt when configuring the chain, which could clarify this for users.

Agreed, it should be clear to users why there isn't a system prompt parameter here while one is available for the other LLMs.
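For the documentation, a sketch along these lines could work, assuming a chain built with LangChain's `ChatPromptTemplate` (prompt wording and variable names are only illustrative; the inputs BERTopic's `LangChain` wrapper expects may differ):

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

# The system prompt is baked into the chain itself, so BERTopic does not
# need a separate system_prompt parameter for LangChain-based representations.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You label topics with short, human-readable names."),
    ("human", "Documents:\n{documents}\n\nKeywords: {keywords}\n\nGive one topic label."),
])
chain = prompt | ChatOpenAI(model="gpt-4o-mini", temperature=0)
```

The resulting chain would then be passed to `bertopic.representation.LangChain`, provided its input variables match what the wrapper supplies.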

> LlamaCPP: From what I understand, we used this library for text generation rather than chat, so system prompts aren’t supported.

Then it might be worthwhile to switch over to chat completion when we have a system prompt, or to use chat completion by default. What do you think?
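For what it's worth, a rough sketch of what the chat-completion path could look like with llama-cpp-python (model path and prompts are placeholders):

```python
from llama_cpp import Llama

llm = Llama(model_path="path/to/model.gguf")

# create_chat_completion takes an OpenAI-style message list, so a system
# prompt can be passed alongside the user prompt.
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You label topics with short names."},
        {"role": "user", "content": "Keywords: climate, emissions, warming. Topic label:"},
    ],
)
label = response["choices"][0]["message"]["content"]
```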

@Leo-LiHao
Author

@MaartenGr I migrated Cohere's generate to chat API (#2145).

However, I'm not familiar with LangChain and LlamaCPP. I'd appreciate it if someone could lend a hand :)
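For reference, a minimal sketch of the Cohere chat endpoint with a system-style prompt via `preamble` (model name and prompts are placeholders; the exact wiring in #2145 may differ):

```python
import cohere

co = cohere.Client("YOUR_COHERE_API_KEY")

# Unlike the legacy generate endpoint, the chat endpoint accepts a `preamble`,
# which plays the role of a system prompt.
response = co.chat(
    model="command-r",
    message="Keywords: climate, emissions, warming. Give a short topic label.",
    preamble="You label topics with short, human-readable names.",
)
print(response.text)
```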
