Guardrails support of AzureOpenAI with openai>1.0.0[bug] #576
Comments
This seems more like it's passing too many arguments to the `create` call. Can you try removing temperature and max_tokens and see if it works?
Thanks for your reply. I removed temperature and max_tokens but still get the same error. This error is similar to issues previously raised for openai>1.x, but I am wondering what the solution is for Azure OpenAI, where these parameters should be set in order to call openai.chat.completions, according to https://www.guardrailsai.com/docs/integrations/azure_openai. Thanks.
@shima-khoshraftar - Facing a similar issue. Could you please tell me which version of guardrails currently works with openai v0.28?
@Aman0509 The latest release of guardrails works with Azure OpenAI with openai==0.28.
I am wondering if there is any update to this issue? I am also trying to use guardrails with Azure OpenAI and openai>1.0.0. |
You can now use AzureOpenAI with guardrails using
Please follow this example (just substitute with AzureOpenAI) instead.
My bad. I thought I added the link! Just updated the comment.
Thanks for letting us know about the update and sending the link. However, the link does not seem to work; it cannot find examples/litellm_example.ipynb and throws a page-not-found error. Could you please update the link? Thanks.
@shima-khoshraftar I'm also looking at this example since I would like to use
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 14 days. |
This issue was closed because it has been stalled for 14 days with no activity. |
Describe the bug
Does Guardrails-ai support AzureOpenAI with openai library >1.0, which has a different LLM call API from openai==0.28 (openai.chat.completions.create instead of openai.ChatCompletion.create)?
To Reproduce
openai.api_key = api_key
openai.azure_endpoint = azure_endpoint
openai.api_type = 'azure'
openai.api_version = api_version

raw_llm_response, validated_response, *rest = guard(
    # openai.ChatCompletion.create,  # openai==0.28 call style
    openai.chat.completions.create,
    prompt_params={"document": content[:6000]},
    # engine="text-davinci-003",
    model='gpt-35-turbo-1106',
    max_tokens=2048,
    temperature=0.3,
)
Expected behavior
I expect it to call the LLM just as it did with openai==0.28 (openai.ChatCompletion.create), but I get this error instead:
TypeError: create() takes 1 argument(s) but 2 were given
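A hypothetical stand-in (not guardrails internals, and no openai install needed) sketches the shape of this error: in openai>=1.x, chat.completions.create expects keyword arguments, so a wrapper that hands the prompt over positionally triggers exactly this kind of TypeError.

```python
# Hypothetical stand-in illustrating the error shape; names are invented.
class FakeCompletions:
    # Like openai>=1.x chat.completions.create, accepts keywords only.
    def create(self, **kwargs):
        return kwargs

def guard_style_call(llm_api, prompt, **kwargs):
    # If a wrapper passes the prompt positionally, a keyword-only
    # create() cannot bind it and raises TypeError.
    return llm_api(prompt, **kwargs)

try:
    guard_style_call(FakeCompletions().create, "Summarize:", model="gpt-35-turbo-1106")
except TypeError as err:
    print(err)  # e.g. "create() takes 1 positional argument but 2 were given"
```

This is only an illustration of the symptom, not a claim about where guardrails passes the argument.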
I realized that this was fixed for OpenAI by setting the api_key through os.environ, but how can I do the same for AzureOpenAI?
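For reference, a minimal sketch of the analogous environment-variable setup for Azure: the openai>=1.x AzureOpenAI client can read its configuration from environment variables, so exporting these before the call plays the same role that os.environ-based api_key setup plays for plain OpenAI. The variable names below are the ones the v1 client documents; the values are placeholders.

```python
import os

# Placeholder values -- substitute your own resource details.
os.environ["AZURE_OPENAI_API_KEY"] = "<your-azure-openai-key>"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://<your-resource>.openai.azure.com/"
os.environ["OPENAI_API_VERSION"] = "<your-api-version>"

# With these set, AzureOpenAI() from the openai package can be constructed
# without passing api_key / azure_endpoint / api_version explicitly.
```

Whether this alone resolves the guardrails call path is untested here; it only mirrors the os.environ fix mentioned above for the Azure client.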
Library version:
Guardrails-ai 0.3.2
openai 1.12.0
Thanks.