Describe the bug
I'm getting this error, not exactly sure why:
```
ModuleNotFoundError                       Traceback (most recent call last)
/home/kenny/workspace/elapse/packages/research/notebooks/wized-guardrail.ipynb Cell 4 line 1
----> 1 from guardrails import Guard
      3 guard = Guard.from_pydantic(output_class=GuardRails, prompt=prompt)

File ~/.local/lib/python3.10/site-packages/guardrails/__init__.py:3
      1 # Set up __init__.py so that users can do from guardrails import Response, Schema, etc.
----> 3 from guardrails.guard import Guard
      4 from guardrails.llm_providers import PromptCallableBase
      5 from guardrails.logging_utils import configure_logging

File ~/.local/lib/python3.10/site-packages/guardrails/guard.py:20
     17 from eliot import add_destinations, start_action
     18 from pydantic import BaseModel
---> 20 from guardrails.llm_providers import get_async_llm_ask, get_llm_ask
     21 from guardrails.prompt import Instructions, Prompt
     22 from guardrails.rail import Rail

File ~/.local/lib/python3.10/site-packages/guardrails/llm_providers.py:5
      2 from typing import Any, Awaitable, Callable, Dict, List, Optional, cast
      4 import openai
----> 5 import openai.error
      6 from pydantic import BaseModel
      7 from tenacity import retry, retry_if_exception_type, wait_exponential_jitter

ModuleNotFoundError: No module named 'openai.error'
```
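A quick way to confirm the diagnosis is to check whether the legacy `openai.error` module is importable at all. This small helper (`module_exists` is a hypothetical name for illustration, not part of guardrails or openai) probes for a module without actually importing it:

```python
import importlib.util


def module_exists(name: str) -> bool:
    """Return True if `name` can be found on the import path.

    find_spec locates a module without executing it; if the parent
    package itself is missing, find_spec raises ModuleNotFoundError,
    which we treat as "not installed".
    """
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        return False


# On openai >= 1.0, module_exists("openai.error") returns False:
# the submodule was removed in the 1.x rewrite, which is exactly the
# import that guardrails trips over above.
```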
To Reproduce
```python
# %%
%pip -q install guardrails-ai openai

# %%
from enum import Enum
from pydantic import BaseModel, Field

class RelatedEnum(int, Enum):
    one = 1
    two = 2
    three = 3
    four = 4
    five = 5

class PriorityEnum(int, Enum):
    one = 1
    two = 2
    three = 3

class GuardRails(BaseModel):
    related_ranking: RelatedEnum = Field(RelatedEnum.three, description="Related ranking")
    priority_ranking: PriorityEnum = Field(PriorityEnum.two, description="Priority ranking")

# %%
prompt = """Determine the following based on the user's query:

Query: `${query}`

- Is the user's query related to the platform Wized?
- What is the likely priority of the user's query?

Ranking:
1. Not related
2. Somewhat related
3. Related
4. Very related
5. Extremely related

Priority:
1. Low
2. Medium
3. High

${gr.complete_json_suffix_v2}"""

# %%
from guardrails import Guard

guard = Guard.from_pydantic(output_class=GuardRails, prompt=prompt)

# %%
query = "What is the best way to get started with Wized?"

# %%
import openai

raw_llm_output, validated_output = guard(
    openai.Completion.create,
    prompt_params={"query": query},
    engine="text-davinci-003",
    max_tokens=1024,
    temperature=0.3,
)

# %%
raw_llm_output

# %%
validated_output
```
Expected behavior
The import should succeed: `from guardrails import Guard` should not raise a `ModuleNotFoundError`.
Library version:
Version (e.g. 0.2.6)
Hello @kdcokenny, openai released a new major version (v1.x) yesterday with new features like JSON mode, and it is not fully backwards compatible. We've started updating our code to support openai's latest package, and we will post here once that work is complete. In the meantime, please try re-installing an earlier version:
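For example, assuming pip manages the environment, pinning to the last pre-1.0 release would look like this (0.28.1 is the final 0.x release of the openai package):

```shell
pip install "openai==0.28.1"
```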
Hey @kdcokenny, we just published guardrails-ai v0.2.7. This patch temporarily pins openai to 0.28.1 or lower. Once we have support for openai 1.x we'll remove the pin, but for now you should be able to use this latest version without having to specify the openai version yourself. Thanks again for filing this issue!
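Picking up the patched release is then a plain upgrade; assuming pip, something like:

```shell
pip install --upgrade guardrails-ai
```

Because 0.2.7 declares the openai pin in its own dependency metadata, the upgrade should also downgrade openai automatically if a 1.x version is present.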