
Feat/wrapper #90

Merged
merged 64 commits into from
Mar 21, 2024
Conversation

csgulati09
Collaborator

Title: Revamp: wrapper around OpenAI SDK

Description:

  • Existing routes: chat, completion, embedding. The approach has changed: these now call the OpenAI SDK instead of making a POST request to our Gateway
  • The post, feedback, and prompt routes are untouched
  • Introduced new routes: images, files, assistants + sub-routes, threads + sub-routes
  • New test cases have been added as well

Motivation:
This will make it easier for us to integrate newly introduced routes from OpenAI.
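The wrapper approach described above can be sketched as follows. This is a minimal, illustrative mock, not the SDK's actual code: `FakeOpenAIClient` stands in for the real `openai.OpenAI` client, and the method and header names are assumptions for the sake of the example.

```python
class FakeOpenAIClient:
    """Stand-in for openai.OpenAI; only models the call we delegate to it."""

    def __init__(self, base_url, default_headers):
        self.base_url = base_url
        self.default_headers = default_headers

    def create_chat_completion(self, model, messages):
        # A real client would send an HTTP request; we echo the routing info.
        return {"model": model, "via": self.base_url}


class Portkey:
    """Wrapper: configures the inner SDK client to route through the gateway."""

    def __init__(self, api_key, virtual_key=None):
        self._client = FakeOpenAIClient(
            base_url="https://api.portkey.ai/v1",
            default_headers={
                "x-portkey-api-key": api_key,
                "x-portkey-virtual-key": virtual_key or "",
            },
        )

    def chat_create(self, model, messages):
        # Delegate to the SDK client instead of POSTing to the Gateway directly.
        return self._client.create_chat_completion(model, messages)


client = Portkey(api_key="dummy-key", virtual_key="openai-vk")
result = client.chat_create("gpt-4", [{"role": "user", "content": "hi"}])
```

Because every route delegates to the inner client, a new OpenAI endpoint only needs a thin delegating method rather than a hand-rolled request/response pair.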

Related Issues:
#89

@csgulati09 csgulati09 self-assigned this Feb 24, 2024
@dosubot dosubot bot added the size:XXL This PR changes 1000+ lines, ignoring generated files. label Feb 24, 2024
@csgulati09 csgulati09 requested a review from VisargD February 24, 2024 14:21
@VisargD
Collaborator

VisargD commented Mar 2, 2024

Can you please check and remove the comments from the PR?

setup.cfg — outdated review thread, resolved
tests/models.json — outdated review thread, resolved
tests/models.json — outdated review thread, resolved
tests/models.json — outdated review thread, resolved
@VisargD
Collaborator

VisargD commented Mar 7, 2024

Hey @csgulati09 - I can see that the test directories for threads, assistants and images contain configs with virtual keys from providers like Cohere, Anthropic and Anyscale. Should we remove them, as these providers do not support these methods?

for i in get_configs(f"{CONFIGS_PATH}/single_provider"):
t3_params.append((client, i))

@pytest.mark.parametrize("client, provider, auth, model", t3_params)
Collaborator

t3_params only has 2 items per tuple, client and virtual_key, as per the above initialization. So @pytest.mark.parametrize("client, provider, auth, model", t3_params) will fail, as it expects 4.
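The arity mismatch the reviewer describes can be demonstrated without running pytest: `parametrize` requires each tuple in the parameter list to supply exactly one value per declared argument name. The tuple contents below are placeholders, not the repo's actual fixtures.

```python
# Each entry carries only (client, virtual_key), i.e. arity 2.
t3_params = [("sync_client", "vk-openai"), ("async_client", "vk-azure")]

# The decorator as written declares four argument names.
argnames = "client, provider, auth, model"
n_declared = len([name.strip() for name in argnames.split(",")])
n_supplied = len(t3_params[0])
# 4 declared vs 2 supplied -> pytest raises a collection-time error.
assert (n_declared, n_supplied) == (4, 2)

# The fix: declare only the names the tuples actually carry.
fixed_argnames = "client, virtual_key"
assert len(fixed_argnames.split(",")) == n_supplied  # arities now agree
```

With the corrected names, `@pytest.mark.parametrize("client, virtual_key", t3_params)` unpacks each tuple cleanly.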

Collaborator Author

Done

tests/test_threads.py — outdated review thread, resolved
for i in get_configs(f"{CONFIGS_PATH}/single_with_basic_config"):
t2_params.append((client, i))

@pytest.mark.parametrize("client, provider, auth, model", t2_params)
Collaborator

The same problem as in the assistants tests here as well: t2_params only has client and virtual_key, but the parametrize expects 4.

Collaborator Author

Done

@@ -33,3 +33,4 @@
PORTKEY_GATEWAY_URL = PORTKEY_BASE_URL
PORTKEY_API_KEY_ENV = "PORTKEY_API_KEY"
PORTKEY_PROXY_ENV = "PORTKEY_PROXY"
OPEN_AI_API_KEY = "DUMMY-KEY"
Collaborator

I think this is not required anymore, right? We can remove it if that's the case.

Collaborator Author

We need some value as a placeholder. Even though it has no significance, we need some string to pass.

setup.cfg Outdated
@@ -42,6 +42,7 @@ dev =
python-dotenv==1.0.0
ruff==0.0.292
pytest-asyncio==0.23.5
openai>=1.12.0,<1.12.9
Collaborator

We might have to allow all the 1.x.x versions here, as openai is already on version 1.13.0. We might also have to test this by installing portkey in a project that already has openai installed.
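A loosened constraint that accepts every 1.x.x release would look like this in the dev extras of setup.cfg (illustrative; the exact bound the maintainers ultimately chose is not shown in this thread):

```
openai>=1.12.0,<2.0.0
```

The `<2.0.0` upper bound still guards against a future major release with breaking API changes, while no longer conflicting with projects that already pin a newer 1.x version of openai.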

Collaborator Author

Done

model=model, messages=messages, **kwargs
)
json_response = json.loads(response.text)
return ChatCompletions(**json_response)
Collaborator

I think that due to this, get_headers() is also not working, because only the JSON fields are getting passed to the model. Need to debug this further, but get_headers() is returning None. Please check this once.
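The failure mode the reviewer describes can be sketched as follows: constructing the model from `json.loads(response.text)` keeps only the JSON body, so anything that lives outside it, such as HTTP headers, is dropped. The class and attribute names below are illustrative stand-ins, not the SDK's actual implementation; one possible fix is to re-attach the raw headers after construction.

```python
import json


class ChatCompletions:
    """Illustrative stand-in for the SDK's response model."""

    def __init__(self, **fields):
        self.__dict__.update(fields)
        self._headers = None  # not part of the JSON body, so lost by default

    def get_headers(self):
        return self._headers


class FakeResponse:
    """Stand-in for the raw HTTP response object."""

    text = '{"id": "cmpl-1", "choices": []}'
    headers = {"x-portkey-trace-id": "abc123"}


response = FakeResponse()
json_response = json.loads(response.text)
completion = ChatCompletions(**json_response)
# Without this step, get_headers() returns None, matching the reported bug:
completion._headers = dict(response.headers)
```

The underlying point: the response model is populated purely from parsed JSON, so header data must be carried over explicitly from the transport layer.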

Collaborator Author

Done

@dosubot dosubot bot added the lgtm label Mar 21, 2024
@VisargD
Collaborator

VisargD commented Mar 21, 2024

Closes #89

@VisargD VisargD linked an issue Mar 21, 2024 that may be closed by this pull request
@VisargD VisargD merged commit 923ed41 into main Mar 21, 2024
2 of 6 checks passed
@VisargD VisargD deleted the feat/wrapper branch March 21, 2024 12:59
Labels
auto:enhancement lgtm size:XXL This PR changes 1000+ lines, ignoring generated files.
Development

Successfully merging this pull request may close these issues.

Revamp Python SDK: Wrapper
2 participants