
Feat/async #85

Merged
merged 16 commits into main from feat/async
Feb 13, 2024

Conversation

csgulati09
Collaborator

Title: Async Support

Description:

  • Initialization of all the async methods in the respective __init__.py files
  • New class AsyncAPIResource created
  • New class AsyncAPIClient created
  • New class AsyncChatCompletion created to support chat completion calls
  • New class AsyncCompletion created to support completion calls
  • New class AsyncEmbeddings created to support embedding calls
  • New class AsyncFeedback created to support feedback calls
  • Likewise, AsyncGenerations, AsyncPrompts, AsyncCompletions, and AsyncPost classes created as well

To use the async client, import AsyncPortkey instead of Portkey.
This matches OpenAI's convention, where AsyncOpenAI is imported instead of OpenAI for async calls to the SDK.
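As a self-contained sketch of this sync-vs-async split (the two classes below are illustrative stubs, not the SDK's real implementation, and the create method name is an assumption for demonstration):

```python
import asyncio


class Portkey:
    """Illustrative stand-in for the sync client."""

    def create(self, prompt: str) -> str:
        return f"sync:{prompt}"


class AsyncPortkey:
    """Illustrative stand-in for the async client, mirroring AsyncOpenAI."""

    async def create(self, prompt: str) -> str:
        await asyncio.sleep(0)  # stand-in for non-blocking HTTP I/O
        return f"async:{prompt}"


async def main() -> str:
    client = AsyncPortkey()
    # Every call on the async client must be awaited.
    return await client.create("hello")


print(asyncio.run(main()))  # prints "async:hello"
```

The point of the naming convention is that call sites look identical across both clients; only the import and the `await` keyword change.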

Motivation:
Allows users to make async calls for chat, completion, and embedding

Related Issues:
#33

@dosubot dosubot bot added size:XXL This PR changes 1000+ lines, ignoring generated files. auto:enhancement labels Feb 8, 2024
except httpx.HTTPStatusError as err: # 4xx and 5xx errors
# If the response is streamed then we need to explicitly read the response
# to completion before attempting to access the response text.
err.response.read()
Collaborator

This might have to be changed to its async version, await err.response.aread(); otherwise it raises an exception when the API responds with an error code.
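A minimal sketch of the suggested fix, using a fake response object rather than a real httpx.Response (httpx requires the async aread() on responses streamed by an async client before .text is accessible):

```python
import asyncio


class FakeStreamedResponse:
    """Minimal stand-in for an httpx.Response with a streamed body."""

    def __init__(self, chunks):
        self._chunks = chunks
        self.text = ""

    async def aread(self) -> str:
        # Models httpx.Response.aread(), the async counterpart of read():
        # it must be awaited to consume the stream and populate .text.
        await asyncio.sleep(0)
        self.text = "".join(self._chunks)
        return self.text


async def handle_http_error(resp: FakeStreamedResponse) -> str:
    # Mirrors the suggested fix: await resp.aread() instead of calling the
    # sync resp.read() before accessing resp.text in the async error path.
    await resp.aread()
    return resp.text


print(asyncio.run(handle_http_error(FakeStreamedResponse(["err", "or"]))))  # prints "error"
```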

class TestChatCompletions:
client = AsyncPortkey
parametrize = pytest.mark.parametrize("client", [client], ids=["strict"])
models = read_json_file("./models.json")
Collaborator

This needs to be ./tests/models.json
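One common way to make such fixture paths independent of the working directory (a sketch of an alternative, not the PR's actual change; read_json_file here is a hypothetical helper) is to resolve them relative to the test file itself:

```python
import json
from pathlib import Path


def read_json_file(relative_path: str):
    """Load a JSON fixture relative to this file, not the current directory."""
    base = Path(__file__).resolve().parent
    with open(base / relative_path, encoding="utf-8") as f:
        return json.load(f)
```

With this, `read_json_file("models.json")` works whether pytest is launched from the repo root or from inside tests/.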

@VisargD
Collaborator

VisargD commented Feb 13, 2024

Closes #33

@dosubot dosubot bot added the lgtm label Feb 13, 2024
@VisargD VisargD merged commit c0dc31f into main Feb 13, 2024
5 of 6 checks passed
@VisargD VisargD deleted the feat/async branch February 13, 2024 17:20