
Allow Azure OpenAPI Clients #68981

Closed

Conversation

sharkymcdongles

Azure uses a different client for OpenAI. This PR uses the Azure OpenAI client when it is needed and falls back to the original client otherwise.

I had to disable the custom PyPI index for this because I can't add the latest openai package there.

This is a quick edit, so this PR mainly serves as a jumping-off point for the solution; it needs more guidance from the maintainers.

Legal Boilerplate

Look, I get it. The entity doing business as "Sentry" was incorporated in the State of Delaware in 2015 as Functional Software, Inc. and is gonna need some rights from me in order to utilize my contributions in this here PR. So here's the deal: I retain all rights, title and interest in and to my contributions, and by keeping this boilerplate intact I confirm that Sentry can use, modify, copy, and redistribute my contributions, under Sentry's choice of terms.

Signed-off-by: sharkymcdongles <[email protected]>
@sharkymcdongles sharkymcdongles requested review from a team as code owners April 16, 2024 13:00
@github-actions github-actions bot added the Scope: Backend Automatically applied to PRs that change backend components label Apr 16, 2024
@JoshFerge
Member

JoshFerge commented Apr 17, 2024

hello! thank you for the contribution. #68771 is currently in progress, and will provide a more modular way to configure LLMs for usage with Sentry.

@sharkymcdongles
Author

sharkymcdongles commented Apr 17, 2024 via email

Thanks for the reply, however it seems the PR you linked still won't work for Azure OpenAI. :(

@JoshFerge
Member

I can add Azure OpenAI as a provider as a follow-up!

@getsantry
Contributor

getsantry bot commented May 9, 2024

This pull request has gone three weeks without activity. In another week, I will close it.

But! If you comment or otherwise update it, I will reset the clock, and if you add the label WIP, I will leave it alone unless WIP is removed ... forever!


"A weed is but an unloved flower." ― Ella Wheeler Wilcox 🥀

@getsantry getsantry bot added the Stale label May 9, 2024
@getsantry getsantry bot closed this May 17, 2024
@github-actions github-actions bot locked and limited conversation to collaborators Jun 1, 2024