Remove generics of Client, enabling dynamic dispatch #125

Open · wants to merge 15 commits into main

Conversation

ifsheldon
Contributor

@ifsheldon commented Oct 21, 2023

This is an attempt to solve #106. It removes the generics of Client and uses Arc trait objects to do dynamic dispatch.

Breaking changes:

  1. The type of Client is different, so users need to change their code to use this newer version. Fortunately, the change is easy: just remove anything like <C: Config>.
  2. The Config trait is different, since Debug + 'static + Send + Sync is added as a trait bound. As long as users do not impl Config for their own types, this is perfectly fine. Otherwise, they need to impl Debug for their structs and make sure their structs don't contain non-'static references.
  3. Client::configs now returns &Arc<dyn Config>, not &C where C: Config.
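For illustration, a minimal sketch of the new shape (type and field names are simplified stand-ins for the crate's actual definitions, not the real API):

```rust
use std::sync::Arc;

// Simplified stand-in for the crate's Config trait, with the new bounds.
trait Config: std::fmt::Debug + Send + Sync + 'static {
    fn api_base(&self) -> &str;
}

#[derive(Debug)]
struct OpenAIConfig {
    api_base: String,
}

impl Config for OpenAIConfig {
    fn api_base(&self) -> &str {
        &self.api_base
    }
}

#[derive(Debug)]
struct AzureConfig {
    api_base: String,
}

impl Config for AzureConfig {
    fn api_base(&self) -> &str {
        &self.api_base
    }
}

// No `<C: Config>` parameter: one concrete Client type for all backends.
struct Client {
    config: Arc<dyn Config>,
}

impl Client {
    fn with_config(config: Arc<dyn Config>) -> Self {
        Self { config }
    }
    fn config(&self) -> &Arc<dyn Config> {
        &self.config
    }
}

fn main() {
    // Switching services at runtime is just choosing a different trait object.
    let use_azure = false;
    let config: Arc<dyn Config> = if use_azure {
        Arc::new(AzureConfig { api_base: "https://example.azure.com".into() })
    } else {
        Arc::new(OpenAIConfig { api_base: "https://api.openai.com/v1".into() })
    };
    let client = Client::with_config(config);
    assert_eq!(client.config().api_base(), "https://api.openai.com/v1");
}
```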

I think this breaking change is worthwhile, since users no longer need to color every function that uses the client with C: Config, and it makes switching between services at runtime painless.

@schneiderfelipe
Contributor

Hey, this initiative is awesome, thanks for this. A couple of questions:

  1. Wouldn't requiring Config to be 'static make it impossible to read it at run time? How would you (safely) get it from a file, for instance?

  2. Another question: why not have a GenericClient with Config as an associated type? Then the current type gets its implementation, and no breaking change is required. Plus, one is able to specify a dyn object as Box<dyn GenericClient<Config = TheOneIExpect>>. This has precedent in the Iterator trait (Box<dyn Iterator<Item = TheOneIExpect>>). No dynamic dispatch is required in the library per se, yet users are able to have it just fine.
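For reference, the associated-type alternative might look roughly like this (GenericClient here is hypothetical, and the Config methods are simplified placeholders):

```rust
// Simplified stand-in for the crate's Config trait.
trait Config {
    fn api_base(&self) -> String;
}

struct OpenAIConfig;
impl Config for OpenAIConfig {
    fn api_base(&self) -> String {
        "https://api.openai.com/v1".into()
    }
}

// Hypothetical trait with Config as an associated type, as suggested above.
trait GenericClient {
    type Config: Config;
    fn config(&self) -> &Self::Config;
}

struct Client {
    config: OpenAIConfig,
}

impl GenericClient for Client {
    type Config = OpenAIConfig;
    fn config(&self) -> &Self::Config {
        &self.config
    }
}

fn main() {
    // Users name the associated type in a dyn object, just like
    // Box<dyn Iterator<Item = u32>>.
    let client: Box<dyn GenericClient<Config = OpenAIConfig>> =
        Box::new(Client { config: OpenAIConfig });
    assert_eq!(client.config().api_base(), "https://api.openai.com/v1");
}
```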

Keep in mind I haven't thoroughly reviewed the code, so I may have made some mistakes in this comment.

@ifsheldon
Contributor Author

Wouldn't requiring Config to be 'static make impossible to read it at run time? How would you (safely) get it from a file, for instance?

'static here does not mean that only static objects can be constructed. Rather, it means the type cannot hold non-static references. A struct can have owned fields like String, i8, etc., but not &'a str unless 'a is the same as 'static.
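A small illustration of the distinction (the names below are made up for the example): a struct of owned data satisfies a 'static bound, so a config loaded from a file at runtime is perfectly fine.

```rust
// A bound like the one added to Config: the type must not borrow
// non-static data.
fn requires_static<T: 'static>(_value: T) {}

#[derive(Debug)]
struct OwnedConfig {
    api_key: String, // owned field, no borrowed lifetime
}

// By contrast, `struct BorrowedConfig<'a> { api_key: &'a str }` would NOT
// satisfy `'static` unless 'a were itself 'static.

fn main() {
    // Data read at runtime (e.g. from a file) is owned, hence 'static.
    let runtime_key = String::from("sk-loaded-from-file");
    let config = OwnedConfig { api_key: runtime_key };
    assert_eq!(config.api_key, "sk-loaded-from-file");
    requires_static(config); // compiles: OwnedConfig: 'static
}
```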

Another question, why not have a GenericClient with Config as associated type?

I think you are just moving the generics from one place to another. Ideally, a client should be agnostic to configs in terms of functionality and type signatures. And honestly, I don't think dynamic dispatch and generics make much difference in terms of performance. We've already spent trillions of cycles running transformers, after all. Programming ergonomics matters more, in my opinion. Iterator is a different case, though.

# Conflicts:
#	async-openai/src/chat.rs
#	async-openai/src/client.rs
#	async-openai/src/completion.rs
#	async-openai/src/file.rs
#	async-openai/src/image.rs
#	async-openai/src/moderation.rs
@ifsheldon
Contributor Author

@64bit Hey, I saw new releases getting published and new PRs getting merged while this PR got no feedback. Do you have any comments?

@64bit
Owner

64bit commented Nov 26, 2023

This does look like moving in the right direction long term, similar to how things looked before Azure support.

I recalled why generics were used in the first place: so that differences between Azure and OpenAI could be handled in a type-safe manner.

But now the Azure OpenAI service is not available to everyone, and the growing drift between the Azure and OpenAI specs means that, long term, async-openai should just drop support for the Azure OpenAI service. So this PR is in the right direction.

That said, the breaking change here is really breaking, so moving in this direction is a one-way street; there is no going back to a type-safe API for two platforms.

So do we target this for the next version bump, v0.18.0? If yes, do you mind updating this PR?

That's all I have on the topic.

@ifsheldon
Contributor Author

ifsheldon commented Nov 26, 2023

I recalled why generics were used in the first place: so that differences between Azure and OpenAI could be handled in a type-safe manner.

But now the Azure OpenAI service is not available to everyone, and the growing drift between the Azure and OpenAI specs means that, long term, async-openai should just drop support for the Azure OpenAI service.

I see. I rely on both Azure and OpenAI. I recently took a closer look at the differences between Azure's and OpenAI's APIs. Except for headers and API versioning, they work almost the same. Because OpenAI recently changed their APIs dramatically, the Azure and OpenAI APIs diverge a lot right now, but I think a newer version of Azure's APIs will converge with OpenAI's again.

The premise for removing the generics of Client is that the APIs of the two are not so different, but for now that premise unfortunately does not hold. If Azure is still supported, then we should wait for a newer version of Azure's APIs that converges again. If Azure support is dropped in v0.18.0, then I think we can just go ahead.

Off topic a bit:
I'm afraid we will eventually have to manage multiple sets of APIs of different versions with different concrete implementations. Azure does API versioning properly, while OpenAI for now just breaks things and moves fast. But given that OpenAI marks their APIs as v1 in their API base (i.e. https://api.openai.com/v1), I guess they will come up with a totally different set of APIs, probably as v2. When that happens, we will also need to decide whether to drop support for v1 in Config and/or how to do the dispatch (generics for static dispatch, or dynamic dispatch).

@64bit
Owner

64bit commented Nov 26, 2023

Good points!

I think Azure's APIs of a newer version will converge again with OpenAI's

Assuming that happens sometime in the near future, that would be the best time to get this PR in (and the most justified reason for anyone to deal with breaking changes), and Azure support wouldn't need to be dropped.

For now, we wait for the Azure OpenAI Service.

@64bit
Owner

64bit commented Jan 18, 2024

Hello @ifsheldon

Would you like to get this into the next version bump, perhaps v0.19.0 or v0.20.0?

It seems not worth holding off too long for the uncertainties surrounding Azure. Without access to it, it's tough for me to maintain changes related to it, so I'd be completely relying on you for testing.

This PR would also be beneficial for dynamic dispatch to other LLM / AI providers that are fully compatible with OpenAI.

# Conflicts:
#	async-openai/src/chat.rs
due to tokio::spawn in tests that requires `Send`
@ifsheldon
Contributor Author

Yeah. Noteworthy: after merging updates from main, a few tests failed to compile due to tokio::spawn, which requires Send, so I had to add more trait bounds (Send and Sync) to the Config trait and use Arc in Client.
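The compile failure can be illustrated without tokio, since std::thread::spawn imposes the same Send requirement on whatever the closure captures (the names below are simplified stand-ins for the crate's types):

```rust
use std::sync::Arc;

// Send + Sync on the trait makes Arc<dyn Config> itself Send + Sync.
trait Config: Send + Sync + 'static {
    fn api_base(&self) -> &'static str;
}

struct OpenAIConfig;
impl Config for OpenAIConfig {
    fn api_base(&self) -> &'static str {
        "https://api.openai.com/v1"
    }
}

// With Rc<dyn Config> instead of Arc, or without Send + Sync on Config,
// the spawn below would fail to compile: the closure would not be Send.
struct Client {
    config: Arc<dyn Config>,
}

fn main() {
    let client = Client { config: Arc::new(OpenAIConfig) };
    // Spawning moves `client` to another thread, so Client must be Send.
    let handle = std::thread::spawn(move || client.config.api_base());
    assert_eq!(handle.join().unwrap(), "https://api.openai.com/v1");
}
```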

@ifsheldon
Contributor Author

ifsheldon commented Jan 18, 2024

I think Azure support is still fine, though incomplete, because only the latest version of Azure's APIs is, and will be, compatible with OpenAI's. For example, I think this crate, as of now, can only support the 2023-12-01-preview version of the Azure APIs, which is basically the same as OpenAI's APIs. But I think we should mention this somewhere in the documentation.

@64bit
Owner

64bit commented Jan 18, 2024

Ah nice, anything on Azure that works with the latest of OpenAI's APIs is good.

Please feel free to repurpose/augment the note about Azure in README for documentation.

Enabling extra configs that can be opted in, which may be needed for LLM services other than OpenAI
@ifsheldon
Contributor Author

I think I've got a better idea. In 639eaa4, I added BaseConfig and a new Config with a generic parameter and a generic default. API users can now opt into extra configs at the cost of writing their own Client. The only issue is that API users need to impl BaseConfig instead of the previous Config, which is a breaking change.

With this, any differences between LLM services/deployments can be handled in a type-safe manner. API users just need to impl Config<SomeExtraDataType> and propagate this generic and the extra data to where they are needed. Client now accepts any type that impls Config<()>, which is ensured by a blanket impl as long as the type impls BaseConfig.
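A minimal sketch of the blanket-impl idea, with simplified placeholder methods (the actual traits in 639eaa4 differ):

```rust
// Common behaviour every config must provide.
trait BaseConfig {
    fn api_base(&self) -> String;
}

// Config carries an Extra type parameter with a default of ().
trait Config<Extra = ()>: BaseConfig {
    fn extra_config(&self) -> Extra;
}

// Blanket impl: any BaseConfig is automatically a Config with no extra data,
// so Client-style code can accept `impl Config<()>` from every user type.
impl<T: BaseConfig> Config<()> for T {
    fn extra_config(&self) {}
}

struct OpenAIConfig;
impl BaseConfig for OpenAIConfig {
    fn api_base(&self) -> String {
        "https://api.openai.com/v1".into()
    }
}

fn main() {
    // A stand-in for a Client method bound on Config<()>.
    fn takes_config(c: &impl Config<()>) -> String {
        c.api_base()
    }
    assert_eq!(takes_config(&OpenAIConfig), "https://api.openai.com/v1");
}
```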

We could make Client generic with a default as well, like

pub struct Client<ExtraConfigs = ()> {
    http_client: reqwest::Client,
    config: Arc<dyn Config<ExtraConfigs>>,
    backoff: backoff::ExponentialBackoff,
}

but I think it's overly generic for now.

Tell me what you think @64bit

@64bit
Owner

64bit commented Jan 23, 2024

That's a clever approach.

I see that the blanket Config<()> would be the case for dynamic dispatching. However, from the library user's perspective, any non-blanket case would require them to write separate code anyway, so the type for the extra config must be managed in the user application, without being built into async-openai, for example as a separate function parameter or a generic type for a function parameter.

Moreover, extra_config() is not used in the library itself, which also suggests it can live outside the library, in the user's application.

So there's no rationale for making Config generic as Config<ExtraConfig>, because ExtraConfig plays no role in async-openai itself.

@ifsheldon
Contributor Author

OK, makes sense.

When I rethink making Config generic, it occurs to me that various Config implementations are not enough; we also need various Clients that make requests based on those configs. So we should probably factor the functionality of the existing Client into traits. For example, a client that impls a ChatClient trait would have create and create_stream methods (factored out of the existing Chat).

Then we can have an OpenAIClient that impls ChatClient, ImageClient, etc., and an AzureClient can impl a different set of these functionality traits. Any other backend can also have its own client struct, but with unified interfaces defined by the traits.
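A rough sketch of what such capability traits might look like (the trait names come from this comment, but the signatures are placeholders, not the crate's actual API):

```rust
// One trait per capability, factored out of the monolithic Client.
trait ChatClient {
    fn create(&self, prompt: &str) -> String;
}

trait ImageClient {
    fn generate(&self, prompt: &str) -> String;
}

struct OpenAIClient;
impl ChatClient for OpenAIClient {
    fn create(&self, prompt: &str) -> String {
        format!("openai chat: {prompt}")
    }
}
impl ImageClient for OpenAIClient {
    fn generate(&self, prompt: &str) -> String {
        format!("openai image: {prompt}")
    }
}

// A different backend may implement only a subset of the traits.
struct AzureClient;
impl ChatClient for AzureClient {
    fn create(&self, prompt: &str) -> String {
        format!("azure chat: {prompt}")
    }
}

fn main() {
    // Code written against the trait works with any backend implementing it.
    fn chat(client: &dyn ChatClient) -> String {
        client.create("hi")
    }
    assert_eq!(chat(&OpenAIClient), "openai chat: hi");
    assert_eq!(chat(&AzureClient), "azure chat: hi");
}
```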

But this would be a radical change.

For now, I will revert 639eaa4

@64bit
Owner

64bit commented Jan 23, 2024

That's a good idea too. However, we want to keep things simple without falling into the overengineering trap; above all, just working with OpenAI should be prioritized, even if that means the library doesn't work with any other providers.

The assumption with other providers is that they provide a compatible API; if the APIs differ significantly, it would become a nightmare to maintain, so async-openai should limit its scope to working well with just one provider: OpenAI.

Another perspective I often think about: if we were to use ideas (and code) from this crate to implement an OpenAPI spec for some other, unrelated API provider, the good ideas should spill over, meaning we have a good abstraction for generating Rust code from an OpenAPI spec. This perspective helps me keep things simple too.

@ifsheldon
Contributor Author

However, we want to keep things simple without falling into overengineering trap, above all, just working with openai should be prioritized even if that means library doesn't work with any other providers.

Yeah, that would be async-litellm in Rust.

It seems not worth holding too long for uncertainties surrounding Azure. Not having access to it is tough to maintain changes related to it. So I'd be completely relying on you for testing.

If you feel it's ready, you can merge this before the next release. I don't think I have anything major to add now, and the tests seem fine.

@64bit
Owner

64bit commented Jan 26, 2024

Thank you @ifsheldon, I'll give it one last review.

Hey @deipkumar, this was your feature request. Do you want to take it for a test drive and give us feedback (you're also welcome to review the PR) before it makes it into a release?

@64bit added the "upcoming release" label on Jan 26, 2024
@64bit removed the "upcoming release" label on May 4, 2024