Update README (async) #88

Merged · 4 commits · Feb 23, 2024
112 changes: 73 additions & 39 deletions README.md
<div align="center">
<img src="https://assets.portkey.ai/header.png" height=150><br />

## Control Panel for AI Apps
```bash
pip install portkey-ai
```
</div>

## Features

### AI Gateway
<table>
<tr>
<td width=50%><b>Unified API Signature</b><br />If you've used OpenAI, you already know how to use Portkey with any other provider.</td>
<td><b>Interoperability</b><br />Write once, run with any provider. Switch between any model from any provider seamlessly.</td>
</tr>
<tr>
<td width=50%><b>Automated Fallbacks & Retries</b><br />Ensure your application remains functional even if a primary service fails.</td>
<td><b>Load Balancing</b><br />Efficiently distribute incoming requests among multiple models.</td>
</tr>
<tr>
<td width=50%><b>Semantic Caching</b><br />Reduce costs and latency by intelligently caching results.</td>
<td><b>Virtual Keys</b><br />Secure your LLM API keys by storing them in Portkey vault and using disposable virtual keys.</td>
</tr>
<tr>
<td width=50%><b>Request Timeouts</b><br />Manage unpredictable LLM latencies by setting custom timeouts on your requests.</td>
</tr>
</table>
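Features like fallbacks, retries, and load balancing are driven by a gateway config attached to the client. The sketch below is illustrative only: the exact config schema and the `config` parameter are assumptions based on Portkey's fallback mode, so check the Portkey docs for the authoritative shape.

```python
# Hypothetical sketch of a gateway config: try the primary virtual key
# first, and fall back to a second provider if the primary call fails.
# The field names below are illustrative, not a guaranteed schema.
fallback_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-virtual-key"},     # primary target
        {"virtual_key": "anthropic-virtual-key"},  # tried if the primary errors
    ],
}

# A client would receive this config, e.g.:
# portkey = Portkey(api_key="PORTKEY_API_KEY", config=fallback_config)
print(fallback_config["strategy"]["mode"])
```

Because the config is plain data, the same request code can run against a different provider lineup just by swapping the targets.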

### Observability
<table width=100%>
<tr>
<td width=50%><b>Logging</b><br />Keep track of all requests for monitoring and debugging.</td>
<td width=50%><b>Request Tracing</b><br />Understand the journey of each request for optimization.</td>
</tr>
<tr>
<td width=50%><b>Custom Metadata</b><br />Segment and categorize requests for better insights.</td>
<td width=50%><b>Feedbacks</b><br />Collect and analyse weighted feedback on requests from users.</td>
</tr>
<tr>
<td width=50%><b>Analytics</b><br />Track your app & LLM's performance with 40+ production-critical metrics in a single place.</td>
</tr>
</table>
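Tracing and custom metadata work by tagging each request. The sketch below only shows what such tags might look like; the field names (`trace_id`, `metadata`, the `_user` key) and the `with_options` call in the comment are assumptions, not the SDK's guaranteed signature — see Portkey's docs for the real parameters.

```python
# Hypothetical sketch: tags attached to a request so it can be traced
# and segmented in the dashboard later. Field names are illustrative.
request_options = {
    "trace_id": "checkout-flow-7f3a",   # groups related requests into one trace
    "metadata": {
        "_user": "user_123",            # who triggered the call
        "environment": "staging",       # custom segmentation tag
    },
}

# These options would be passed alongside a request, e.g.:
# portkey.with_options(**request_options).chat.completions.create(...)
print(sorted(request_options["metadata"]))
```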

## Usage

#### Prerequisites
1. [Sign up on Portkey](https://app.portkey.ai/) and grab your Portkey API Key
2. Add your [OpenAI key](https://platform.openai.com/api-keys) to Portkey's Virtual Keys page and keep it handy

#### First, install the SDK & export Portkey API Key
```bash
# Install the SDK
$ pip install portkey-ai

# Export your Portkey API key
$ export PORTKEY_API_KEY=PORTKEY_API_KEY
```

#### Making a Request to OpenAI
* Portkey fully adheres to the OpenAI SDK signature. You can instantly switch to Portkey and start using our production features right out of the box.
* Just replace `from openai import OpenAI` with `from portkey_ai import Portkey`:
```py
from portkey_ai import Portkey

# Construct a client with a virtual key
openai = Portkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="VIRTUAL_KEY"
)

chat_completion = openai.chat.completions.create(
    messages = [{ "role": 'user', "content": 'Say this is a test' }],
    model = 'gpt-4'
)

print(chat_completion)
```

#### Async Usage
* Use `AsyncPortkey` instead of `Portkey` with `await`:
```py
import asyncio
from portkey_ai import AsyncPortkey

client = AsyncPortkey(
    api_key="PORTKEY_API_KEY",
    virtual_key="VIRTUAL_KEY"
)

async def main():
    chat_completion = await client.chat.completions.create(
        messages=[{'role': 'user', 'content': 'Say this is a test'}],
        model='gpt-4'
    )

    print(chat_completion)

asyncio.run(main())
```
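The payoff of the async client is fanning several prompts out concurrently with `asyncio.gather`. The sketch below shows only that pattern: `fake_completion` is a stand-in coroutine for where `client.chat.completions.create` would go, so it runs without an API key.

```python
import asyncio

async def fake_completion(prompt: str) -> str:
    # Stand-in for `await client.chat.completions.create(...)`;
    # sleeps briefly to mimic network latency.
    await asyncio.sleep(0.01)
    return f"echo: {prompt}"

async def main() -> list[str]:
    prompts = ["Say this is a test", "Say hello", "Say goodbye"]
    # All three "requests" are in flight at the same time;
    # gather preserves the prompt order in the results.
    return await asyncio.gather(*(fake_completion(p) for p in prompts))

results = asyncio.run(main())
print(results)
```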

---

#### [📝 Full Documentation](https://docs.portkey.ai/docs) | [🛠️ Integration Requests](https://github.com/Portkey-AI/portkey-python-sdk/issues)
#### [Check out Portkey docs for the full list of supported providers](https://portkey.ai/docs/welcome/what-is-portkey#ai-providers-supported)

<a href="https://twitter.com/intent/follow?screen_name=portkeyai"><img src="https://img.shields.io/twitter/follow/portkeyai?style=social&logo=twitter" alt="follow on Twitter"></a>
<a href="https://discord.gg/sDk9JaNfK8" target="_blank"><img src="https://img.shields.io/discord/1143393887742861333?logo=discord" alt="Discord"></a>

#### Contributing
Get started by checking out the GitHub issues. Email us at [email protected] or ping us on Discord to chat.