supporting Gemini #74
@koreanssam, Gemini models are already supported (see #69). What you're seeing are warnings, so it should be fine. Could you print the output of the zerox API to check whether you're getting sensible output? I'm also assuming you're setting the correct API key as per the example: `os.environ["GEMINI_API_KEY"] = "your-api-key"`
The great news is that Gemini is supported. For those interested, here's a quick code snippet to get started with Gemini:
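A minimal sketch of what that looks like, based on the py-zerox README; treat the exact `zerox()` signature as an assumption, and the model string is the one from the logs in this thread:

```python
import os
import asyncio

# Assumed setup: replace with your real key before running.
os.environ["GEMINI_API_KEY"] = "your-api-key"

async def convert(file_path: str):
    # Lazy import so the snippet degrades gracefully if py-zerox isn't installed.
    from pyzerox import zerox
    return await zerox(file_path=file_path, model="gemini/gemini-1.5-flash-002")

if __name__ == "__main__":
    try:
        result = asyncio.run(convert("path/to/document.pdf"))
        print(result)
    except ModuleNotFoundError:
        print("py-zerox is not installed; run `pip install py-zerox` first")
```

The `file_path` here is a placeholder; point it at your own PDF.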
Now, let's dive a bit deeper. Before I do, I must say how much I like using Gemini. It's my go-to LLM, and I've tried it in several other places, too.
The biggest perk of Gemini is its free tier. However, as with all things free, there are some limitations. For example, if you're working with a PDF over 10 pages, you might hit Google's rate limit for the Gemini model under the free tier. It seems what you're facing is just a warning, as mentioned by @pradhyumna85.
These limitations with Gemini inspired me to open this PR. I hope this helps!
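As a side note on those free-tier rate limits: a common workaround is to retry with exponential backoff. This is a generic sketch, not part of zerox; the helper name and parameters are my own, and in real code you'd catch the provider's specific rate-limit exception rather than the broad `Exception` used here:

```python
import random
import time

def with_backoff(call, max_retries=5, base=1.0):
    """Retry `call` with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return call()
        except Exception:  # in practice, catch the provider's RateLimitError
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            # Wait 1s, 2s, 4s, ... (scaled by `base`) with random jitter.
            time.sleep(base * (2 ** attempt + random.random()))
```

Wrapping each per-page request in a helper like this lets longer PDFs make slow progress instead of failing outright when the free tier throttles you.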
`model = "gemini/gemini-1.5-flash-002"`
```
2024-10-25 23:04:56,975 - INFO - HTTP Request: POST https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash-002:generateContent?key=secret^^ "HTTP/1.1 200 OK"
2024-10-25 23:05:16,321 - INFO -
LiteLLM completion() model= gemini-1.5-flash-002; provider = gemini
23:05:16 - LiteLLM:WARNING: vertex_ai_non_gemini.py:198 - No text in user content. Adding a blank text to user content, to ensure Gemini doesn't fail the request. Relevant Issue - https://github.com/BerriAI/litellm/issues/5515
2024-10-25 23:05:16,332 - WARNING - No text in user content. Adding a blank text to user content, to ensure Gemini doesn't fail the request. Relevant Issue - https://github.com/BerriAI/litellm/issues/5515
```
How can I use Gemini?