Fix v1beta api compatibility, modify tests and examples to use `gem… #13

Merged 1 commit on May 17, 2024
Project.toml (2 changes: 1 addition & 1 deletion)
@@ -1,7 +1,7 @@
name = "GoogleGenAI"
uuid = "903d41d1-eaca-47dd-943b-fee3930375ab"
authors = ["Tyler Thomas <[email protected]>"]
version = "0.3.0"
version = "0.3.1"

[deps]
Base64 = "2a0f44e3-6c83-55bd-87e4-b1978d98bd5f"
README.md (28 changes: 14 additions & 14 deletions)
@@ -33,14 +33,14 @@ Create a [secret API key in Google AI Studio](https://makersuite.google.com/)
using GoogleGenAI

secret_key = ENV["GOOGLE_API_KEY"]
model = "gemini-pro"
model = "gemini-1.5-flash-latest"
prompt = "Hello"
response = generate_content(secret_key, model, prompt)
println(response.text)
```
outputs
```julia
"Hello there! How may I assist you today? Feel free to ask me any questions you may have or give me a command. I'm here to help! 😊"
"Hello! 👋 How can I help you today? 😊"
```

```julia
@@ -50,7 +50,7 @@ println(response.text)
```
outputs
```julia
"Hello there, how may I assist you today?"
"Hello! 👋 How can I help you today? 😊"
```

```julia
@@ -71,40 +71,40 @@
### Multi-turn conversations

```julia
# Define the provider with your API key (placeholder here)
using GoogleGenAI

provider = GoogleProvider(api_key=ENV["GOOGLE_API_KEY"])
api_kwargs = (max_output_tokens=50,)
model_name = "gemini-pro"
model = "gemini-1.5-flash-latest"
conversation = [
Dict(:role => "user", :parts => [Dict(:text => "When was Julia 1.0 released?")])
]

-response = generate_content(provider, model_name, conversation)
+response = generate_content(provider, model, conversation)
push!(conversation, Dict(:role => "model", :parts => [Dict(:text => response.text)]))
println("Model: ", response.text)

push!(conversation, Dict(:role => "user", :parts => [Dict(:text => "Who created the language?")]))
-response = generate_content(provider, model_name, conversation; api_kwargs)
+response = generate_content(provider, model, conversation; api_kwargs)
println("Model: ", response.text)
```
outputs
```julia
"Model: August 8, 2018"

"Model: Jeff Bezanson, Alan Edelman, Viral B. Shah, Stefan Karpinski, and Keno Fischer
"Model: Julia 1.0 was released on **August 8, 2018**."

Julia Computing, Inc. is the company that provides commercial support for Julia."
"Model: Julia was created by a team of developers at MIT, led by **Jeff Bezanson, Stefan Karpinski, Viral B. Shah, and Alan Edelman**."
```

### Count Tokens
```julia
using GoogleGenAI
-n_tokens = count_tokens(ENV["GOOGLE_API_KEY"], "gemini-pro", "Hello")
+model = "gemini-1.5-flash-latest"
+n_tokens = count_tokens(ENV["GOOGLE_API_KEY"], model, "The Julia programming language")
println(n_tokens)
```
outputs
```julia
-1
+4
```

### Create Embeddings
@@ -169,7 +169,7 @@ safety_settings = [
Dict("category" => "HARM_CATEGORY_HARASSMENT", "threshold" => "BLOCK_MEDIUM_AND_ABOVE"),
Dict("category" => "HARM_CATEGORY_DANGEROUS_CONTENT", "threshold" => "BLOCK_LOW_AND_ABOVE")
]
model = "gemini-pro"
model = "gemini-1.5-flash-latest"
prompt = "Hello"
api_kwargs = (safety_settings=safety_settings,)
response = generate_content(secret_key, model, prompt; api_kwargs)
src/GoogleGenAI.jl (2 changes: 1 addition & 1 deletion)
@@ -71,7 +71,7 @@ function _parse_response(response::HTTP.Messages.Response)
concatenated_texts = join(all_texts, "")
candidates = [Dict(i) for i in parsed_response[:candidates]]
finish_reason = candidates[end][:finishReason]
-safety_rating = Dict(parsed_response.promptFeedback.safetyRatings)
+safety_rating = Dict(parsed_response.candidates[end].safetyRatings)

return (
candidates=candidates,
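The one-line change above moves the safety-rating lookup from `promptFeedback` to the last candidate. As a point of reference, here is a minimal sketch of the response shape that change appears to assume; the JSON body is illustrative only, and the JSON3-style field access is an assumption based on the dot and symbol indexing visible in `_parse_response`, not something stated in the PR.

```julia
using JSON3

# Illustrative (assumed) v1beta-style response body, trimmed to the fields
# that _parse_response touches. Note there is no promptFeedback block here.
raw = """
{
  "candidates": [
    {
      "content": {"parts": [{"text": "Hello!"}], "role": "model"},
      "finishReason": "STOP",
      "safetyRatings": [
        {"category": "HARM_CATEGORY_HARASSMENT", "probability": "NEGLIGIBLE"}
      ]
    }
  ]
}
"""

parsed_response = JSON3.read(raw)

# Old path: parsed_response.promptFeedback.safetyRatings
# -> would fail on this body, since promptFeedback is absent.
# New path, mirroring the patched line: read the ratings off the last candidate.
safety_ratings = parsed_response.candidates[end].safetyRatings
println(safety_ratings[1].category)  # prints HARM_CATEGORY_HARASSMENT
```

Under that assumed shape, reading from `candidates[end]` works whether or not a `promptFeedback` block is present, which appears to be the point of the compatibility fix.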
test/runtests.jl (6 changes: 3 additions & 3 deletions)
@@ -10,7 +10,7 @@ if haskey(ENV, "GOOGLE_API_KEY")
http_kwargs = (retries=2,)
# Generate text from text
response = generate_content(
secret_key, "gemini-pro", "Hello"; api_kwargs, http_kwargs
secret_key, "gemini-1.5-flash-latest", "Hello"; api_kwargs, http_kwargs
)

# Generate text from text+image
@@ -26,10 +26,10 @@ if haskey(ENV, "GOOGLE_API_KEY")
# Multi-turn conversation
conversation = [Dict(:role => "user", :parts => [Dict(:text => "Hello")])]
response = generate_content(
secret_key, "gemini-pro", conversation; api_kwargs, http_kwargs
secret_key, "gemini-1.5-flash-latest", conversation; api_kwargs, http_kwargs
)

-n_tokens = count_tokens(secret_key, "gemini-pro", "Hello")
+n_tokens = count_tokens(secret_key, "gemini-1.5-flash-latest", "Hello")
@test n_tokens == 1

embeddings = embed_content(secret_key, "embedding-001", "Hello")
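For completeness, the updated tests can be exercised locally with the standard Pkg workflow; the key value below is a hypothetical placeholder and not part of the PR.

```julia
using Pkg

# The test suite only runs the API calls when GOOGLE_API_KEY is set;
# replace the placeholder with a real key before running.
ENV["GOOGLE_API_KEY"] = "<your-api-key>"
Pkg.test("GoogleGenAI")
```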