v1.4.1 works with the following code, but v1.5.0 no longer does (Ollama version is 0.5.7):
```python
from smolagents.agents import ToolCallingAgent
from smolagents import CodeAgent
from smolagents import tool, LiteLLMModel, DuckDuckGoSearchTool, PythonInterpreterTool, VisitWebpageTool
from typing import Optional

model = LiteLLMModel(
    model_id="ollama_chat/qwen2.5",
    api_base="http://localhost:11434",  # replace with remote OpenAI-compatible server if necessary
    # api_key="your-api-key",  # replace with API key if necessary
    num_ctx=102400,
)

agent = CodeAgent(
    tools=[],
    model=model,
    add_base_tools=True,
    additional_authorized_imports=["requests", "bs4", "os"],
)
print(agent.run("download https://raw.githubusercontent.com/huggingface/smolagents/refs/heads/main/src/smolagents/tool_validation.py and list existing functions contained in file"))
```
Ollama returns HTTP 400 Bad Request:
```text
╭─────────────────────────────────────────────────────── New run ────────────────────────────────────────────────────────╮
│                                                                                                                         │
│ download https://raw.githubusercontent.com/huggingface/smolagents/refs/heads/main/src/smolagents/tool_validation.py and│
│ list existing functions contained in file                                                                              │
│                                                                                                                         │
╰─ LiteLLMModel - ollama_chat/qwen2.5 ────────────────────────────────────────────────────────────────────────────────────╯
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ Step 0 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.
Error in generating model output:
litellm.APIConnectionError: Ollama_chatException - Client error '400 Bad Request' for url 'http://localhost:11434/api/chat'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
[Step 0: Duration 0.02 seconds]
```
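As the LiteLLM message above suggests, enabling verbose logging before running the agent should print the request body that Ollama rejects, which makes it possible to diff the v1.4.1 and v1.5.0 payloads directly:

```python
import litellm

# Dump the full request LiteLLM sends to http://localhost:11434/api/chat,
# including the JSON body that Ollama answers with 400.
litellm.set_verbose = True
```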
Using tcpdump, I see no particular difference apart from the payload size: 9370 bytes with v1.5.0 vs. 9293 bytes with v1.4.1. (The filter below matches packets whose TCP payload starts with the ASCII bytes "POST", i.e. 0x504F5354.)
With v1.5.0:
```text
sudo tcpdump -vvv -i any -s 0 -A 'tcp[((tcp[12:1] & 0xf0) >> 2):4] = 0x504F5354'
tcpdump: data link type LINUX_SLL2
tcpdump: listening on any, link-type LINUX_SLL2 (Linux cooked v2), snapshot length 262144 bytes
12:05:25.611240 lo In IP (tos 0x0, ttl 64, id 35460, offset 0, flags [DF], proto TCP (6), length 253)
    localhost.47512 > localhost.11434: Flags [P.], cksum 0xfef1 (incorrect -> 0x5f72), seq 3424847324:3424847525, ack 1962811321, win 512, options [nop,nop,TS val 2477752340 ecr 2477752340], length 201
E.....@[email protected]..........,..#..t..............
........POST /api/chat HTTP/1.1
Host: localhost:11434
Accept: */*
Accept-Encoding: gzip, deflate
Connection: keep-alive
User-Agent: litellm/1.59.7
Content-Length: 9370
Content-Type: application/json
```
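To capture the exact JSON payload without tcpdump, another option is to point `api_base` at a small local sink that logs the request body and diff the captures between the two smolagents versions. A minimal sketch (the port 11435 and the error reply are my choices, not anything from smolagents or LiteLLM):

```python
# Minimal HTTP sink: prints the JSON body that LiteLLM POSTs to /api/chat.
# Run it, set api_base="http://localhost:11435" in LiteLLMModel, and diff
# the payloads captured under smolagents v1.4.1 and v1.5.0.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class DumpHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print(json.dumps(json.loads(body), indent=2))
        # Reply with an error so the agent stops after a single request.
        self.send_response(500)
        self.end_headers()

HTTPServer(("localhost", 11435), DumpHandler).serve_forever()
```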
Using ollama/qwen2.5 instead of ollama_chat/qwen2.5 works, but ollama_chat/qwen2.5 is the recommended way for smolagents (if I remember correctly). And the results don't seem as good as with ollama_chat (I have to experiment more to be sure). The only change is the model prefix, as sketched below.
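For reference, a minimal sketch of the workaround, assuming everything else stays as in the original snippet (as far as I understand, LiteLLM's ollama/ prefix goes through Ollama's /api/generate endpoint rather than the /api/chat route that returns 400 here):

```python
from smolagents import LiteLLMModel

# Workaround sketch: same setup, only the model_id prefix changes
# from "ollama_chat/" to "ollama/".
model = LiteLLMModel(
    model_id="ollama/qwen2.5",
    api_base="http://localhost:11434",
    num_ctx=102400,
)
```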