
Uncaught Error: error sending request for url #249

Open
knightmarehs opened this issue Aug 12, 2024 · 11 comments
Comments

@knightmarehs

The agents can't generate conversations, the traceback is as follows:

8/12/2024, 5:33:59 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught Error: error sending request for url (http://127.0.0.1:11434/api/embeddings): error trying to connect: tcp connect error: Connection refused (os error 61)
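A "connection refused" on port 11434 usually means nothing is listening there, i.e. the Ollama server itself isn't running or isn't reachable from wherever the Convex action executes. A minimal sanity check, assuming a standard local Ollama install:

# Start the Ollama server if it isn't already running
ollama serve

# In a second terminal, confirm the API responds (this lists installed models)
curl http://127.0.0.1:11434/api/tags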

@ianmacartney
Collaborator

ianmacartney commented Aug 12, 2024 via email

@uannyao

uannyao commented Sep 29, 2024

Hi there, I'm facing a similar issue: the agents can't generate conversations, but the traceback says the request was forbidden:

9/29/2024, 2:53:34 PM [CONVEX A(aiTown/agentOperations:agentGenerateMessage)] Uncaught Error: Request to http://localhost:11434/api/embeddings forbidden

@likeUMR

likeUMR commented Oct 5, 2024

I have the same problem too!!!

Uncaught Error: Request to http://127.0.0.1:11434/api/embeddings forbidden

When I change the embeddings endpoint to embed (based on a post about the new Ollama API docs, so it's http://127.0.0.1:11434/api/embed), the problem still exists!

I have tested the API in the terminal and it seems to work correctly:
curl -X POST http://127.0.0.1:11434/api/embed \
  -H "Content-Type: application/json" \
  -d '{"model":"mxbai-embed-large","input":"Stella is talking to Lucky"}'

I don't know how to configure Convex to fix this problem. Any ideas?
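For reference, a sketch of the two Ollama embedding endpoints, based on the public Ollama API docs (field names here are worth double-checking against the current docs): the older /api/embeddings takes a "prompt" and returns a single "embedding", while the newer /api/embed takes an "input" and returns a list of "embeddings".

# Older endpoint: "prompt", returns {"embedding": [...]}
curl -X POST http://127.0.0.1:11434/api/embeddings \
  -H "Content-Type: application/json" \
  -d '{"model":"mxbai-embed-large","prompt":"Stella is talking to Lucky"}'

# Newer endpoint: "input", returns {"embeddings": [[...]]}
curl -X POST http://127.0.0.1:11434/api/embed \
  -H "Content-Type: application/json" \
  -d '{"model":"mxbai-embed-large","input":"Stella is talking to Lucky"}'

Since the curl above already works, the "forbidden" error is likely not about which endpoint is used but about where the request originates, which matches the .env.local suggestion further down the thread.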

@quanchentg

@likeUMR did you resolve this issue or figure something out? Thanks.

@quanchentg

@ianmacartney is there any clue for this issue? Thanks.

@ianmacartney
Collaborator

ianmacartney commented Oct 22, 2024 via email

@leol15

leol15 commented Oct 27, 2024

For me this happened when the frontend was pointing to a Convex backend managed by Convex in the cloud (dashboard.convex.dev).

Check your .env.local file and make sure VITE_CONVEX_URL=http://127.0.0.1:3210 is there. If not, you can regenerate it with npm run dev:backend after clearing the contents of .env.local.

Then you might need to restart the frontend app (npm run dev:frontend) to pick up the change.
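A sketch of those steps in order (script names are taken from the comment above; adjust if your package.json differs):

# 1. Clear the stale cloud URL from .env.local
: > .env.local

# 2. Regenerate .env.local against the local backend
npm run dev:backend

# 3. Confirm the frontend now points at the local backend
grep VITE_CONVEX_URL .env.local
# expected: VITE_CONVEX_URL=http://127.0.0.1:3210

# 4. Restart the frontend so it picks up the change
npm run dev:frontend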

@quanchentg

@leol15 Thanks for your nice suggestions. Let me give it a try!

@Tsailj

Tsailj commented Nov 22, 2024

@leol15 Hi, I'd like to ask: are you saying that by using the local Convex backend instead of the cloud Convex you were able to communicate with Ollama successfully? The problem I'm running into is that the connection to Ollama is refused. My Convex is the cloud one (dashboard.convex.dev), and .env.local is also set to the cloud URL rather than http://127.0.0.1:3210. Were you able to avoid the connection-refused problem just by using http://127.0.0.1:3210?

@Tsailj

Tsailj commented Nov 23, 2024

Uncaught Error: Request to http://127.0.0.1:11434/api/embeddings forbidden

When I change the embeddings endpoint to embed (based on a post about the new Ollama API docs, so it's http://127.0.0.1:11434/api/embed), the problem still exists!

Hello, I'm facing the same issue as you. I also changed 'embeddings' to 'embed', but I'm still getting an error (Uncaught Error: Request to http://localhost:11434/api/embed forbidden). Have you found a solution? Thank you.

@leol15

leol15 commented Nov 24, 2024

@Tsailj Yes, the local backend worked for me. This is the relevant command to start the backend: https://github.com/a16z-infra/ai-town/blob/main/Justfile#L31

I'm unsure how the cloud backend would be able to communicate with Ollama running locally; I'd appreciate it if others could explain!
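One possible explanation, hedged: on a cloud deployment the agent actions run on Convex's servers, so http://127.0.0.1:11434 there points at those servers rather than your machine, which would explain the refused/forbidden errors above. A rough sketch of how a cloud deployment could still reach a local Ollama is to expose it through a tunnel and point the deployment at the public URL (OLLAMA_HOST is an assumption here; check the README or the LLM config in the repo for the exact variable name the project reads):

# Expose the local Ollama server through a tunnel (ngrok as one example)
ngrok http 11434

# Point the cloud Convex deployment at the tunnel's public URL
# (OLLAMA_HOST assumed; confirm the variable name AI Town actually reads)
npx convex env set OLLAMA_HOST https://<your-tunnel-url>

Either way, the local-backend route described earlier in the thread sidesteps the problem entirely.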
