[BUG] Ollama embeddings (BAAI/bge-m3) model not found #29336
Comments
Can you try using …
@sardanabhav thanks for the reply, but yes, I already tried that.
That's weird. It works for me. Can you share the package versions you are using for reproduction?
venv: Python 3.12.8, langchain==0.3.15
I am not able to reproduce the issue with the given dependency versions. It works with both …
Is it possible that the error is caused by the latest version of Ollama? In the setup shown I'm using a dockerized instance served on :11434.
Sorry, I wasn't able to test your version of Ollama. I will test it today and let you know.
Checked other resources
Example Code
from langchain_core.documents import Document
from langchain_community.vectorstores import FAISS
from langchain_ollama.embeddings import OllamaEmbeddings

EMBED_MODEL_ID = "BAAI/bge-m3"

# Minimal documents so the snippet runs standalone
docs = [Document(page_content="hello world")]
embeddings = OllamaEmbeddings(model=EMBED_MODEL_ID)
vectorstore = FAISS.from_documents(docs, embeddings)
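One way to narrow this down (a sketch, not part of the original report) is to call the embedding model directly, without FAISS, so any "model not found" error can be attributed to the Ollama server or the model name rather than the vector store. The base_url value below is an assumption matching the :11434 setup described in the thread.

from langchain_ollama.embeddings import OllamaEmbeddings

# Assumed local endpoint; adjust if Ollama is served elsewhere
embeddings = OllamaEmbeddings(
    model="BAAI/bge-m3",
    base_url="http://localhost:11434",
)

# A single embed_query call is enough to surface a "model not found"
# error if the server does not know the model under this name
vector = embeddings.embed_query("hello world")
print(len(vector))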
Error Message and Stack Trace (if applicable)
Description
Ollama is served on :11434 and the embedding model was pulled correctly.
Any suggestions?
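A hedged suggestion, not from the thread: Ollama may register the model under its own tag (e.g. bge-m3) rather than the HuggingFace-style BAAI/bge-m3 identifier, so listing what the server actually serves can reveal a name mismatch. The sketch below assumes the optional ollama Python client is installed and that the server runs locally on :11434.

import ollama

# Assumed host; matches the dockerized server on :11434 from the report
client = ollama.Client(host="http://localhost:11434")

# Print whatever the server reports so the registered model names can be
# compared against the "BAAI/bge-m3" identifier used in the example code
print(client.list())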
System Info