
[BUG] Ollama embeddings (BAAI/bge-m3) model not found #29336

Open · 5 tasks done
Francesco9932 opened this issue Jan 21, 2025 · 7 comments

Comments

@Francesco9932

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

from langchain_core.documents import Document
from langchain_ollama.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import FAISS

docs = [Document(page_content="example text")]  # placeholder documents

EMBED_MODEL_ID = "BAAI/bge-m3"
embeddings = OllamaEmbeddings(model=EMBED_MODEL_ID)
vectorstore = FAISS.from_documents(docs, embeddings)

Error Message and Stack Trace (if applicable)

[Screenshot: Ollama "model not found" error]

Description

Ollama is served on :11434 and the embedding model was pulled correctly.
Any suggestions?

System Info

[Screenshots of system information]

@sardanabhav
Contributor

Can you try using EMBED_MODEL_ID = "bge-m3"?
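
For reference, a minimal sketch of the suggested change, assuming the model was pulled locally with `ollama pull bge-m3` (the tag name here is an assumption about the local setup):

from langchain_ollama.embeddings import OllamaEmbeddings

# Ollama expects its own tag names (e.g. "bge-m3"), not Hugging Face
# repo ids like "BAAI/bge-m3"; the tag must match a locally pulled model.
embeddings = OllamaEmbeddings(model="bge-m3")
print(embeddings.embed_query("sanity check")[:5])  # first 5 dimensions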

@Francesco9932
Author

@sardanabhav thanks for the reply, but yes, I already tried.
With EMBED_MODEL_ID = "bge-m3", Ollama raises an error that bge-m3 is not a valid model repo.
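
One way to see which tags the server actually knows about is Ollama's /api/tags endpoint, which lists locally pulled models. A minimal sketch, assuming the default endpoint on :11434:

import requests

# List the models the Ollama server has pulled locally.
resp = requests.get("http://localhost:11434/api/tags", timeout=10)
resp.raise_for_status()
for m in resp.json().get("models", []):
    print(m["name"])  # e.g. "bge-m3:latest"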

@sardanabhav
Contributor

@sardanabhav thanks for the reply, but yes, I already tried. With EMBED_MODEL_ID = "bge-m3", Ollama raises an error that bge-m3 is not a valid model repo.

That's weird. It works for me. Can you share the package versions you are using so I can try to reproduce?

@Francesco9932
Author

Francesco9932 commented Jan 22, 2025

@sardanabhav

venv: Python 3.12.8

langchain==0.3.15
langchain_experimental==0.3.4
langchain_ollama==0.2.2
langchain_huggingface==0.1.2
faiss-cpu==1.9.0.post1

@sardanabhav
Contributor

I am not able to reproduce the issue with the given dependency versions. It works with both EMBED_MODEL_ID = "bge-m3" and EMBED_MODEL_ID = "bge-m3:latest".

@Francesco9932
Author

Francesco9932 commented Jan 22, 2025

Is it possible that the error is caused by the latest version of Ollama?

In the setup shown I'm using:
ollama version is 0.5.7-0-ga420a45-dirty Warning: client version is 0.5.5

It is dockerized and served on :11434.
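
To rule out a client/server mismatch, the server build can be queried directly; a minimal sketch, assuming the dockerized server on the default port:

import requests

# /api/version reports the server build, which can differ from the
# CLI client version shown in the warning above.
print(requests.get("http://localhost:11434/api/version", timeout=10).json())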

@Francesco9932 Francesco9932 changed the title Ollama embeddings (BAAI/bge-m3) model not found [BUG] Ollama embeddings (BAAI/bge-m3) model not found Jan 23, 2025
@sardanabhav
Contributor

Sorry, I haven't been able to test with your version of Ollama yet. I will test it today and let you know.
