Add magpie support llama cpp ollama #1086

Merged
merged 14 commits into develop from feat/add-magpie-support-llama-cpp-ollama on Jan 8, 2025

Conversation

davidberenstein1957
Member

I added extended Magpie support for Ollama and llama.cpp.

  • Ollama support
  • llama.cpp support
  • minor refactors w.r.t. imports of the distilabel.models module

I looked into adding the OpenAI API format for other providers, but this does not work because tokenization is handled server-side and cannot be disabled out of the box.

Perhaps we can refactor the HF InferenceClient a bit to make this work.
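
For context, here is a minimal sketch (not distilabel internals; the tokenizer id and template string are illustrative) of why client-side templating matters: Magpie applies the chat template itself and leaves it open at the user turn, so the model generates the user query. A chat endpoint that templates server-side closes the turn and breaks this.

from transformers import AutoTokenizer

# Render the system turn as a raw string so the template stays under our control.
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceTB/SmolLM2-360M-Instruct")
prompt = tokenizer.apply_chat_template(
    [{"role": "system", "content": "You are a helpful assistant."}],
    tokenize=False,
)

# "qwen2"-style pre-query template: open a user turn with no content, so the
# model completes it by generating an instruction.
prompt += "<|im_start|>user\n"

# `prompt` must reach the model verbatim as a raw completion; an OpenAI-format
# chat endpoint would apply its own template server-side and close the turn.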

Llama.cpp

from distilabel.models import LlamaCppLLM
from distilabel.steps.tasks import Magpie

llm = LlamaCppLLM(
    model_path="smollm2-360m-instruct-q8_0.gguf",  # local GGUF weights
    tokenizer_id="HuggingFaceTB/SmolLM2-360M-Instruct",  # tokenizer for client-side chat templating
    magpie_pre_query_template="qwen2",  # pre-query template matching the model's chat format
)
magpie = Magpie(llm=llm)
magpie.load()

print(next(magpie.process(inputs=[{"system_prompt": "You are a helpful assistant."}])))
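
model_path points to a local GGUF file. A quick way to fetch it, assuming the weights are published under HuggingFaceTB/SmolLM2-360M-Instruct-GGUF (repo id not confirmed in this PR):

from huggingface_hub import hf_hub_download

# Download the quantized weights once; the function returns the local file path.
model_path = hf_hub_download(
    repo_id="HuggingFaceTB/SmolLM2-360M-Instruct-GGUF",  # assumed repo id
    filename="smollm2-360m-instruct-q8_0.gguf",
)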

Ollama

from distilabel.models import OllamaLLM
from distilabel.steps.tasks import Magpie

llm = OllamaLLM(
    model="llama3.1",  # model served by the local Ollama instance
    tokenizer_id="meta-llama/Meta-Llama-3-8B-Instruct",  # tokenizer for client-side chat templating
    magpie_pre_query_template="llama3",  # pre-query template matching Llama 3's chat format
)
magpie = Magpie(llm=llm)
magpie.load()

print(next(magpie.process(inputs=[{"system_prompt": "You are a helpful assistant."}])))
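
OllamaLLM assumes a local Ollama server is running and the model has already been pulled. With the ollama Python client that is roughly:

import ollama

# Equivalent to `ollama pull llama3.1` on the CLI; downloads the model if missing.
ollama.pull("llama3.1")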


codspeed-hq bot commented Dec 23, 2024

CodSpeed Performance Report

Merging #1086 will not alter performance

Comparing feat/add-magpie-support-llama-cpp-ollama (f4229c9) with develop (f1f7d77)

Summary

✅ 1 untouched benchmark

@davidberenstein1957 davidberenstein1957 marked this pull request as ready for review December 23, 2024 16:01
@davidberenstein1957 davidberenstein1957 changed the title Feat/add magpie support llama cpp ollama Add magpie support llama cpp ollama Dec 24, 2024

github-actions bot commented Jan 8, 2025

Documentation for this PR has been built. You can view it at: https://distilabel.argilla.io/pr-1086/

@burtenshaw burtenshaw (Contributor) left a comment

Just some minor comments on documentation, but looks ok.

src/distilabel/models/llms/ollama.py (review thread, outdated, resolved)
@davidberenstein1957 davidberenstein1957 merged commit 344cce7 into develop Jan 8, 2025
8 checks passed
@davidberenstein1957 davidberenstein1957 deleted the feat/add-magpie-support-llama-cpp-ollama branch January 8, 2025 16:19