Make transformers dependency explicit #240

Merged: 3 commits, Aug 16, 2023
10 changes: 9 additions & 1 deletion README.md
@@ -64,6 +64,10 @@ You can follow [@NormalComputing](https://twitter.com/NormalComputing), [@remilo
 pip install outlines
 ```
 
+The dependencies needed to use models are not installed by default. You will need to run:
+
+- `pip install openai` to be able to use OpenAI [models](https://platform.openai.com/docs/api-reference).
+- `pip install transformers` to be able to use HuggingFace `transformers` [models](https://huggingface.co/models?pipeline_tag=text-generation).
 
 ## Guided generation
 
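As an aside on the README change above: before reaching for one of the optional backends, a user can check which of them are importable in the current environment. This is a minimal sketch, not part of this PR; it relies only on the standard library's `importlib.util.find_spec`, which returns `None` for packages that are not installed.

```python
from importlib.util import find_spec

# Map each optional backend named in the README to whether it is importable.
# find_spec only locates the package; nothing is imported, so this is cheap.
available = {
    name: find_spec(name) is not None
    for name in ("openai", "transformers")
}
print(available)
```

Depending on what is installed, this prints e.g. `{'openai': False, 'transformers': True}`.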
@@ -94,7 +98,11 @@ import outlines.models as models
 
 model = models.transformers("gpt2")
 
-prompt = labelling("Just awesome", examples)
+prompt = """You are a sentiment-labelling assistant.
+Is the following review positive or negative?
+
+Review: This restaurant is just awesome!
+"""
 answer = generate.choice(model, ["Positive", "Negative"])(prompt)
 ```
 
7 changes: 6 additions & 1 deletion outlines/models/transformers.py
@@ -87,7 +87,12 @@ def convert_token_to_string(self, token: str) -> str:
 
 
 def transformers(model_name: str, device: Optional[str] = None, **model_kwargs):
-    from transformers import AutoModelForCausalLM
+    try:
+        from transformers import AutoModelForCausalLM
+    except ImportError:
+        raise ImportError(
+            "The `transformers` library needs to be installed in order to use `transformers` models."
+        )
 
     model = AutoModelForCausalLM.from_pretrained(model_name, **model_kwargs)
     tokenizer = TransformersTokenizer(model_name)
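The guarded import added in the diff above is an instance of a common optional-dependency pattern: import lazily, and turn a bare `ModuleNotFoundError` into an actionable message. A generic sketch of the pattern, where `load_backend` is a hypothetical helper and not part of the outlines API:

```python
import importlib


def load_backend(module_name: str, install_hint: str):
    """Import an optional dependency lazily, failing with an actionable message."""
    try:
        return importlib.import_module(module_name)
    except ImportError as e:
        raise ImportError(
            f"The `{module_name}` library needs to be installed "
            f"in order to use `{module_name}` models. {install_hint}"
        ) from e


# The error only surfaces when the optional backend is actually requested:
try:
    load_backend("surely_not_installed_xyz", "Try `pip install surely-not-installed-xyz`.")
except ImportError as exc:
    print(exc)
```

Chaining with `from e` preserves the original traceback, so users still see where the underlying import failed.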
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -6,7 +6,7 @@ build-backend = "setuptools.build_meta"
 name = "outlines"
 authors= [{name = "Normal Computing", email = "[email protected]"}]
 description = "Probabilistic Generative Model Programming"
-requires-python = ">=3.7"
+requires-python = ">=3.10"
 keywords=[
     "normal computing",
     "machine learning",