
Make torch and transformers imports optional #815

Merged: 1 commit merged into dottxt-ai:main from make-torch-transformers-optional on Apr 15, 2024

Conversation

rlouf (Member) commented Apr 14, 2024

The torch and transformers libraries are installed by default with Outlines. These two dependencies are heavy and are not needed by users who only want to use the llama-cpp-python or vllm integrations. We therefore make them optional.
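
For context, a common way to implement this kind of change is to defer the heavy imports to call time and raise an informative error when the packages are missing. The sketch below illustrates that pattern only; the function name `load_transformers_model` is hypothetical, and the actual PR may structure things differently.

```python
# Minimal sketch of the optional-import pattern (not the exact code in this PR):
# heavy dependencies are imported lazily inside the function that needs them,
# and a clear error is raised when they are not installed.

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Imported only for type checkers; no runtime cost for users
    # who never touch the transformers integration.
    from transformers import PreTrainedModel, PreTrainedTokenizer


def load_transformers_model(model_name: str, **model_kwargs):
    """Load a Hugging Face model, importing torch/transformers only when called."""
    try:
        import torch  # noqa: F401
        from transformers import AutoModelForCausalLM, AutoTokenizer
    except ImportError as e:
        raise ImportError(
            "The `torch` and `transformers` packages are required for this "
            "integration. Install them with `pip install torch transformers`."
        ) from e

    model = AutoModelForCausalLM.from_pretrained(model_name, **model_kwargs)
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    return model, tokenizer
```

With this pattern, users who only rely on the llama-cpp-python or vllm backends never pay the import cost, and the packages can be declared as optional extras in the project metadata rather than hard requirements.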

rlouf force-pushed the make-torch-transformers-optional branch 2 times, most recently from 8d80682 to ffe4a3c on April 15, 2024 at 07:55
rlouf force-pushed the make-torch-transformers-optional branch from ffe4a3c to 70dbb79 on April 15, 2024 at 09:51
rlouf merged commit 9103d06 into dottxt-ai:main on April 15, 2024; 5 checks passed
rlouf deleted the make-torch-transformers-optional branch on April 15, 2024 at 12:04
rlouf linked an issue on April 15, 2024 that may be closed by this pull request
Successfully merging this pull request may close the following issue: Make torch dependency optional