
feat(refactor): ✨ remove brew command and add proper ollama installation for colab to run in free tier #223

Open. Wants to merge 3 commits into base: main.
Conversation

onuralpszr (Contributor):

What does this PR do?

This PR refactors the Ollama setup in the Colab notebook by removing the Homebrew command and adding an installation method that works on the Google Colab free tier without requiring a terminal/console.

Fixes #123
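For illustration, a minimal sketch of the approach described above (an assumption for context, not the exact cells from the notebook diff reviewed below): Homebrew is not available on Colab's Ubuntu VMs, so the old Homebrew-based cell (something like `!brew install ollama`) is replaced by the official Linux install script, run directly from a notebook cell.

```python
# Sketch, not the exact notebook cell: install Ollama on the Colab VM without a terminal.
# The previous approach assumed Homebrew, which Colab's Ubuntu image does not ship.
import subprocess

subprocess.run(
    "curl -fsSL https://ollama.com/install.sh | sh",  # official Linux install script
    shell=True,
    check=True,
)
```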

Who can review?

@merveenoyan and @stevhliu.



@@ -72,13 +72,74 @@
" llama-index-llms-ollama"
stevhliu (Member) commented via ReviewNB on Oct 29, 2024:

Ollama installation

These dependencies help properly detect the GPU.


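For illustration, a sketch of what such a dependency cell might look like (an assumption; the exact packages are in the diff). `pciutils` and `lshw` provide the `lspci`/`lshw` tools that the Ollama installer relies on to detect an NVIDIA GPU on the Colab VM:

```python
# Sketch (assumption): install the system tools used for GPU detection on the Colab VM,
# which runs as root, so no sudo is needed.
import subprocess

subprocess.run(
    "apt-get update -qq && apt-get install -y -qq pciutils lshw",
    shell=True,
    check=True,
)
```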

stevhliu (Member) commented via ReviewNB on Oct 29, 2024, on the same hunk:

Install Ollama.



stevhliu (Member) commented via ReviewNB on Oct 29, 2024, on the same hunk:

"Run Ollama service in the background."


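A minimal sketch of this step, assuming the service is started from a notebook cell rather than a terminal: launch `ollama serve` as a background process and wait for its HTTP API (default port 11434) to answer.

```python
# Sketch: run the Ollama server in the background from a notebook cell (no terminal needed).
import subprocess
import time
import urllib.request

server = subprocess.Popen(
    ["ollama", "serve"],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)

# Poll the default Ollama endpoint until the server responds (or give up after ~30 s).
for _ in range(30):
    try:
        urllib.request.urlopen("http://127.0.0.1:11434")
        print("Ollama server is up.")
        break
    except OSError:
        time.sleep(1)
```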

stevhliu (Member) commented via ReviewNB on Oct 29, 2024, on the same hunk:

Pull Llama2 from the Ollama library.


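A sketch of the remaining steps, assuming the server from the previous cell is running: pull the Llama 2 weights, then point the `llama-index-llms-ollama` integration (installed in the hunk above) at the local server. The prompt and timeout value are illustrative.

```python
# Sketch: download Llama 2 through the running Ollama server, then query it via LlamaIndex.
import subprocess

subprocess.run(["ollama", "pull", "llama2"], check=True)

from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama2", request_timeout=120.0)
print(llm.complete("In one sentence, what is a RAG pipeline?"))
```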

stevhliu (Member) left a review:

Added a few notes but LGTM overall, thanks!

…ion for colab to run in free tier

Signed-off-by: Onuralp SEZER <[email protected]>
onuralpszr force-pushed the noterm/rag_llamaindex_librarian branch from 239d3d4 to 61cd216 on October 29, 2024 at 15:40.
onuralpszr (Contributor, Author):

@stevhliu thank you, all the changes are done.

merveenoyan (Collaborator) commented on Oct 30, 2024:

The CI times out after 6 hrs; there's no issue regarding this notebook.

onuralpszr (Contributor, Author):

> The CI times out after 6 hrs; there's no issue regarding this notebook.

Yup :)

onuralpszr (Contributor, Author):

Hello, I have been waiting for a while. Is there anything I should do or check?

Thank you.

cc @sergiopaniego

sergiopaniego (Contributor):

> Hello, I have been waiting for a while. Is there anything I should do or check?
>
> Thank you.
>
> cc @sergiopaniego

Thanks for solving the issue I created!! 😄
I think there is nothing else to be done, it'll be merged soon. Let's be patient 😃

stevhliu (Member) commented on Nov 7, 2024:

The frontend team is working on a fix. I'll merge this for you @onuralpszr once it's ready 👍

onuralpszr (Contributor, Author):

@stevhliu I did merge main into my branch, but the CI still looks stuck; I was checking what is causing it. I also noticed a comment about this on another PR, so I wanted to let you know.

stevhliu (Member):

How odd! Can you try getting the latest changes from main again to see if that helps? It seemed to have worked for #238, but I see that it is stalling out again in several other PRs.

onuralpszr (Contributor, Author):

@stevhliu I already did; you can see it in my merge commit. I will do it again, but the latest changes on main are just a new cookbook, so it won't make a big difference anyway.


Successfully merging this pull request may close this issue: Can "Building A RAG Ebook "Librarian" Using LlamaIndex" be run using Google Colab?