v3.0 failed to select model when launched w/out Ollama running #100
This seems to be a first-time launch issue: once there is a cached list of models, Ollamac selects from the cache even while Ollama is closed. On subsequent launches the chat works fine after starting Ollama, without any manual step.
Great update, by the way; this experience is much smoother! 🔥🦙
Confirmed bug @epheterson 😂 but it can be worked around by clicking "Try Again" after opening Ollama.
Thanks, I did click Try Again, though! Maybe I clicked it before launching Ollama and it only showed once? It's a small bug anyway; people can find their way out.
FYI, I hit this issue as a first-time user, and it was very confusing. The app automatically selected the right model, so I never picked one manually, and pressing Enter or the button did nothing at all. At least show a message, or don't pre-select the model. Without this thread I'd have uninstalled it.
I just went through these steps and hit a bug: there seems to be an edge case where Ollamac is launched while Ollama is closed and doesn't self-heal without manually re-selecting a model.