Home
See the Ollama FAQ for the full details on changing the host and port Ollama serves on:
https://github.com/ollama/ollama/blob/main/docs/faq.md
Ollama defaults to serving on port 11434. If you can configure your Ollama server to serve on port 80, the rest of this will not be necessary.
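If you want to sanity-check the connection before touching any Moodle settings, Ollama's root endpoint replies with a short status string. A minimal sketch in Python, where the host name myollama and the default port 11434 are assumptions you should adjust to your own setup:

```python
import urllib.request

# Ollama's root endpoint returns the plain-text string "Ollama is running".
# "myollama" and 11434 are assumptions; use your own host and port.
url = "http://myollama:11434/"
with urllib.request.urlopen(url, timeout=5) as response:
    print(response.read().decode())  # expected: "Ollama is running"
```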
As an admin user, go to
admin/settings.php?section=httpsecurity
Remove this from the cURL blocked hosts list:
192.168.0.0/16
Assuming your Ollama server is not listening on port 80, add its port to the cURL allowed ports list, e.g.
11434
Assuming you have put your Ollama server on a local machine with the DNS name set up as myollama, the endpoint URL will probably look something like
http://myollama:11434/v1/chat/completions
I have found that the mistral model responds well to being asked to return only JSON.
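Putting those two points together, here is a minimal sketch of a request against that endpoint, asking the mistral model to return only JSON. The host name myollama and the exact prompt wording are assumptions; adjust them for your own setup, and it assumes you have already pulled the mistral model:

```python
import json
import urllib.request

# OpenAI-compatible chat completions endpoint exposed by Ollama.
# "myollama" and port 11434 are assumptions from the setup above.
url = "http://myollama:11434/v1/chat/completions"

payload = {
    "model": "mistral",  # assumes the mistral model has been pulled
    "messages": [
        {"role": "system",
         "content": "Respond only with valid JSON. Do not add any other text."},
        {"role": "user",
         "content": "List three Moodle question types as a JSON array of strings."},
    ],
}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:
    body = json.load(response)
    # The reply text sits in the first choice, as in the OpenAI API.
    print(body["choices"][0]["message"]["content"])
```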
An interesting cloud service with open source models is available at https://console.groq.com. Get a key and use this endpoint:
https://api.groq.com/openai/v1/chat/completions
See here for a list of supported models:
https://console.groq.com/docs/models
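Groq's endpoint speaks the same OpenAI-style protocol, so the only changes from the local sketch above are the URL, an Authorization header, and a model name from their list. A minimal sketch, where the model name and the GROQ_API_KEY environment variable are assumptions; check the models page for what is currently offered:

```python
import json
import os
import urllib.request

# Groq's OpenAI-compatible endpoint; authenticate with a Bearer token.
url = "https://api.groq.com/openai/v1/chat/completions"
api_key = os.environ["GROQ_API_KEY"]  # assumes you exported your key

payload = {
    # Model name is an assumption; pick one from the supported-models page.
    "model": "mixtral-8x7b-32768",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
}

request = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    },
)
with urllib.request.urlopen(request) as response:
    print(json.load(response)["choices"][0]["message"]["content"])
```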
See this search for items in the Moodle plugins database:
https://moodle.org/plugins/?q=chatgpt
This is an attempt to create a list; the quality of the plugins may vary widely.
https://github.com/marcusgreen/moodle-qtype_aitext
https://github.com/enovation/moodle-local_ai_connector
https://github.com/marcusgreen/moodle-tool_aiconnect (fork of the Enovation plugin)
https://github.com/yedidiaklein/moodle-local_aiquestions
https://github.com/praxisdigital/assignsubmission_pxaiwriter
https://github.com/praxisdigital/mod_smartlink
https://github.com/developerck/moodle-atto_aic
https://github.com/EduardoKrausME/moodle-local_geniai
https://github.com/michael-milette/moodle-local_aiid