We have two versions of a web-based (client-side) LLM chat front end for an Ollama server (ollama.com). One version was made by claude.ai and the other by me. Both include scripts that fetch the list of available models to choose from.
MY VERSION: My file "index.htm" has a preprompt scheme (similar to a system prompt, but changeable on the fly). It lists the models on the server (via JavaScript calls). All code is embedded in a single HTML file, making it a fully portable all-in-one (AIO) app. It has text-to-speech features (click a message to hear it read aloud). There are several preset styles (from blank to programming) to assist.
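The model listing both versions rely on can be sketched as follows. This is an assumption about the implementation, not the actual code from either file: it uses Ollama's documented GET /api/tags endpoint, which returns `{ "models": [ { "name": ... }, ... ] }`, and Ollama's default base URL `http://localhost:11434` (yours may differ).

```javascript
// Extract model names from an /api/tags response body
// (shape: { models: [ { name: "llama3:latest", ... }, ... ] }).
function modelNames(tagsResponse) {
  return (tagsResponse.models || []).map((m) => m.name);
}

// Fetch the installed models from a running Ollama server.
// The base URL is Ollama's default and is an assumption here.
async function listModels(baseUrl = "http://localhost:11434") {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama returned ${res.status}`);
  return modelNames(await res.json());
}
```

The returned names can then be used to populate a `<select>` element for the model picker.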
CLAUDE VERSION: The file "index.html" is closer to a conventional LLM chat, but it won't recall your previous chats...
On Windows, you must set two environment variables for Ollama (see img) or you will get CORS errors.
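Ollama's documented environment variables for browser/CORS access are OLLAMA_ORIGINS and OLLAMA_HOST; assuming these are the two settings the image shows, they can be set from an elevated Command Prompt (restart Ollama afterwards):

```shell
:: Allow cross-origin requests from any page (CORS).
setx OLLAMA_ORIGINS "*"
:: Bind the server to all interfaces on the default port.
setx OLLAMA_HOST "0.0.0.0"
```

`"*"` is permissive; you can instead list specific origins (e.g. `"http://localhost:8080"`) if you want to restrict which pages may call the API.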