Releases: dezoito/ollama-grid-search
v0.7.0
[Version 0.7.0] - 2024-11-24
Added
- Fully functional prompt management UI
- Prompt inputs can trigger autocomplete by starting with "/"
- Added SQLite integration for prompt management and other developments (see the storage sketch after these notes).
Changed
- Several small UI improvements (mostly using ScrollAreas instead of overflows).
- Improved validation rules on experiment form
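For illustration only, here is a minimal sketch of how SQLite-backed prompt storage and the "/" autocomplete lookup could fit together. It assumes the `rusqlite` crate; the table, columns, and helper names are hypothetical rather than the project's actual schema.

```rust
// Minimal sketch of SQLite-backed prompt storage, assuming the `rusqlite`
// crate. The `prompts` table, its columns, and these helpers are
// illustrative only, not the project's actual schema.
use rusqlite::{Connection, Result};

#[derive(Debug)]
struct Prompt {
    id: i64,
    name: String,
    text: String,
}

fn open_db(path: &str) -> Result<Connection> {
    let conn = Connection::open(path)?;
    // Create the table on first run so the UI can list and save prompts.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS prompts (
            id   INTEGER PRIMARY KEY,
            name TEXT NOT NULL UNIQUE,
            text TEXT NOT NULL
        )",
        (),
    )?;
    Ok(conn)
}

fn save_prompt(conn: &Connection, name: &str, text: &str) -> Result<()> {
    conn.execute(
        "INSERT INTO prompts (name, text) VALUES (?1, ?2)
         ON CONFLICT(name) DO UPDATE SET text = excluded.text",
        (name, text),
    )?;
    Ok(())
}

// Returns prompts whose name starts with the fragment typed after "/",
// which is what an autocomplete dropdown would query as the user types.
fn autocomplete(conn: &Connection, fragment: &str) -> Result<Vec<Prompt>> {
    let mut stmt = conn.prepare(
        "SELECT id, name, text FROM prompts WHERE name LIKE ?1 || '%' ORDER BY name",
    )?;
    let rows = stmt.query_map([fragment], |row| {
        Ok(Prompt {
            id: row.get(0)?,
            name: row.get(1)?,
            text: row.get(2)?,
        })
    })?;
    rows.collect()
}

fn main() -> Result<()> {
    let conn = open_db("prompts.db")?;
    save_prompt(&conn, "summarize", "Summarize the following text:")?;
    for p in autocomplete(&conn, "sum")? {
        println!("[{}] /{} -> {}", p.id, p.name, p.text);
    }
    Ok(())
}
```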
v0.7.0-rc2
See the assets to download this version and install.
v0.7.0-rc1
See the assets to download this version and install.
v0.6.2
[Version 0.6.2] - 2024-10-29
Fixed
- The "refetch" button must be shown when there was an error in the inference call.
v0.6.1
[Version 0.6.1] - 2024-10-28
Changed
- When removing all experiment logs, only JSON files are deleted.
- Added colors to prompt and system_prompt when displaying inference params in results.
- Border colors are used on the side of a result to group outputs from the same model.
v0.6.0
[Version 0.6.0] - 2024-10-20
Added
- Added UI controls to re-run past experiments.
- Added controls to remove experiment files from the UI.
- Added button to copy an inference text to the clipboard.
Changed
- Moved "reload" icon to improve layout.
- Improved experiment inspection UI readability.
- Streamlined State management.
Fixed
- Fixed HMR not working on macOS (in development, of course).
v0.5.3
[Version 0.5.3] - 2024-09-16
Fixed
- Handles Ollama servers using default ports (80 or 443).
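As a rough illustration of the idea behind this fix (not necessarily the project's implementation), a URL without an explicit port can be resolved to the scheme's default, assuming the `url` crate:

```rust
// Sketch of one way to resolve a server's port when the URL omits it,
// assuming the `url` crate (this may not match the project's implementation).
use url::Url;

fn server_port(address: &str) -> Option<u16> {
    let parsed = Url::parse(address).ok()?;
    // Explicit port if one was given, otherwise the scheme's well-known
    // default: 80 for http, 443 for https.
    parsed.port_or_known_default()
}

fn main() {
    assert_eq!(server_port("http://localhost:11434"), Some(11434));
    assert_eq!(server_port("http://ollama.example.com"), Some(80));
    assert_eq!(server_port("https://ollama.example.com"), Some(443));
}
```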
v0.5.2
[Version 0.5.2] - 2024-09-15
Added
- Added a custom application icon.
Fixed
- Handles Ollama version info not being correctly returned by the server.
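A hedged sketch of this kind of defensive handling, assuming the async `reqwest` client and Ollama's `GET /api/version` endpoint; the fallback strings are illustrative:

```rust
// Sketch of defensive handling of the version response, assuming the async
// `reqwest` client (with the `json` feature), `serde_json`, and `tokio`.
// The fallback strings are illustrative.
use serde_json::Value;

async fn ollama_version(base_url: &str) -> String {
    let url = format!("{}/api/version", base_url.trim_end_matches('/'));
    match reqwest::get(&url).await {
        Ok(resp) => resp
            .json::<Value>()
            .await
            .ok()
            .and_then(|v| v.get("version").and_then(|s| s.as_str()).map(str::to_owned))
            // Reachable server, but the payload is missing or malformed.
            .unwrap_or_else(|| "unknown".to_string()),
        // Server not reachable at all.
        Err(_) => "unavailable".to_string(),
    }
}

#[tokio::main]
async fn main() {
    println!("Ollama version: {}", ollama_version("http://localhost:11434").await);
}
```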
v0.5.1
[Version 0.5.1] - 2024-07-10
Added
- Added Clippy checks when saving Rust code.
- Corrected existing Rust code to pass Clippy checks.
- Improved the UI of the component that displays inference parameters, with collapsible prompts.
Changed
- Fixed generation responses not returning metadata (like `eval_duration`, `total_duration`, `eval_count`).
- Added Rust CI checks.
- Fixed padding in "Expand/Hide" buttons for params and metadata.
- The `keep_alive` parameter for generation is set to Ollama's default (instead of indefinitely).
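For context, a simplified sketch of what "using Ollama's default" can look like on the request side: when `keep_alive` is omitted from the `/api/generate` payload, the server applies its own default. The struct below is illustrative, not the project's actual request type:

```rust
// Illustrative request payload (not the project's actual type): leaving
// `keep_alive` out of the serialized JSON lets the Ollama server apply its
// own default keep-alive, instead of the client forcing an indefinite value.
// Assumes `serde` with the `derive` feature and `serde_json`.
use serde::Serialize;

#[derive(Serialize)]
struct GenerateRequest {
    model: String,
    prompt: String,
    stream: bool,
    // Omitted from the JSON when None, so the server default is used.
    #[serde(skip_serializing_if = "Option::is_none")]
    keep_alive: Option<String>,
}

fn main() {
    let req = GenerateRequest {
        model: "llama3".into(),
        prompt: "Why is the sky blue?".into(),
        stream: false,
        keep_alive: None,
    };
    // The printed JSON contains no "keep_alive" key at all.
    println!("{}", serde_json::to_string_pretty(&req).unwrap());
}
```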
v0.5.0
[Version 0.5.0] - 2024-05-10
Added
- Allows multiple prompts when running experiments (courtesy of @calebsheridan).
- Allows multiple concurrent inference calls, matching Ollama's support for concurrency.
- Allows user to hide model names on the results pane, to avoid bias in evaluations.
- Added `concurrent_inferences` input to settings.
- Added `hide_model_names` input to settings.
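As a rough sketch of how concurrent inference calls can be bounded by a setting like `concurrent_inferences` (assuming the `futures` and `tokio` crates; `run_inference` stands in for the real request logic):

```rust
// Rough sketch of bounded concurrency for inference calls, assuming the
// `futures` and `tokio` crates. `run_inference` is a stand-in for the real
// request logic; `concurrent_inferences` mirrors the new settings input.
use futures::stream::{self, StreamExt};

async fn run_inference(prompt: String) -> String {
    // Placeholder for the actual call to the Ollama API.
    format!("result for: {prompt}")
}

async fn run_all(prompts: Vec<String>, concurrent_inferences: usize) -> Vec<String> {
    stream::iter(prompts)
        .map(run_inference)
        // At most `concurrent_inferences` calls are in flight at once;
        // results are collected in completion order, not submission order.
        .buffer_unordered(concurrent_inferences)
        .collect()
        .await
}

#[tokio::main]
async fn main() {
    let prompts = vec!["a".into(), "b".into(), "c".into()];
    println!("{:?}", run_all(prompts, 2).await);
}
```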
Changed
- Experiment inspection view was updated to show which prompt was used for each iteration, preserving their line breaks (courtesy of @calebsheridan).
- Minor UI tweaks.