Releases · Blarc/ai-commits-intellij-plugin
v2.9.0
Added
- More options for configuring LLM clients.
- Use the chosen LLM client icon as the generate commit message action's icon.
- Option to stop the commit message generation by clicking the action icon again.
- Setting for HuggingFace client to automatically remove prompt from the generated commit message.
- Show progress and result when refreshing models via API.
- Support for Mistral AI.
Fixed
- The progress bar for generating the commit message keeps running after the user creates the commit.
v2.8.0
Added
- Support streaming mode for Gemini Google.
- Support GitHub models client.
- Theme-based icons for better visibility.
Fixed
- Project-specific locale is not used when creating the prompt.
- Properties `topP` and `topK` are not used when verifying Gemini Google client configuration.
v2.7.1
Added
- Option to set top K and top P in Gemini Google client settings.
Fixed
- Unable to submit a request to Gemini Google because it has a `topK` value of 64, but the supported range is from 1 (inclusive) to 41 (exclusive).
v2.7.0
Added
- Support for Gemini Google.
- Save the size of the dialog for adding prompts if the user resizes it.
Changed
- Rename Gemini to Gemini Vertex.
- Use the correct icon for Gemini Vertex.
Fixed
- Project-specific prompt is not saved properly.
v2.6.0
Added
- Support streaming response.
- Support for Hugging Face.
v2.5.0
Added
- Support for Azure OpenAI.
- Sort LLM client configurations by provider name and configuration name.
Changed
- Update default prompt for generating commit messages with GitMoji.
Fixed
- OpenAI configuration setting `organizationId` is not used when verifying configuration.
- Gemini configuration settings `projectId` and `location` are not used when verifying configuration.
- Notification about common branch is shown after the prompt dialog is closed.
- Invalid caret position for prompt preview.
v2.4.1
Fixed
- Setting an LLM client configuration or prompt as project-specific does not work.
v2.4.0
Added
- Option to choose prompt per project.
- Amending commits now adds the changes from the previous commit to the prompt.
Fixed
- Prompt does not contain diff for new files.
v2.3.1
Fixed
- NPE when retrieving TaskManager for prompt construction.
v2.3.0
Added
- Variables `{taskId}`, `{taskSummary}` and `{taskDescription}` for prompt customization, which are replaced with values from the active task (see the example below).
- Option to configure LLM client configurations per project.
Changed
- Rethrow generic exceptions when generating commit messages.
- Replace `executeOnPooledThread` with coroutines and `ModalityState`.
Fixed
- NPE when verifying LLM client configuration.