le-gpt.el

le-gpt.el is a comprehensive Emacs package for interacting with large language models like GPT-4 and Claude 3.5 Sonnet. It's a feature-rich fork of gpt.el that adds project awareness, completion at point, region transformation, and more to come.

The aim is to keep Emacs up to date with modern GPT support, essentially providing a CursorAI-like experience for Emacs.

Features

  • Chat Interface: Create and manage multiple chat sessions with GPT. Use M-x le-gpt-chat to start a session. Key bindings in chat buffers include:

    • C-c C-c: Send follow-up command
    • C-c C-p: Toggle prefix visibility
    • C-c C-b: Copy code block at point
    • C-c C-t: Generate a descriptive buffer name from the buffer's content
    • C-c C-s: Save the current buffer
  • Chat Buffer List: Display a list of all GPT chat buffers with M-x le-gpt-list-buffers, letting you manage and navigate your GPT-related buffers efficiently. Key bindings in the buffer list include:

    • d: Mark a buffer for deletion
    • x: Execute the marked deletions
    • u: Unmark a buffer
    • RET: Visit the buffer at point
    • g r: Refresh the list
    • C-c C-t: Generate a buffer name with GPT (works here as well)

  • Save & load chats: Save a chat when visiting a buffer (or in the buffer list) with C-c C-s (or M-x le-gpt-chat-save-buffer and M-x le-gpt-buffer-list-save-buffer, respectively). You can load previously saved chats with M-x le-gpt-chat-load-file.

  • Completion at Point: Let GPT complete what you're currently writing. Use M-x le-gpt-complete-at-point to get suggestions based on your current cursor position. Consider binding this to a convenient key; I use C-M-n (see the sketch after this list).

  • Region Transformation: Select a region you want GPT to transform. Use M-x le-gpt-transform-region to transform the selected region using GPT. Again, I use C-M-t as a shortcut.

  • Project Context: Select files from your project that GPT should use as context. Globally select project files to be used as context via M-x le-gpt-select-project-files or select local, per-command context by running the above commands with a prefix argument (C-u). Context is used by chat, completion, and region transforms. To deselect global context files, use M-x le-gpt-deselect-project-files or M-x le-gpt-clear-selected-context-files to clear the entire selection.
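
For example, a minimal binding sketch for the completion and transformation commands (C-M-n and C-M-t are just the keys mentioned above; adjust to taste):

;; bind the two editing commands to convenient keys
(global-set-key (kbd "C-M-n") #'le-gpt-complete-at-point)
(global-set-key (kbd "C-M-t") #'le-gpt-transform-region)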

Mandatory GIFs

[Demo GIFs: Chat Interface, Completion at Point, Project Context, Region Transformation]

...and a screenshot of a small buffer list, for completeness: [le-gpt-buffer-list]

Installation

Prerequisites

You'll need Python packages for the API clients:

pip install openai anthropic jsonlines

You don't need to install all of them; at a minimum, install openai or anthropic.

You'll also need API keys from OpenAI and/or Anthropic.

Finally, you'll need markdown-mode to display chat conversations nicely.
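
If markdown-mode isn't already part of your setup, a minimal sketch to pull it in (assuming you use package.el with use-package):

(use-package markdown-mode
  :ensure t)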

Using MELPA

le-gpt is available via MELPA.

Here's an example setup using use-package (e.g., with straight as your package manager):

(use-package le-gpt
  :bind (("M-C-g" . le-gpt-chat)
         ("M-C-n" . le-gpt-complete-at-point)
         ("M-C-t" . le-gpt-transform-region)
         ("M-C-s" . le-gpt-select-project-files)
         ("M-C-d" . le-gpt-deselect-project-files))
  :config
  ;; you need to set at least one of the following
  (setq le-gpt-openai-key "your-openai-key-here")
  (setq le-gpt-anthropic-key "your-anthropic-key-here"))
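
If you don't use use-package, an equivalent plain-Emacs sketch looks like this (same variables and commands; the binding is just an example):

(require 'le-gpt)
;; you need to set at least one of the following
(setq le-gpt-openai-key "your-openai-key-here")
(setq le-gpt-anthropic-key "your-anthropic-key-here")
;; bind the entry points you use most
(global-set-key (kbd "C-M-g") #'le-gpt-chat)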

If you're using evil, you'll want to add

(with-eval-after-load 'evil
  (evil-define-key 'normal le-gpt-buffer-list-mode-map
    (kbd "RET") #'le-gpt-buffer-list-open-buffer
    (kbd "d") #'le-gpt-buffer-list-mark-delete
    (kbd "u") #'le-gpt-buffer-list-unmark
    (kbd "x") #'le-gpt-buffer-list-execute
    (kbd "gr") #'le-gpt-buffer-list-refresh
    (kbd "q") #'quit-window))

to get the above-mentioned buffer list commands to work.

Configuration

See all available customizations via M-x customize-group RET le-gpt.

Basic configuration:

;; API Keys
(setq le-gpt-openai-key "sk-...")
(setq le-gpt-anthropic-key "sk-ant-...")

;; Model Parameters (optional)
(setq le-gpt-model "gpt-4o")
(setq le-gpt-max-tokens 2000)
(setq le-gpt-temperature 0)

;; API Selection (default is 'openai)
(setq le-gpt-api-type 'anthropic)
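
If you'd rather not hardcode keys in your init file, here's a sketch that pulls the OpenAI key from auth-source (assuming it's stored in ~/.authinfo.gpg under the host api.openai.com):

(require 'auth-source)
;; look up the key at startup instead of keeping it in init.el
(setq le-gpt-openai-key
      (auth-source-pick-first-password :host "api.openai.com"))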

Usage

Chat Interface

Start a chat session:

M-x le-gpt-chat

If you provide a prefix argument, you can select context files for a single query.

Completion at Point

Get completions based on your current cursor position:

M-x le-gpt-complete-at-point

Project Context

Set project files as context:

M-x le-gpt-select-project-files

The context will be used by chat, completion, and region transforms.

Note that this selection persists across multiple calls.

To deselect files:

M-x le-gpt-deselect-project-files

Or, to clear the entire selection:

M-x le-gpt-clear-selected-context-files

Region Transformation

Transform the selected region via:

M-x le-gpt-transform-region

Buffer List

Display a list of all GPT buffers:

M-x le-gpt-list-buffers

Contributing

Contributions are welcome! Please feel free to submit issues and pull requests on GitHub.

Feature roadmap

  • More models, e.g., groq (waiting for aisuite to support streaming)
  • Ability to generate images (?)
  • Add all files of the current project as context (?)
  • Ability to let GPT decide which context files it needs
  • RAG for indexing files (?)

License

le-gpt.el is licensed under the MIT License. See LICENSE for details.