le-gpt.el is a comprehensive Emacs package for interacting with large language models from OpenAI, Anthropic, and DeepSeek. It is a feature-rich fork of gpt.el that adds project awareness, completion at point, region transformation, and more to come.
The aim is to keep Emacs up to date with modern GPT support, essentially working toward a CursorAI for Emacs.
- 0.8.0: Add support for interrupting the GPT process (`le-gpt-interrupt`).
- 0.7.0: Update the model list to include the GPT-5 family and make it configurable.
- 0.6.0: Add support for filtering the buffer list via regex on content. Add optional `le-gpt-consult-buffers` function.
- 0.5.0: Add buffers as context; remove global context file support.
- 0.4.0: Add DeepSeek support.
- Chat Interface: Create and manage multiple chat sessions with GPT. Use `M-x le-gpt-chat` to start a session. See usage for more details.
- Chat Buffer List: Display a list of all GPT chat buffers with `M-x le-gpt-list-buffers`. This feature allows you to manage and navigate through your chat buffers efficiently. See usage for more details.
- Completion at Point: Let GPT complete what you're currently writing. Use `M-x le-gpt-complete-at-point` to get suggestions based on your current cursor position. See usage for more details.
- Region Transformation: Select a region you want GPT to transform. Use `M-x le-gpt-transform-region` to transform the selected region using GPT. See usage for more details.
- Context: Select files from your project and buffers that GPT should use as context. You can select per-command context by running the above commands with a prefix argument (`C-u`). Context is used by chat, completion, and region transforms. See usage for more details.
[Screenshots: chat interface, completion at point, context selection, region transformation, and the buffer list.]
You'll need Python packages for the API clients:

```shell
pip install openai anthropic jsonlines
```

You don't need to install all of them; minimally, install `openai` or `anthropic`. For DeepSeek, you'll need `openai`.
You'll also need API keys from OpenAI and/or Anthropic and/or DeepSeek.
Finally, you'll need markdown-mode for displaying the chat conversations nicely.
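If you manage packages with use-package, markdown-mode can be pulled in alongside le-gpt, for example:

```emacs-lisp
;; ensure markdown-mode is installed for rendering chat buffers
(use-package markdown-mode
  :ensure t)
```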
le-gpt is available via MELPA.
Here's how to install it with straight:
```emacs-lisp
(use-package le-gpt
  :after evil
  ;; suggested keybindings
  :bind (("M-C-g" . le-gpt-chat)
         ("M-C-n" . le-gpt-complete-at-point)
         ("M-C-t" . le-gpt-transform-region)
         ("M-C-k" . le-gpt-interrupt)
         ;; if you use consult
         ("C-c C-s" . le-gpt-consult-buffers))
  :config
  ;; set default values as you wish (and switch with `le-gpt-switch-model`)
  (setq le-gpt-api-type 'anthropic)
  (setq le-gpt-model "claude-sonnet-4-20250514")
  (setq le-gpt-max-tokens 10000)
  (setq le-gpt-openai-key "xxx")
  (setq le-gpt-anthropic-key "xxx")
  (setq le-gpt-deepseek-key "xxx"))
```

If you're using evil, you'll want to add
```emacs-lisp
(with-eval-after-load 'evil
  (evil-define-key 'normal le-gpt-buffer-list-mode-map
    (kbd "RET") #'le-gpt-buffer-list-open-buffer
    (kbd "d") #'le-gpt-buffer-list-mark-delete
    (kbd "u") #'le-gpt-buffer-list-unmark
    (kbd "x") #'le-gpt-buffer-list-execute
    (kbd "gr") #'le-gpt-buffer-list-refresh
    (kbd "/") #'le-gpt-buffer-list-filter
    (kbd "C-c C-s") #'le-gpt-consult-buffers
    (kbd "q") #'quit-window))
```

to get the above-mentioned buffer list commands to work.
See all available customizations via `M-x customize-group RET le-gpt`.
Basic configuration:

```emacs-lisp
;; API Keys
(setq le-gpt-openai-key "sk-...")
(setq le-gpt-anthropic-key "sk-ant-...")
(setq le-gpt-deepseek-key "sk-...")

;; Model Parameters (optional)
(setq le-gpt-api-type 'anthropic)
(setq le-gpt-model "claude-sonnet-4-20250514") ;; make sure this matches le-gpt-api-type
(setq le-gpt-max-tokens 10000)
(setq le-gpt-temperature 0)
```

You can add context for all of the below functionality by calling the functions with a prefix argument (`C-u`).
You'll then be prompted to add project files and buffers as context.
For convenience, you also have the option to use a previous context selection.
Start a chat session:
`M-x le-gpt-chat`

Key bindings in chat buffers include:

- `C-c C-c`: Send follow-up command
- `C-c C-p`: Toggle prefix visibility
- `C-c C-b`: Copy code block at point
- `C-c C-t`: Generate descriptive buffer name from its content
- `C-c C-s`: Save the current buffer
- `C-c C-k`: Interrupt the current process
Display a list of all GPT buffers (created via le-gpt-chat):
`M-x le-gpt-list-buffers`

You can narrow down the list by pressing `/` followed by a regex matched against buffer content.
To reset the filter, hit `C-c C-r`.
Mark buffers you want to delete with `d`. Execute those deletions with `x`. Unmark with `u`.
You can visit a buffer in the list by hitting `RET` and refresh the list with `gr`.
Generating buffer names with GPT via `C-c C-t` also works in the buffer list.
Save a chat when visiting a buffer (or in the buffer list) with `C-c C-s` (or `M-x le-gpt-chat-save-buffer` and `M-x le-gpt-buffer-list-save-buffer`, respectively).
You can load previously saved chats with `M-x le-gpt-chat-load-file`.
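For quicker access, you could bind the load command to a key of your choosing; the binding below is only a suggestion:

```emacs-lisp
;; suggested binding; pick any key you prefer
(global-set-key (kbd "C-c g l") #'le-gpt-chat-load-file)
```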
Get completions based on your current cursor position:

`M-x le-gpt-complete-at-point`

Transform a selection via:

`M-x le-gpt-transform-region`

As mentioned above, you can set the model and provider like so:

```emacs-lisp
(setq le-gpt-api-type 'anthropic)
(setq le-gpt-model "claude-sonnet-4-20250514")
```

For convenience, you can also cycle through different models via `le-gpt-switch-model`.
You can update the underlying model list by customizing `le-gpt-model-list`.
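As a sketch, assuming `le-gpt-model-list` holds plain model-name strings (check its default value via `M-x customize-variable RET le-gpt-model-list` first), you could extend it like this:

```emacs-lisp
;; hypothetical entry; verify the list's actual element format before adding
(add-to-list 'le-gpt-model-list "some-new-model")
```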
For longer-running responses that are going in the wrong direction, you may want to interrupt them.
You can do that in a chat using `C-c C-k` and otherwise with `M-x le-gpt-interrupt`.
Contributions are welcome! Please feel free to submit issues and pull requests on GitHub.
le-gpt.el is licensed under the MIT License. See LICENSE for details.