Talk to LLMs from within Org mode by using babel code blocks.
NOTE: ob-llm requires the llm CLI tool
ob-llm interfaces with LLMs by wrapping llm, a command line tool created by Simon Willison. Setting up llm, configuring it, and understanding its interface are prerequisites; please visit the documentation for llm here. Familiarity with GNU Emacs, Org mode, and Babel is also assumed.
Using llm from the command line:

#+begin_src shell
llm -m 4o "Explain 'a person needs an LLM like a pelican needs a bicycle'"
#+end_src

Using ob-llm from inside an Org mode buffer:
#+begin_src llm :m 4o
Explain 'a person needs an LLM like a pelican needs a bicycle'
#+end_src
Execute the llm Babel code block by calling ~org-ctrl-c-ctrl-c~, bound to ~C-c C-c~.
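When the process finishes, the response appears under the block as a ~#+RESULTS:~ section (output illustrative):

#+RESULTS:
A pelican has no use for a bicycle; the phrase suggests an LLM may be similarly superfluous for a person.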
Since this package wraps the llm CLI tool, its feature set is largely llm's feature set.
Features provided by llm (found here):
- model selection (including local models) and configuration
- local sqlite storage, easily navigable by datasette, visidata, etc.
- use arbitrary databases (ex. different DB per work project), or “no-log”
- search history, continue any conversation
- streaming, system prompts, fragments (including web, URLs), attachments (multi-modal)
- plugins, schemas, templates
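Because header arguments become ~llm~ flags, these features are available directly from a block. For example, a sketch combining a model, a system prompt, and a fragment (the ~:f~ header mirrors ~llm~'s ~-f~ fragment flag; the URL is a placeholder, and fragment support depends on your ~llm~ version):

#+begin_src llm :m 4o :s "two sentences, plain prose" :f https://example.com/notes.txt
Summarize this fragment.
#+end_src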
Features provided by ob-llm:
- interface with ~llm~ from inside Org mode buffers, passing Babel code block header arguments as flags
- supports async, streaming, and concurrent processes across multiple buffers
- finished responses are automatically converted to prettified JSON or Org mode:
  - ~schema~ and ~schema-multi~ responses are pretty-printed as JSON
  - other, general responses are converted from Markdown to Org mode
  - NOTE: JSON pretty-printing requires ~jq~; Org mode conversion requires Pandoc
  - NOTE: disable auto-conversion per prompt with a ~:no-conversion~ Babel code block header argument, or disable it globally with ~(setq ob-llm-post-process-auto-convert-p nil)~
- refresh ~llm~-ready models with ~ob-llm-refresh-models~
- model helpers: copy (kill), paste (yank), see/set default model
- interactively search the ~llm~ logs database with ~ob-llm-query-logs~
- mode-line indicator for when ob-llm is in use
- immediate process termination with ~ob-llm-kill-all-processes~
- Enable with ~(ob-llm-mode 1)~
- Write a code block:
  #+begin_src llm
  <prompt as body>
  #+end_src
- Execute with ~C-c C-c~
The header arguments (docs, reference card) for the Babel code blocks are a bit overloaded in that they might be used (1) to control Org mode Babel code block evaluation behavior, (2) to control ob-llm application behavior/settings, or otherwise (3) to be passed along as flags to the llm command.
- Org babel parameters are consumed by Org mode, and thus not passed to the ~llm~ command: ~:results~, ~:exports~, ~:cache~, ~:noweb~, ~:session~, ~:tangle~, ~:hlines~, etc. ~:results silent~ will emit to the output buffer, but the results will not be streamed or copied into the Org buffer (or post-processed).
- Custom ob-llm parameters are used for special handling, e.g. ~:no-conversion~ to leave the completed response unaltered.
- All other parameters are passed directly to the ~llm~ command as flags.
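All three kinds can appear together in one block. A sketch: ~:results silent~ is consumed by Org mode, ~:no-conversion~ by ob-llm, and ~:m~/~:s~ are forwarded to ~llm~ as ~-m~/~-s~:

#+begin_src llm :results silent :no-conversion :m 4o :s "terse"
Name three Baroque composers.
#+end_src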
| ~ob-llm-yank-a-model-name~    | Select a model from ~ob-llm-models~ to paste              |
| ~ob-llm-kill-a-model-name~    | Select a model from ~ob-llm-models~ to kill               |
| ~ob-llm-echo-default-model~   | Show the current default model in the echo area           |
| ~ob-llm-change-default-model~ | Pick which model from ~ob-llm-models~ to set as default   |
| ~ob-llm-refresh-models~       | Update ~ob-llm-models~ with the models available to ~llm~ |
Search the llm logs database with ~ob-llm-query-logs~. Selecting a result copies (kills) its conversation ID, which can then be used to resume that conversation from any code block.
#+begin_src llm :system "single word response"
what is the longest word in the English language that is all vowels?
#+end_src
#+RESULTS:
Euouae
#+begin_src llm :s "single word response"
who is halfway between Rameau and Bach?
#+end_src
#+RESULTS:
Handel
Use ~:continue~ to keep going with the most recent conversation.
#+begin_src llm :continue :s "terse"
who would come before Rameau in this same line
#+end_src
#+RESULTS:
Couperin
Now call ~ob-llm-query-logs~, search for "vowels", then select the original
conversation. Pass its ID with the header argument ~:cid~ or ~:conversation~.
#+begin_src llm :cid 01jwrgrwaj73adxm7prx46gpxj :s "terse"
definition?
#+end_src
#+RESULTS:
A medieval musical notation representing the vowel sounds of "seculorum Amen"
sung at the end of psalms.
| ~ob-llm-line-indicator~                              | What to show in the mode line while a process is active. Default is "🦆"                                   |
| ~ob-llm-post-process-auto-convert-p~                 | Whether to convert completed responses to prettified JSON (schema) or Org mode (regular). Default is ~t~  |
| ~ob-llm-models~                                      | Models available to yank, kill, and set as default. Update this by calling ~ob-llm-refresh-models~        |
| ~ob-llm-pandoc-additional-org-mode-conversion-flags~ | Additional flags to pass to Pandoc when converting general responses to Org mode                          |
When a general response is finished (as opposed to a code block with header arguments of ~:schema~ or ~:schema-multi~ (docs)), it is automatically converted to Org mode using Pandoc. This can be turned off for a single code block with a ~:no-conversion~ header argument, or globally by setting ~ob-llm-post-process-auto-convert-p~ to ~nil~.
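For instance (a sketch), this block's completed response would be left as raw Markdown instead of being converted to Org:

#+begin_src llm :no-conversion
Produce a short Markdown table of three French composers and their birth years.
#+end_src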
Additional flags can be passed to Pandoc during the "convert the completed Markdown response to Org mode" step via ~ob-llm-pandoc-additional-org-mode-conversion-flags~. For instance, to exclude property drawers from Org headings:
#+begin_src emacs-lisp
(setq ob-llm-pandoc-additional-org-mode-conversion-flags
      '("--lua-filter=/Users/myuser/.local/share/remove-header-attr.lua"))
#+end_src

And then the remove-header-attr.lua file at that location contains:
#+begin_src lua
function Header (header)
  return pandoc.Header(header.level, header.content, pandoc.Attr())
end
#+end_src