ob-llm

Talk to LLMs from within Org mode by using babel code blocks.

NOTE: ob-llm requires the llm CLI tool

ob-llm interfaces with LLMs by wrapping llm, a command line tool created by Simon Willison. Installing llm, configuring it, and understanding its interface are required; please see the llm documentation here. Familiarity with GNU Emacs, Org mode, and Babel is also assumed.

Example

Using llm from the command line:

llm -m 4o "Explain 'a person needs an LLM like a pelican needs a bicycle'"

Using ob-llm from inside an Org mode buffer:

#+begin_src llm :m 4o
Explain 'a person needs an LLM like a pelican needs a bicycle'
#+end_src

Execute the llm Babel code block by calling org-ctrl-c-ctrl-c with the binding C-c C-c.

Features

Since this package is a wrapper of the llm CLI tool, the feature set is mostly llm’s feature set.

Features provided by llm are described in its documentation (found here).

Features provided by ob-llm:

  • interface with llm from inside Org mode buffers, passing Babel code block header arguments as flags
  • supports async, streaming, and concurrent processes across multiple buffers
  • finished responses are automatically converted to prettified JSON or Org mode:
    • schema and schema-multi responses are pretty-printed as JSON
    • other, general responses are converted from markdown to Org mode
    • NOTE: JSON pretty-printing requires jq; Org mode conversion requires Pandoc
    • NOTE: disable auto-conversion per prompt with the :no-conversion header argument, or globally via (setq ob-llm-post-process-auto-convert-p nil); see the example after this list
  • refresh llm-ready models with ob-llm-refresh-models
  • model helpers: copy (kill), paste (yank), see/set default model
  • interactively search the llm logs database with ob-llm-query-logs
  • mode-line indicator for when ob-llm is in use
  • immediate process termination with ob-llm-kill-all-processes
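
For example, to keep a single response exactly as llm returned it (a sketch; the model and prompt text are illustrative), add :no-conversion to the block:

#+begin_src llm :m 4o :no-conversion
Return a short markdown checklist for publishing an Emacs package
#+end_src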

General Usage

  1. Enable with (ob-llm-mode 1)
  2. Write a code block: #+begin_src llm...<prompt as body>...#+end_src
  3. Execute with C-c C-c
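
A minimal init sketch, assuming the package is installed (the require form is an assumption and may be unnecessary if ob-llm is autoloaded):

(require 'ob-llm)  ;; assumption: load the package if it is not already autoloaded
(ob-llm-mode 1)    ;; enable ob-llm, as in step 1 above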

Code Block Header Arguments (“Params”) Handling

The header arguments (docs, reference card) for llm Babel code blocks are a bit overloaded: they may be used (1) to control Org mode Babel evaluation behavior, (2) to control ob-llm application behavior/settings, or (3) to be passed along as flags to the llm command.

  1. Org Babel parameters are consumed by Org mode, and thus not passed to the llm command: :results, :exports, :cache, :noweb, :session, :tangle, :hlines, etc.
    • :results silent will emit to the output buffer, but the results will not be streamed or copied into the Org buffer (or post-processed).
  2. Custom ob-llm parameters are used for special handling, e.g. :no-conversion to leave the completed response unaltered.
  3. All other parameters are passed directly to the llm command as flags; see the example after this list.
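
A sketch mixing all three categories in one block (the model, system prompt, and prompt text are illustrative): :results replace is consumed by Org mode, :no-conversion is handled by ob-llm, and :m and :s are passed through to llm as flags.

#+begin_src llm :results replace :no-conversion :m 4o :s "terse"
What is the difference between a kill and a yank in Emacs?
#+end_src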

Model Management

  • ob-llm-yank-a-model-name: Select a model from ob-llm-models to paste (yank)
  • ob-llm-kill-a-model-name: Select a model from ob-llm-models to kill
  • ob-llm-echo-default-model: Show the current default model in the echo area
  • ob-llm-change-default-model: Pick which model from ob-llm-models to set as default
  • ob-llm-refresh-models: Update ob-llm-models with the models available to llm
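
For instance, after installing a new llm model plugin on the command line, refresh the cached model list and pick a new default (both commands are interactive):

M-x ob-llm-refresh-models
M-x ob-llm-change-default-model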

Conversation History

Search the llm logs database with ob-llm-query-logs. Selecting an entry copies (kills) its conversation ID, which can then be used to resume that conversation from any code block.

#+begin_src llm :system "single word response"
what is the longest word in the English language that is all vowels?
#+end_src

#+RESULTS:
Euouae


#+begin_src llm :s "single word response"
who is halfway between Rameau and Bach?
#+end_src

#+RESULTS:
Handel

Use ~:continue~ to keep going with the most recent conversation.

#+begin_src llm :continue :s "terse"
who would come before Rameau in this same line
#+end_src

#+RESULTS:
Couperin


Now call ~ob-llm-query-logs~, search for "vowels", then select the original
conversation. Pass its ID with the header argument ~:cid~ or ~:conversation~.

#+begin_src llm :cid 01jwrgrwaj73adxm7prx46gpxj :s "terse"
definition?
#+end_src

#+RESULTS:
A medieval musical notation representing the vowel sounds of "seculorum Amen"
sung at the end of psalms.

Customization

  • ob-llm-line-indicator: What to show in the mode line while a process is active. Default is “🦆”
  • ob-llm-post-process-auto-convert-p: Whether to convert completed responses to prettified JSON (schema) or Org mode (regular). Default is t
  • ob-llm-models: Models available to yank, kill, and set as default. Update this by calling ob-llm-refresh-models
  • ob-llm-pandoc-additional-org-mode-conversion-flags: Additional flags to pass to Pandoc when converting general responses to Org mode
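
A configuration sketch (the first value is the documented default; the second disables auto-conversion):

(setq ob-llm-line-indicator "🦆")              ;; shown in the mode line while a process is active
(setq ob-llm-post-process-auto-convert-p nil)  ;; skip the automatic JSON/Org conversion step
;; ob-llm-models is normally refreshed with ob-llm-refresh-models rather than set by hand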

Converting Markdown to Org Mode with Pandoc

When a general response is finished (as opposed to one from a code block with a :schema or :schema-multi header argument (docs)), it is automatically converted to Org mode using Pandoc. This can be turned off for a single code block with the :no-conversion header argument, or globally by setting ob-llm-post-process-auto-convert-p to nil.

Additional flags can be passed to Pandoc during the “convert the completed markdown response to Org mode” step using ob-llm-pandoc-additional-org-mode-conversion-flags. For instance, to exclude the property drawers from Org headings:

(setq ob-llm-pandoc-additional-org-mode-conversion-flags
      '("--lua-filter=/Users/myuser/.local/share/remove-header-attr.lua"))

The remove-header-attr.lua file at that location then contains:

function Header (header)
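  -- return a copy of the heading with empty attributes, so the Org writer emits no property drawer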
  return pandoc.Header(header.level, header.content, pandoc.Attr())
end
