Module: LLM
- Defined in:
- lib/llm.rb,
lib/llm/bot.rb,
lib/llm/agent.rb,
lib/llm/error.rb,
lib/llm/buffer.rb,
lib/llm/client.rb,
lib/llm/tracer.rb,
lib/llm/message.rb,
lib/llm/version.rb,
lib/llm/contract.rb,
lib/llm/response.rb,
lib/llm/tracer/null.rb,
lib/llm/eventhandler.rb,
lib/llm/json_adapter.rb,
lib/llm/providers/xai.rb,
lib/llm/providers/zai.rb,
lib/llm/tracer/logger.rb,
lib/llm/providers/gemini.rb,
lib/llm/providers/ollama.rb,
lib/llm/providers/openai.rb,
lib/llm/tracer/telemetry.rb,
lib/llm/providers/deepseek.rb,
lib/llm/providers/llamacpp.rb,
lib/llm/providers/anthropic.rb
Defined Under Namespace
Modules: Client, Contract
Classes: Agent, Anthropic, Buffer, DeepSeek, Error, File, Function, Gemini, JSONAdapter, LlamaCpp, Message, Object, Ollama, OpenAI, Prompt, Provider, Response, Schema, ServerTool, Session, Tool, Tracer, Usage, XAI, ZAI
Constant Summary collapse
- Bot = Session
  Backward-compatible alias.
- UnauthorizedError = Class.new(Error)
  HTTPUnauthorized.
- RateLimitError = Class.new(Error)
  HTTPTooManyRequests.
- ServerError = Class.new(Error)
  HTTPServerError.
- FormatError = Class.new(Error)
  When given an input object that is not understood.
- PromptError = Class.new(FormatError)
  When given a prompt object that is not understood.
- InvalidRequestError = Class.new(Error)
  When given an invalid request.
- ContextWindowError = Class.new(InvalidRequestError)
  When the context window is exceeded.
- ToolLoopError = Class.new(Error)
  When stuck in a tool-call loop.
- VERSION = "4.1.0"
Class Method Summary collapse
- .File(obj) ⇒ LLM::File
- .json=(adapter) ⇒ void
  Sets the JSON adapter used by the library.
- .anthropic ⇒ Anthropic
  A new instance of Anthropic.
- .gemini ⇒ Gemini
  A new instance of Gemini.
- .ollama(key: nil) ⇒ Ollama
  A new instance of Ollama.
- .llamacpp(key: nil) ⇒ LLM::LlamaCpp
- .deepseek ⇒ LLM::DeepSeek
- .openai ⇒ OpenAI
  A new instance of OpenAI.
- .xai ⇒ XAI
  A new instance of XAI.
- .zai ⇒ ZAI
  A new instance of ZAI.
- .function(key, &b) ⇒ LLM::Function
  Define a function.
- .lock(name) ⇒ void
  Provides a thread-safe lock.
- .json ⇒ Class
  Returns the JSON adapter used by the library.
Class Method Details
.File(obj) ⇒ LLM::File
# File 'lib/llm/file.rb', line 82

def LLM.File(obj)
  case obj
  when File
    obj.close unless obj.closed?
    LLM.File(obj.path)
  when LLM::File, LLM::Response then obj
  when String then LLM::File.new(obj)
  else
    raise TypeError, "don't know how to handle #{obj.class} objects"
  end
end
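A short usage sketch based on the coercion rules above; the path is hypothetical:

  LLM.File("docs/diagram.png")             # String    => wrapped in a new LLM::File
  LLM.File(File.open("docs/diagram.png"))  # ::File    => handle is closed, its path is wrapped
  LLM.File(LLM.File("docs/diagram.png"))   # LLM::File => returned as-is
  LLM.File(42)                             # anything else raises TypeError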
.json=(adapter) ⇒ void
Note: this should be set once from the main thread when your program starts. Defaults to LLM::JSONAdapter::JSON.
This method returns an undefined value.
Sets the JSON adapter used by the library.
# File 'lib/llm.rb', line 52

def json=(adapter)
  @json = case adapter.to_s
          when "JSON", "json" then JSONAdapter::JSON
          when "Oj", "oj" then JSONAdapter::Oj
          when "Yajl", "yajl" then JSONAdapter::Yajl
          else
            is_class = Class === adapter
            is_subclass = is_class && adapter.ancestors.include?(LLM::JSONAdapter)
            if is_subclass
              adapter
            else
              raise TypeError, "Adapter must be a subclass of LLM::JSONAdapter"
            end
          end
end
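A minimal sketch of switching adapters, based on the case statement above. The built-in names require the corresponding gem (Oj or Yajl) to be installed; CustomAdapter is a hypothetical subclass that would still need to implement the adapter interface:

  LLM.json = "oj"            # or :oj — the value is compared via #to_s
  LLM.json = "yajl"
  class CustomAdapter < LLM::JSONAdapter; end
  LLM.json = CustomAdapter   # any LLM::JSONAdapter subclass is accepted
  LLM.json = 42              # raises TypeError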
.anthropic ⇒ Anthropic
Returns a new instance of Anthropic.
# File 'lib/llm.rb', line 71

def anthropic(**)
  lock(:require) { require_relative "llm/providers/anthropic" unless defined?(LLM::Anthropic) }
  LLM::Anthropic.new(**)
end
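The remaining provider constructors on this page follow the same pattern: a thread-safe lazy require, then instantiation with the keyword arguments forwarded untouched. A hedged sketch, assuming the provider accepts a key: keyword the way .ollama and .llamacpp below do:

  llm = LLM.anthropic(key: ENV["ANTHROPIC_API_KEY"])
  llm.class # => LLM::Anthropic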
.gemini ⇒ Gemini
Returns a new instance of Gemini.
# File 'lib/llm.rb', line 79

def gemini(**)
  lock(:require) { require_relative "llm/providers/gemini" unless defined?(LLM::Gemini) }
  LLM::Gemini.new(**)
end
.ollama(key: nil) ⇒ Ollama
Returns a new instance of Ollama.
# File 'lib/llm.rb', line 87

def ollama(key: nil, **)
  lock(:require) { require_relative "llm/providers/ollama" unless defined?(LLM::Ollama) }
  LLM::Ollama.new(key:, **)
end
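The key: keyword defaults to nil here because a local Ollama server typically requires no API key. This sketch relies only on the signature shown above:

  llm = LLM.ollama(key: nil)
  llm.class # => LLM::Ollama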
.llamacpp(key: nil) ⇒ LLM::LlamaCpp
# File 'lib/llm.rb', line 95

def llamacpp(key: nil, **)
  lock(:require) { require_relative "llm/providers/llamacpp" unless defined?(LLM::LlamaCpp) }
  LLM::LlamaCpp.new(key:, **)
end
.deepseek ⇒ LLM::DeepSeek
# File 'lib/llm.rb', line 103

def deepseek(**)
  lock(:require) { require_relative "llm/providers/deepseek" unless defined?(LLM::DeepSeek) }
  LLM::DeepSeek.new(**)
end
.openai ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/llm.rb', line 111

def openai(**)
  lock(:require) { require_relative "llm/providers/openai" unless defined?(LLM::OpenAI) }
  LLM::OpenAI.new(**)
end
.xai ⇒ XAI
Returns a new instance of XAI.
# File 'lib/llm.rb', line 120

def xai(**)
  lock(:require) { require_relative "llm/providers/xai" unless defined?(LLM::XAI) }
  LLM::XAI.new(**)
end
.zai ⇒ ZAI
Returns a new instance of ZAI.
# File 'lib/llm.rb', line 129

def zai(**)
  lock(:require) { require_relative "llm/providers/zai" unless defined?(LLM::ZAI) }
  LLM::ZAI.new(**)
end
.function(key, &b) ⇒ LLM::Function
Define a function
# File 'lib/llm.rb', line 149

def function(key, &b)
  LLM::Function.new(key, &b)
end
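As the source shows, the key and block are forwarded to LLM::Function.new, so the block's contents are governed by the LLM::Function API (documented on its own page). A minimal sketch with a hypothetical key:

  fn = LLM.function(:currency) do
    # configure the function via the LLM::Function API
  end
  fn.class # => LLM::Function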
.lock(name) ⇒ void
This method returns an undefined value.
Provides a thread-safe lock
# File 'lib/llm.rb', line 158

def lock(name, &) = @monitors[name].synchronize(&)
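The provider constructors above use this lock to make their lazy requires thread-safe; @monitors is presumably an internal hash of monitor objects keyed by name, so only names the library has registered (such as :require, used above) are meaningful. A sketch mirroring that usage:

  LLM.lock(:require) do
    require_relative "llm/providers/openai" unless defined?(LLM::OpenAI)
  end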
.json ⇒ Class
Returns the JSON adapter used by the library
# File 'lib/llm.rb', line 40

def json
  @json ||= JSONAdapter::JSON
end
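Reading the adapter back is a plain getter with a lazy default:

  LLM.json  # => LLM::JSONAdapter::JSON, unless .json= has been called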