Module: Durable::Llm

Defined in:
lib/durable/llm.rb,
lib/durable/llm/cli.rb,
lib/durable/llm/client.rb,
lib/durable/llm/errors.rb,
lib/durable/llm/version.rb,
lib/durable/llm/providers.rb,
lib/durable/llm/configuration.rb,
lib/durable/llm/providers/xai.rb,
lib/durable/llm/providers/base.rb,
lib/durable/llm/providers/groq.rb,
lib/durable/llm/providers/cohere.rb,
lib/durable/llm/providers/google.rb,
lib/durable/llm/providers/openai.rb,
lib/durable/llm/response_helpers.rb,
lib/durable/llm/providers/mistral.rb,
lib/durable/llm/provider_utilities.rb,
lib/durable/llm/providers/deepseek.rb,
lib/durable/llm/providers/opencode.rb,
lib/durable/llm/providers/together.rb,
lib/durable/llm/providers/anthropic.rb,
lib/durable/llm/providers/fireworks.rb,
lib/durable/llm/providers/openrouter.rb,
lib/durable/llm/providers/perplexity.rb,
lib/durable/llm/providers/huggingface.rb,
lib/durable/llm/providers/azure_openai.rb

Overview

The Llm module provides a unified interface for Large Language Model operations.

This module serves as the main entry point for the Durable LLM gem, offering:

  • Global configuration management

  • Provider-agnostic client creation

  • Convenience methods for common operations

  • Access to version information

The module maintains a singleton configuration instance that can be customized to set API keys, default providers, and other global settings.

Examples:

Basic setup and usage

Durable::Llm.configure do |config|
  config.openai.api_key = 'sk-...'
end

client = Durable::Llm.new(:openai)
response = client.complete('Hello!')

Defined Under Namespace

Modules: ProviderUtilities, Providers, ResponseHelpers Classes: APIError, AuthenticationError, CLI, Client, Configuration, ConfigurationError, Error, InsufficientQuotaError, InvalidRequestError, InvalidResponseError, ModelNotFoundError, NetworkError, RateLimitError, ResourceNotFoundError, ServerError, StreamingError, TimeoutError, UnsupportedProviderError

Constant Summary collapse

VERSION =
'0.1.6'

Class Attribute Summary collapse

Class Method Summary collapse

Class Attribute Details

.configuration ⇒ Configuration

Returns The global configuration instance.

Returns:

  • (Configuration)

    The global configuration instance

# File 'lib/durable/llm.rb', line 101

def configuration
  @configuration
end

Class Method Details

.chat(messages, provider: :openai, model: nil, **options) ⇒ Object

Creates a chat completion with minimal setup.

This is a convenience method for quick chat interactions that automatically creates a client and performs the chat completion.

Examples:

Simple chat

response = Durable::Llm.chat(
  [{ role: 'user', content: 'Hello!' }],
  model: 'gpt-4'
)
puts response.choices.first.message.content

Parameters:

  • messages (Array<Hash>)

    Array of message hashes with :role and :content

  • provider (Symbol) (defaults to: :openai)

    The provider to use (default: :openai)

  • model (String) (defaults to: nil)

    The model to use (required)

  • options (Hash)

    Additional options for the client and request

Returns:

  • (Object)

    The chat response object

Raises:

  • (ArgumentError)

    If required parameters are missing



# File 'lib/durable/llm.rb', line 201

def self.chat(messages, provider: :openai, model: nil, **options)
  raise ArgumentError, 'messages are required' if messages.nil? || messages.empty?
  raise ArgumentError, 'model is required' if model.nil? || model.to_s.strip.empty?

  request_keys = %i[temperature max_tokens top_p frequency_penalty presence_penalty]
  request_params = options.select { |k, _| request_keys.include?(k) }
  client_options = options.reject { |k, _| request_keys.include?(k) }

  client = new(provider, client_options.merge(model: model))
  client.chat(messages: messages, **request_params)
end
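The request/client option split performed above can be sketched standalone. This is a minimal illustration of the `Hash#select`/`Hash#reject` partition, not gem code:

```ruby
# Keys routed into the request itself; everything else configures the client.
REQUEST_KEYS = %i[temperature max_tokens top_p frequency_penalty presence_penalty]

options = { temperature: 0.7, max_tokens: 100, api_key: 'sk-...', timeout: 30 }

# Partition one options hash into request parameters and client options.
request_params = options.select { |k, _| REQUEST_KEYS.include?(k) }
client_options = options.reject { |k, _| REQUEST_KEYS.include?(k) }

request_params # => { temperature: 0.7, max_tokens: 100 }
client_options # => { api_key: 'sk-...', timeout: 30 }
```

This lets callers pass generation parameters and client configuration in a single `**options` splat, with each value reaching the right layer.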

.complete(text, provider: :openai, model: nil, **options) ⇒ String

Creates a quick completion with minimal setup.

This is a convenience method for one-off completions that automatically creates a client, performs the completion, and returns the text result.

Examples:

Quick completion with OpenAI

result = Durable::Llm.complete('What is Ruby?', model: 'gpt-4')
puts result

Quick completion with Anthropic

result = Durable::Llm.complete('Explain AI', provider: :anthropic, model: 'claude-3-opus-20240229')
puts result

Parameters:

  • text (String)

    The input text to complete

  • provider (Symbol) (defaults to: :openai)

    The provider to use (default: :openai)

  • model (String) (defaults to: nil)

    The model to use (required)

  • options (Hash)

    Additional options for the client

Returns:

  • (String)

    The completion text

Raises:

  • (ArgumentError)

    If required parameters are missing



# File 'lib/durable/llm.rb', line 176

def self.complete(text, provider: :openai, model: nil, **options)
  raise ArgumentError, 'text is required' if text.nil? || text.to_s.strip.empty?
  raise ArgumentError, 'model is required' if model.nil? || model.to_s.strip.empty?

  client = new(provider, options.merge(model: model))
  client.complete(text)
end

.config ⇒ Configuration

Returns the current configuration instance.

This is an alias for the configuration accessor, provided for convenience.

Returns:

  • (Configuration)

    The current configuration instance
See Also:

  • #configuration


# File 'lib/durable/llm.rb', line 109

def config
  configuration
end

.configure {|configuration| ... } ⇒ void

This method returns an undefined value.

Configures the global LLM settings.

This method initializes or yields the global configuration instance, allowing you to set API keys, default providers, and other global options.

Examples:

Configure API keys

Durable::Llm.configure do |config|
  config.openai.api_key = 'sk-...'
  config.anthropic.api_key = 'sk-ant-...'
  config.default_provider = 'openai'
end

Configure from environment

# Environment variables are automatically loaded
ENV['DLLM__OPENAI__API_KEY'] = 'sk-...'
Durable::Llm.configure do |config|
  # Additional programmatic configuration
end

Yields:

  • (configuration)

    Yields the global configuration instance for modification
Yield Parameters:

  • configuration (Configuration)

    The global configuration object



# File 'lib/durable/llm.rb', line 154

def self.configure
  self.configuration ||= Configuration.new
  yield(configuration)
end
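The `||=` above means the configuration object is created on the first `configure` call and reused on every later one. A minimal standalone sketch of this memoization pattern, using hypothetical `App` and `Config` classes rather than the gem's own:

```ruby
# Hypothetical stand-ins for Durable::Llm and its Configuration class.
class Config
  attr_accessor :default_provider
end

class App
  class << self
    attr_accessor :configuration

    # Initialize the configuration once, then yield the same instance.
    def configure
      self.configuration ||= Config.new
      yield(configuration)
    end
  end
end

App.configure { |c| c.default_provider = 'openai' }
first = App.configuration
App.configure { |c| c.default_provider = 'anthropic' }

# The same instance is reused across configure calls.
App.configuration.equal?(first) # => true
```

Because the instance persists, repeated `configure` blocks layer settings onto one shared configuration instead of resetting it.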

.models(provider = :openai, **options) ⇒ Array<String>

Lists available models for a provider.

Examples:

List OpenAI models

models = Durable::Llm.models(:openai)
puts models.inspect

Parameters:

  • provider (Symbol) (defaults to: :openai)

    The provider name (default: :openai)

  • options (Hash)

    Provider options (e.g., api_key)

Returns:

  • (Array<String>)

    List of available model IDs



# File 'lib/durable/llm.rb', line 221

def self.models(provider = :openai, **options)
  client = new(provider, options)
  client.provider.models
end

.new(provider, options = {}) ⇒ Client

Creates a new LLM client for the specified provider.

This is a convenience method that creates a new Client instance with the given provider and options. It is equivalent to calling `Durable::Llm::Client.new(provider, options)`.

Examples:

Create an OpenAI client

client = Durable::Llm.new(:openai, api_key: 'sk-...', model: 'gpt-4')

Create an Anthropic client

client = Durable::Llm.new(:anthropic, api_key: 'sk-ant-...')

Parameters:

  • provider (Symbol, String)

    The provider name (e.g., :openai, :anthropic)

  • options (Hash) (defaults to: {})

    Configuration options for the client

Options Hash (options):

  • :model (String)

    The default model to use

  • :api_key (String)

    API key for authentication

Returns:

  • (Client)

    A new client instance

Raises:

  • (NameError)

    If the provider is not found



# File 'lib/durable/llm.rb', line 129

def new(provider, options = {})
  Client.new(provider, options)
end