Class: Durable::Llm::Client
- Inherits:
- Object
- Defined in:
- lib/durable/llm/client.rb
Overview
Unified interface for interacting with different LLM providers
The Client class is a facade that delegates operations such as completion, chat, embedding, and streaming to the appropriate provider instance. It handles parameter processing and model configuration, and it offers convenience methods for quick text completion. The client resolves the provider class automatically from the provider name and manages default parameters, including model selection.
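The delegation and default-model handling described above can be sketched with a stand-in provider. FakeProvider, ClientSketch, and the canned reply below are assumptions for demonstration, not part of the library:

```ruby
# Illustrative sketch of the facade pattern described above.
class FakeProvider
  def completion(params)
    { model: params[:model], text: "echo: #{params.dig(:messages, 0, :content)}" }
  end
end

class ClientSketch
  def initialize(provider, model: nil)
    @provider = provider
    @model = model
  end

  # Merge the client's default model into each request, then delegate
  # to the provider -- the core behavior the Overview describes.
  def completion(params = {})
    defaults = @model ? { model: @model } : {}
    @provider.completion(defaults.merge(params))
  end
end

client = ClientSketch.new(FakeProvider.new, model: 'fake-model')
result = client.completion(messages: [{ role: 'user', content: 'hi' }])
# result[:model] == "fake-model"; result[:text] == "echo: hi"
```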
Instance Attribute Summary
- #model ⇒ String?
  The default model to use for requests.
- #provider ⇒ Object (readonly)
  The underlying provider instance.
Instance Method Summary
- #chat(params = {}) ⇒ Object
  Performs a chat completion request (alias for completion).
- #clone_with(**options) ⇒ Client
  Creates a copy of the client with different configuration.
- #complete(text, _opts = {}) ⇒ String (also: #quick_complete)
  Performs a text completion with minimal configuration.
- #completion(params = {}) ⇒ Object
  Performs a completion request.
- #default_params ⇒ Hash
  Returns the default parameters to merge with request options.
- #embed(params = {}) ⇒ Object
  Performs an embedding request.
- #initialize(provider_name, options = {}) ⇒ Client (constructor)
  Initializes a new LLM client for the specified provider.
- #stream(params = {}) {|Object| ... } ⇒ Object
  Performs a streaming completion request.
- #stream? ⇒ Boolean
  Checks if the provider supports streaming.
- #with_max_tokens(tokens) ⇒ Client
  Sets max tokens for the next request (fluent interface).
- #with_model(model_name) ⇒ Client
  Sets the model for subsequent requests (fluent interface).
- #with_temperature(temp) ⇒ Client
  Sets temperature for the next request (fluent interface).
Constructor Details
#initialize(provider_name, options = {}) ⇒ Client
Initializes a new LLM client for the specified provider
# File 'lib/durable/llm/client.rb', line 42

def initialize(provider_name, options = {})
  if provider_name.nil? || provider_name.to_s.strip.empty?
    raise ArgumentError, 'provider_name cannot be nil or empty. Supported providers: ' \
                         "#{Durable::Llm::Providers.available_providers.join(', ')}"
  end
  raise ArgumentError, 'options must be a Hash' unless options.is_a?(Hash)

  @model = options.delete('model') || options.delete(:model) if options.key?('model') || options.key?(:model)

  provider_class = Durable::Llm::Providers.provider_class_for(provider_name)
  @provider = provider_class.new(**options)
end
Instance Attribute Details
#model ⇒ String?
Returns The default model to use for requests.
# File 'lib/durable/llm/client.rb', line 27

def model
  @model
end
#provider ⇒ Object (readonly)
Returns The underlying provider instance.
# File 'lib/durable/llm/client.rb', line 24

def provider
  @provider
end
Instance Method Details
#chat(params = {}) ⇒ Object
Performs a chat completion request (alias for completion)
# File 'lib/durable/llm/client.rb', line 141

def chat(params = {})
  raise ArgumentError, 'params must be a Hash' unless params.is_a?(Hash)

  @provider.completion(process_params(params))
end
#clone_with(**options) ⇒ Client
Creates a copy of the client with different configuration
# File 'lib/durable/llm/client.rb', line 251

def clone_with(**options)
  provider_name = @provider.class.name.split('::').last.downcase.to_sym
  self.class.new(provider_name, options.merge(model: @model))
end
#complete(text, _opts = {}) ⇒ String Also known as: quick_complete
Performs a text completion with minimal configuration
# File 'lib/durable/llm/client.rb', line 76

def complete(text, _opts = {})
  if text.nil? || text.to_s.strip.empty?
    raise ArgumentError, 'text cannot be nil or empty. Provide a non-empty string for completion.'
  end

  response = completion(process_params(messages: [{ role: 'user', content: text }]))
  choice = response.choices.first
  unless choice
    raise IndexError, 'No completion choices returned from the API. This may indicate an ' \
                      'API error or invalid request parameters.'
  end

  message = choice.message
  unless message
    raise NoMethodError, 'Response choice has no message. The API response format may be ' \
                         'unexpected or the provider may have changed their response structure.'
  end

  content = message.content
  unless content
    raise NoMethodError, 'Response message has no content. This may occur if the model ' \
                         'refused to respond or if content filtering was applied.'
  end

  content
end
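The unwrapping that #complete performs on the provider response can be shown in isolation with Struct stand-ins. Message, Choice, Response, and unwrap below are illustrative, not the library's actual response classes:

```ruby
# Struct stand-ins mirroring the response shape #complete expects.
Message  = Struct.new(:content)
Choice   = Struct.new(:message)
Response = Struct.new(:choices)

def unwrap(response)
  choice = response.choices.first
  raise IndexError, 'no completion choices returned' unless choice

  message = choice.message
  raise NoMethodError, 'response choice has no message' unless message

  content = message.content
  raise NoMethodError, 'response message has no content' unless content

  content
end

ok = Response.new([Choice.new(Message.new('Hello!'))])
puts unwrap(ok) # prints "Hello!"

begin
  unwrap(Response.new([]))
rescue IndexError => e
  puts e.message # prints "no completion choices returned"
end
```

Raising at each step, rather than returning nil, turns a malformed provider response into an immediate, descriptive error.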
#completion(params = {}) ⇒ Object
Performs a completion request
# File 'lib/durable/llm/client.rb', line 124

def completion(params = {})
  raise ArgumentError, 'params must be a Hash' unless params.is_a?(Hash)

  @provider.completion(process_params(params))
end
#default_params ⇒ Hash
Returns the default parameters to merge with request options
# File 'lib/durable/llm/client.rb', line 59

def default_params
  @model ? { model: @model } : {}
end
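A quick sketch of how these defaults combine with per-request params. The override order (an explicit request param winning over the client default) is an assumption for illustration, as is the process_params re-implementation:

```ruby
# Illustrative re-implementation of default-parameter merging.
def default_params(model)
  model ? { model: model } : {}
end

def process_params(model, params)
  default_params(model).merge(params)
end

a = process_params('fake-model', messages: [])
b = process_params('fake-model', model: 'other-model', messages: [])
# a[:model] == "fake-model"  -- client default applied
# b[:model] == "other-model" -- explicit request param wins
```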
#embed(params = {}) ⇒ Object
Performs an embedding request
# File 'lib/durable/llm/client.rb', line 162

def embed(params = {})
  raise ArgumentError, 'params must be a Hash' unless params.is_a?(Hash)

  @provider.embedding(**process_params(params))
rescue NotImplementedError
  provider_name = @provider.class.name.split('::').last
  raise NotImplementedError, "#{provider_name} does not support embeddings. " \
                             'Try using a provider like OpenAI that offers embedding models.'
end
#stream(params = {}) {|Object| ... } ⇒ Object
Performs a streaming completion request
# File 'lib/durable/llm/client.rb', line 189

def stream(params = {}, &block)
  raise ArgumentError, 'params must be a Hash' unless params.is_a?(Hash)
  unless block_given?
    raise ArgumentError, 'block required for streaming. Use: client.stream(params) { |chunk| ... }'
  end

  @provider.stream(process_params(params), &block)
rescue NotImplementedError
  provider_name = @provider.class.name.split('::').last
  raise NotImplementedError, "#{provider_name} does not support streaming. " \
                             'Try using completion() or chat() instead.'
end
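The streaming contract is that the provider yields each chunk to the caller's block. FakeStreamingProvider below is an illustrative stand-in, not a library class:

```ruby
# Stand-in provider that yields chunks, mirroring the stream contract.
class FakeStreamingProvider
  def stream(_params)
    ['Hel', 'lo'].each { |chunk| yield chunk }
  end

  def stream?
    true
  end
end

provider = FakeStreamingProvider.new
buffer = +''
# Guard with stream? as the client does before delegating.
provider.stream({}) { |chunk| buffer << chunk } if provider.stream?
puts buffer # prints "Hello"
```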
#stream? ⇒ Boolean
Checks if the provider supports streaming
# File 'lib/durable/llm/client.rb', line 205

def stream?
  @provider.stream?
end
#with_max_tokens(tokens) ⇒ Client
Sets max tokens for the next request (fluent interface)
# File 'lib/durable/llm/client.rb', line 238

def with_max_tokens(tokens)
  @next_max_tokens = tokens
  self
end
#with_model(model_name) ⇒ Client
Sets the model for subsequent requests (fluent interface)
# File 'lib/durable/llm/client.rb', line 216

def with_model(model_name)
  @model = model_name
  self
end
#with_temperature(temp) ⇒ Client
Sets temperature for the next request (fluent interface)
# File 'lib/durable/llm/client.rb', line 227

def with_temperature(temp)
  @next_temperature = temp
  self
end
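The fluent interface works because each with_* setter records state and returns self, so calls chain. FluentSketch below is illustrative; per the source above, the real client stores temperature and max tokens in @next_* variables for the following request:

```ruby
# Minimal sketch of a fluent builder: every setter returns self.
class FluentSketch
  attr_reader :model, :temperature, :max_tokens

  def with_model(name)
    @model = name
    self
  end

  def with_temperature(temp)
    @temperature = temp
    self
  end

  def with_max_tokens(tokens)
    @max_tokens = tokens
    self
  end
end

c = FluentSketch.new.with_model('fake-model').with_temperature(0.2).with_max_tokens(256)
# c.model == "fake-model"; c.temperature == 0.2; c.max_tokens == 256
```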