Class: AiClient
- Inherits: Object
- Defined in:
- lib/ai_client/chat.rb,
lib/ai_client/embed.rb,
lib/ai_client/speak.rb,
lib/ai_client/version.rb,
lib/ai_client/function.rb,
lib/ai_client/middleware.rb,
lib/ai_client/transcribe.rb,
lib/ai_client/configuration.rb,
lib/ai_client/retry_middleware.rb,
lib/ai_client/logger_middleware.rb,
lib/ai_client/ollama_extensions.rb,
lib/ai_client/open_router_extensions.rb
Defined Under Namespace
Classes: Config, Function, LoggingMiddleware, RetryMiddleware, Tool
Constant Summary
- VERSION = "0.4.6"
Class Attribute Summary
- .class_config ⇒ Object
  Returns the value of attribute class_config.
- .default_config ⇒ Object
  Returns the value of attribute default_config.
Class Method Summary
- .add_ollama_extensions ⇒ void
  Initializes Ollama extensions for AiClient.
- .add_open_router_extensions ⇒ void
  Initializes OpenRouter extensions for AiClient.
- .clear_middlewares ⇒ void
  Clears all middlewares from the client.
- .configure {|config| ... } ⇒ void
  Configures the AiClient with a given block.
- .middlewares ⇒ Array
  Returns the list of middlewares applied to the client.
- .model_details(model_id) ⇒ AiClient::LLM?
  Retrieves details for a specific model.
- .models(substring = nil) ⇒ Array<String>
  Retrieves model names, optionally filtered by provider.
- .ollama_available_models(host = nil) ⇒ Array<Hash>
  Retrieves the available models from the Ollama server.
- .ollama_client ⇒ Ollama::Client
  Retrieves the Ollama client instance.
- .ollama_host ⇒ String
  Gets the configured Ollama host URL.
- .ollama_model_exists?(model_name, host = nil) ⇒ Boolean
  Checks if a specific model exists on the Ollama server.
- .orc_client ⇒ OpenRouter::Client
  Retrieves the ORC client instance.
- .providers ⇒ Array<Symbol>
  Retrieves all available providers.
- .reset_default_config ⇒ void
  Resets the default configuration to the value defined in the class.
- .use(middleware) ⇒ void
  Adds a middleware to the stack.
- .version ⇒ Object
Instance Method Summary
- #add_context(my_prompt) ⇒ String+
  Adds context to the current prompt.
- #batch_embed(inputs, batch_size: 100, **params) ⇒ Object
- #call_with_middlewares(method, *args, **kwargs, &block) ⇒ Object
  Calls the specified method with middlewares applied.
- #chat(messages = '', **params, &block) ⇒ String
  Initiates a chat session.
- #chat_without_middlewares(messages, **params, &block) ⇒ String
  Chats with the client without middleware processing.
- #chatbot(prompt = 'hello') ⇒ Object
  Just an example …
- #clear_context ⇒ void
  Clears the current context.
- #embed(input, **params) ⇒ Object
- #model_details ⇒ Hash?
  Retrieves details for the current model.
- #models ⇒ Array<String>
  Retrieves model names for the current provider.
- #speak(text, **params) {|output| ... } ⇒ Tempfile
- #speak_without_middlewares(text, **params) ⇒ Object
- #transcribe(audio, format: nil, **params) ⇒ Object
- #transcribe_without_middlewares(audio, format: nil, **params) ⇒ Object
- #version ⇒ Object
Class Attribute Details
.class_config ⇒ Object
Returns the value of attribute class_config.
# File 'lib/ai_client/configuration.rb', line 127

def class_config
  @class_config
end
.default_config ⇒ Object
Returns the value of attribute default_config.
# File 'lib/ai_client/configuration.rb', line 127

def default_config
  @default_config
end
Class Method Details
.add_ollama_extensions ⇒ void
This method returns an undefined value.
Initializes Ollama extensions for AiClient.
This sets up the access token and initializes the ORC client.
# File 'lib/ai_client/ollama_extensions.rb', line 84

def add_ollama_extensions
  access_token = fetch_access_token
  return unless access_token

  configure_ollama(access_token)
  initialize_ollama_client
end
.add_open_router_extensions ⇒ void
This method returns an undefined value.
Initializes OpenRouter extensions for AiClient.
This sets up the access token and initializes the ORC client.
# File 'lib/ai_client/open_router_extensions.rb', line 74

def add_open_router_extensions
  access_token = fetch_access_token
  return unless access_token

  configure_open_router(access_token)
  initialize_orc_client
end
.clear_middlewares ⇒ void
This method returns an undefined value.
Clears all middlewares from the client.
# File 'lib/ai_client/middleware.rb', line 56

def clear_middlewares
  @middlewares = []
end
.configure {|config| ... } ⇒ void
This method returns an undefined value.
Configures the AiClient with a given block.
# File 'lib/ai_client/configuration.rb', line 134

def configure(&block)
  yield(class_config)
end
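Since `configure` simply yields the class-level config object to the block, the pattern can be sketched standalone. `DemoClient` and the `timeout` attribute below are illustrative stand-ins, not the gem's actual configuration options:

```ruby
# Toy stand-in for a class-level configuration object; the real
# AiClient::Config has its own attribute set.
DemoConfig = Struct.new(:logger, :timeout)

class DemoClient
  @class_config = DemoConfig.new(nil, nil)

  class << self
    attr_accessor :class_config

    # Mirrors AiClient.configure: yields the shared config to the block.
    def configure(&block)
      yield(class_config)
    end
  end
end

DemoClient.configure { |config| config.timeout = 30 }
puts DemoClient.class_config.timeout  # 30
```

Because the same object is yielded every time, repeated `configure` calls accumulate settings rather than replacing them.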
.middlewares ⇒ Array
Returns the list of middlewares applied to the client.
# File 'lib/ai_client/middleware.rb', line 38

def middlewares
  @middlewares ||= []
end
.model_details(model_id) ⇒ AiClient::LLM?
Retrieves details for a specific model.
# File 'lib/ai_client/ollama_extensions.rb', line 63

def model_details(model_id) = LLM.find(model_id.downcase)
.models(substring = nil) ⇒ Array<String>
Retrieves model names, optionally filtered by provider.
# File 'lib/ai_client/ollama_extensions.rb', line 56

def models(substring = nil) = LLM.models(substring)
.ollama_available_models(host = nil) ⇒ Array<Hash>
Retrieves the available models from the Ollama server.
# File 'lib/ai_client/ollama_extensions.rb', line 109

def ollama_available_models(host = nil)
  host ||= ollama_host
  uri = URI("#{host}/api/tags")
  response = Net::HTTP.get_response(uri)

  if response.is_a?(Net::HTTPSuccess)
    JSON.parse(response.body)["models"] rescue []
  else
    []
  end
end
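The method issues a GET to the server's `/api/tags` endpoint and returns the `"models"` array from the JSON body, falling back to `[]` on any failure. The parsing step can be exercised offline; the payload below is a hand-written example shaped like a tags response, not captured server output:

```ruby
require 'json'

# Hand-written example body; a real Ollama /api/tags response carries
# additional per-model fields (size, digest, modified_at, ...).
body = <<~JSON
  { "models": [ { "name": "llama3:latest" }, { "name": "mistral:7b" } ] }
JSON

# Same extraction as ollama_available_models: take the "models" array,
# shielded by an inline rescue so malformed JSON yields [].
models = JSON.parse(body)["models"] rescue []
names  = models.map { |m| m["name"] }
puts names.inspect  # ["llama3:latest", "mistral:7b"]
```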
.ollama_client ⇒ Ollama::Client
Retrieves the Ollama client instance.
# File 'lib/ai_client/ollama_extensions.rb', line 98

def ollama_client
  @ollama_client ||= initialize_ollama_client
end
.ollama_host ⇒ String
Gets the configured Ollama host URL.
# File 'lib/ai_client/ollama_extensions.rb', line 125

def ollama_host
  class_config.providers[:ollama]&.dig(:host) || 'http://localhost:11434'
end
.ollama_model_exists?(model_name, host = nil) ⇒ Boolean
Checks if a specific model exists on the Ollama server.
# File 'lib/ai_client/ollama_extensions.rb', line 136

def ollama_model_exists?(model_name, host = nil)
  models = ollama_available_models(host)
  models.any? { |m| m['name'] == model_name }
end
.orc_client ⇒ OpenRouter::Client
Retrieves the ORC client instance.
# File 'lib/ai_client/open_router_extensions.rb', line 89

def orc_client
  @orc_client ||= add_open_router_extensions || raise("OpenRouter extensions are not available")
end
.providers ⇒ Array<Symbol>
Retrieves all available providers.
# File 'lib/ai_client/ollama_extensions.rb', line 49

def providers = LLM.providers
.reset_default_config ⇒ void
This method returns an undefined value.
Resets the default configuration to the value defined in the class.
# File 'lib/ai_client/configuration.rb', line 142

def reset_default_config
  initialize_defaults
    .save(Config::DEFAULT_CONFIG_FILEPATH)
end
.use(middleware) ⇒ void
This method returns an undefined value.
Adds a middleware to the stack.
# File 'lib/ai_client/middleware.rb', line 48

def use(middleware)
  middlewares << middleware
end
.version ⇒ Object
# File 'lib/ai_client/version.rb', line 7

def self.version = VERSION
Instance Method Details
#add_context(my_prompt) ⇒ String+
Adds context to the current prompt.
# File 'lib/ai_client/chat.rb', line 59

def add_context(my_prompt)
  return(my_prompt) if @config.context_length.nil? ||
                       0 == @config.context_length ||
                       my_prompt.is_a?(Array) ||
                       @context.empty?

  prompt = "\nUser: #{my_prompt} Bot: "
  prompt.prepend(
    @context.map { |entry|
      "User: #{entry[:user]} Bot: #{entry[:bot]}"
    }.join("\n")
  )

  prompt
end
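The context handling can be traced standalone: prior user/bot exchanges are joined with newlines and prepended to the new prompt, leaving a trailing "Bot: " for the model to complete. The conversation below is a made-up example:

```ruby
# Illustrative context entries, shaped like the gem's @context array.
context = [
  { user: 'What is 2+2?', bot: '4' },
  { user: 'Double it.',   bot: '8' }
]

my_prompt = 'Now square it.'

# Same string assembly as #add_context: new prompt first, then the
# transcript of earlier turns prepended in front of it.
prompt = "\nUser: #{my_prompt} Bot: "
prompt.prepend(
  context.map { |entry|
    "User: #{entry[:user]} Bot: #{entry[:bot]}"
  }.join("\n")
)

puts prompt
```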
#batch_embed(inputs, batch_size: 100, **params) ⇒ Object
# File 'lib/ai_client/embed.rb', line 14

def batch_embed(inputs, batch_size: 100, **params)
  inputs.each_slice(batch_size).flat_map do |batch|
    sleep 1 # DEBUG rate limits being exceeded
    embed(batch, **params)
  end
end
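The batching strategy is plain `each_slice`/`flat_map`: inputs are split into groups of `batch_size`, each group is embedded, and per-batch results are flattened back into one array. A runnable sketch with `fake_embed` standing in for the real embedding call:

```ruby
# fake_embed is a stand-in for the provider call; it returns one toy
# "vector" (the input's length) per element of the batch.
def fake_embed(batch)
  batch.map { |text| [text.length] }
end

inputs = %w[alpha beta gamma delta epsilon]
batch_size = 2

# Same shape as #batch_embed: slice, embed each slice, flatten results.
vectors = inputs.each_slice(batch_size).flat_map do |batch|
  fake_embed(batch)
end

puts vectors.inspect  # [[5], [4], [5], [5], [7]]
```

Note the `sleep 1` in the real method: it throttles requests because batch embedding can trip provider rate limits.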
#call_with_middlewares(method, *args, **kwargs, &block) ⇒ Object
Calls the specified method with middlewares applied.
# File 'lib/ai_client/middleware.rb', line 24

def call_with_middlewares(method, *args, **kwargs, &block)
  stack = self.class.middlewares.reverse.reduce(-> { send(method, *args, **kwargs, &block) }) do |next_middleware, middleware|
    -> { middleware.call(self, next_middleware, *args, **kwargs, &block) }
  end
  stack.call
end
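The `reverse.reduce` builds the call chain innermost-first, so middlewares wrap the core call in registration order: the first middleware registered with `.use` runs outermost. A minimal standalone sketch with illustrative lambda middlewares:

```ruby
# Each "middleware" takes the next callable and returns a wrapped callable.
middlewares = [
  ->(next_mw) { -> { "outer(#{next_mw.call})" } },  # registered first
  ->(next_mw) { -> { "inner(#{next_mw.call})" } }   # registered second
]

# Stand-in for the wrapped core method (e.g. chat_without_middlewares).
core = -> { 'chat' }

# Reverse so the last-registered middleware wraps the core directly and
# the first-registered one ends up outermost, as in call_with_middlewares.
stack = middlewares.reverse.reduce(core) do |next_middleware, middleware|
  middleware.call(next_middleware)
end

puts stack.call  # "outer(inner(chat))"
```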
#chat(messages = '', **params, &block) ⇒ String
OmniAI Params
model: @model [String] optional
format: @format [Symbol] optional :text or :json
stream: @stream [Proc, nil] optional
tools: @tools [Array<OmniAI::Tool>] optional
temperature: @temperature [Float, nil] optional
Initiates a chat session.
# File 'lib/ai_client/chat.rb', line 22

def chat(messages = '', **params, &block)
  if params.has_key? :tools
    tools = params[:tools]
    if tools.is_a? Array
      tools.map! { |function_name| AiClient::Function.registry[function_name] }
    elsif true == tools
      tools = AiClient::Function.registry.values
    else
      raise 'what is this'
    end
    params[:tools] = tools
  end

  @last_messages = messages
  messages = add_context(messages)
  result = call_with_middlewares(:chat_without_middlewares, messages, **params, &block)
  @last_response = result
  result = raw? ? result : content
  @context = @context.push({ user: @last_messages, bot: result }).last(config.context_length)

  result
end
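The tools-resolution step at the top of `chat` accepts either an Array of registered function names (mapped to their registry entries) or `true` (selects every registered function). A standalone sketch of that branching, with `DemoRegistry` as a toy stand-in for `AiClient::Function.registry`:

```ruby
# Toy registry: name => callable. The real registry maps names to
# AiClient::Function instances.
DemoRegistry = {
  weather: ->(city) { "sunny in #{city}" },
  time:    ->(_)    { 'noon' }
}

# Mirrors the branching in #chat's tools handling.
def resolve_tools(tools)
  if tools.is_a?(Array)
    tools.map { |name| DemoRegistry[name] }
  elsif tools == true
    DemoRegistry.values
  else
    raise ArgumentError, "unsupported tools value: #{tools.inspect}"
  end
end

picked = resolve_tools([:weather])
puts picked.first.call('Oslo')  # "sunny in Oslo"
puts resolve_tools(true).size   # 2
```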
#chat_without_middlewares(messages, **params, &block) ⇒ String
Chats with the client without middleware processing.
# File 'lib/ai_client/chat.rb', line 90

def chat_without_middlewares(messages, **params, &block)
  @client.chat(messages, model: @model, **params, &block)
end
#chatbot(prompt = 'hello') ⇒ Object
Just an example …
# File 'lib/ai_client/chat.rb', line 96

def chatbot(prompt = 'hello')
  until prompt.empty? do
    response = chat prompt
    print "\n\n#{content}\n\nFollow Up: "
    prompt = gets.chomp
  end
end
#clear_context ⇒ void
This method returns an undefined value.
Clears the current context.
# File 'lib/ai_client/chat.rb', line 80

def clear_context
  @context = []
end
#embed(input, **params) ⇒ Object
OmniAI Params
model [String] required
# File 'lib/ai_client/embed.rb', line 10

def embed(input, **params)
  @client.embed(input, model: @model, **params)
end
#model_details ⇒ Hash?
Retrieves details for the current model.
# File 'lib/ai_client/ollama_extensions.rb', line 33

def model_details
  id = "#{@provider}/#{@model}"
  LLM.find(id.downcase)
end
#models ⇒ Array<String>
Retrieves model names for the current provider.
# File 'lib/ai_client/ollama_extensions.rb', line 41

def models = LLM.models(@provider)
#speak(text, **params) {|output| ... } ⇒ Tempfile
OmniAI Params
input [String] required
model [String] required
voice [String] required
speed [Float] optional
format [String] optional (default "aac")
aac mp3 flac opus pcm wav
# File 'lib/ai_client/speak.rb', line 18

def speak(text, **params)
  call_with_middlewares(:speak_without_middlewares, text, **params)
end
#speak_without_middlewares(text, **params) ⇒ Object
# File 'lib/ai_client/speak.rb', line 22

def speak_without_middlewares(text, **params)
  @client.speak(text, model: @model, **params)
end
#transcribe(audio, format: nil, **params) ⇒ Object
OmniAI Params
model [String]
language [String, nil] optional
prompt [String, nil] optional
format [Symbol] :text, :srt, :vtt, or :json (default)
temperature [Float, nil] optional
# File 'lib/ai_client/transcribe.rb', line 13

def transcribe(audio, format: nil, **params)
  call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
end
#transcribe_without_middlewares(audio, format: nil, **params) ⇒ Object
# File 'lib/ai_client/transcribe.rb', line 17

def transcribe_without_middlewares(audio, format: nil, **params)
  @client.transcribe(audio, model: @model, format: format, **params)
end
#version ⇒ Object
# File 'lib/ai_client/version.rb', line 6

def version = VERSION