Class: AiClient

Inherits:
Object
Defined in:
lib/ai_client/chat.rb,
lib/ai_client/embed.rb,
lib/ai_client/speak.rb,
lib/ai_client/version.rb,
lib/ai_client/function.rb,
lib/ai_client/middleware.rb,
lib/ai_client/transcribe.rb,
lib/ai_client/configuration.rb,
lib/ai_client/retry_middleware.rb,
lib/ai_client/logger_middleware.rb,
lib/ai_client/ollama_extensions.rb,
lib/ai_client/open_router_extensions.rb

Overview

ai_client/logger_middleware.rb

Defined Under Namespace

Classes: Config, Function, LoggingMiddleware, RetryMiddleware, Tool

Constant Summary

VERSION =
"0.4.6"

Class Attribute Summary

Class Method Summary

Instance Method Summary

Class Attribute Details

.class_config ⇒ Object

Returns the value of attribute class_config.



# File 'lib/ai_client/configuration.rb', line 127

def class_config
  @class_config
end

.default_config ⇒ Object

Returns the value of attribute default_config.



# File 'lib/ai_client/configuration.rb', line 127

def default_config
  @default_config
end

Class Method Details

.add_ollama_extensions ⇒ void

This method returns an undefined value.

Initializes Ollama extensions for AiClient.

This sets up the access token and initializes the Ollama client.



# File 'lib/ai_client/ollama_extensions.rb', line 84

def add_ollama_extensions
  access_token = fetch_access_token

  return unless access_token

  configure_ollama(access_token)
  initialize_ollama_client
end

.add_open_router_extensions ⇒ void

This method returns an undefined value.

Initializes OpenRouter extensions for AiClient.

This sets up the access token and initializes the ORC client.



# File 'lib/ai_client/open_router_extensions.rb', line 74

def add_open_router_extensions
  access_token = fetch_access_token

  return unless access_token

  configure_open_router(access_token)
  initialize_orc_client
end

.clear_middlewares ⇒ void

This method returns an undefined value.

Clears all middlewares from the client.



# File 'lib/ai_client/middleware.rb', line 56

def clear_middlewares
  @middlewares = []
end

.configure {|config| ... } ⇒ void

This method returns an undefined value.

Configures the AiClient with a given block.

Yield Parameters:

  • config (AiClient::Config)

    the class-level configuration object to modify

# File 'lib/ai_client/configuration.rb', line 134

def configure(&block)
  yield(class_config)
end
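
A minimal usage sketch. It assumes the yielded Config object exposes a writable context_length and a providers hash, since those are the two settings read elsewhere on this page (#add_context and .ollama_host); other attributes depend on AiClient::Config.

AiClient.configure do |config|
  config.context_length     = 5   # number of past exchanges kept by #add_context
  config.providers[:ollama] = { host: 'http://localhost:11434' }  # read by .ollama_host
end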

.middlewares ⇒ Array

Returns the list of middlewares applied to the client.

Returns:

  • (Array)

    list of middlewares



# File 'lib/ai_client/middleware.rb', line 38

def middlewares
  @middlewares ||= []
end

.model_details(model_id) ⇒ AiClient::LLM?

Retrieves details for a specific model.

Parameters:

  • model_id (String)

    The model ID to retrieve details for, as a lowercase “provider/model” string.

Returns:

  • (AiClient::LLM, nil)

    Details of the model or nil if not found.



# File 'lib/ai_client/ollama_extensions.rb', line 63

def model_details(model_id) = LLM.find(model_id.downcase)

.models(substring = nil) ⇒ Array<String>

Retrieves model names, optionally filtered by a substring (for example, a provider name).

Parameters:

  • substring (String, nil) (defaults to: nil)

    Optional substring to filter models by.

Returns:

  • (Array<String>)

    List of model names.



# File 'lib/ai_client/ollama_extensions.rb', line 56

def models(substring = nil) = LLM.models(substring)
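
For example (the model IDs shown are illustrative; .model_details expects the lowercase “provider/model” form described above):

AiClient.providers                       #=> e.g. [:openai, :anthropic, :ollama, ...]
AiClient.models('gpt')                   # model names containing "gpt"
AiClient.model_details('openai/gpt-4o')  # an AiClient::LLM record, or nil if unknown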

.ollama_available_models(host = nil) ⇒ Array<Hash>

Retrieves the available models from the Ollama server.

Parameters:

  • host (String) (defaults to: nil)

    Optional host URL for the Ollama server. Defaults to the configured host or localhost:11434 if not specified.

Returns:

  • (Array<Hash>)

    List of available models with their details.



# File 'lib/ai_client/ollama_extensions.rb', line 109

def ollama_available_models(host = nil)
  host ||= ollama_host

  uri = URI("#{host}/api/tags")
  response = Net::HTTP.get_response(uri)

  if response.is_a?(Net::HTTPSuccess)
    JSON.parse(response.body)["models"] rescue []
  else
    []
  end
end

.ollama_client ⇒ Ollama::Client

Retrieves the Ollama client instance.

Returns:

  • (Ollama::Client)

    Instance of the Ollama client.



# File 'lib/ai_client/ollama_extensions.rb', line 98

def ollama_client
  @ollama_client ||= initialize_ollama_client
end

.ollama_host ⇒ String

Gets the configured Ollama host URL

Returns:

  • (String)

    The configured Ollama host URL



# File 'lib/ai_client/ollama_extensions.rb', line 125

def ollama_host
  class_config.providers[:ollama]&.dig(:host) || 'http://localhost:11434'
end

.ollama_model_exists?(model_name, host = nil) ⇒ Boolean

Checks if a specific model exists on the Ollama server.

Parameters:

  • model_name (String)

    The name of the model to check.

  • host (String) (defaults to: nil)

    Optional host URL for the Ollama server. Defaults to the configured host or localhost:11434 if not specified.

Returns:

  • (Boolean)

    True if the model exists, false otherwise.



# File 'lib/ai_client/ollama_extensions.rb', line 136

def ollama_model_exists?(model_name, host = nil)
  models = ollama_available_models(host)
  models.any? { |m| m['name'] == model_name }
end
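
A pre-flight check before targeting a local model might look like this (the model name is illustrative):

if AiClient.ollama_model_exists?('llama3.1:8b')
  puts "model available on #{AiClient.ollama_host}"
else
  puts 'model not found; pull it with the Ollama CLI first'
end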

.orc_client ⇒ OpenRouter::Client

Retrieves the ORC client instance.

Returns:

  • (OpenRouter::Client)

    Instance of the OpenRouter client.



# File 'lib/ai_client/open_router_extensions.rb', line 89

def orc_client
  @orc_client ||= add_open_router_extensions || raise("OpenRouter extensions are not available")
end

.providers ⇒ Array<Symbol>

Retrieves all available providers.

Returns:

  • (Array<Symbol>)

    List of all provider names.



# File 'lib/ai_client/ollama_extensions.rb', line 49

def providers = LLM.providers

.reset_default_config ⇒ void

This method returns an undefined value.

Resets the default configuration to the value defined in the class.



# File 'lib/ai_client/configuration.rb', line 142

def reset_default_config
  initialize_defaults
    .save(Config::DEFAULT_CONFIG_FILEPATH)      
end

.use(middleware) ⇒ void

This method returns an undefined value.

Adds a middleware to the stack.

Parameters:

  • middleware (Proc)

    the middleware to be added



# File 'lib/ai_client/middleware.rb', line 48

def use(middleware)
  middlewares << middleware
end
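
A middleware is any callable invoked as middleware.call(client, next_middleware, *args, **kwargs) (see #call_with_middlewares below); calling next_middleware.call runs the rest of the stack. A minimal timing middleware, as a sketch:

timing = lambda do |_client, next_middleware, *_args, **_kwargs|
  started = Time.now
  result  = next_middleware.call   # run the wrapped method (and any inner middlewares)
  warn "AiClient call took #{Time.now - started} seconds"
  result
end

AiClient.use(timing)
AiClient.middlewares.size   #=> 1
AiClient.clear_middlewares  # remove it again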

.version ⇒ Object



# File 'lib/ai_client/version.rb', line 7

def self.version  = VERSION

Instance Method Details

#add_context(my_prompt) ⇒ String+

Adds context to the current prompt.

Parameters:

  • my_prompt (String, Array<String>)

    the current prompt.

Returns:

  • (String, Array<String>)

    the prompt with context added.



# File 'lib/ai_client/chat.rb', line 59

def add_context(my_prompt)
  return(my_prompt)   if  @config.context_length.nil? || 
                          0 == @config.context_length ||
                          my_prompt.is_a?(Array)      || 
                          @context.empty?

  prompt = "\nUser: #{my_prompt} Bot: "
  prompt.prepend(
    @context.map{|entry|
      "User: #{entry[:user]} Bot: #{entry[:bot]}"
    }.join("\n")
  )

  prompt
end
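
For illustration, with a non-zero context_length and one stored exchange, the returned prompt is the stored exchanges followed by the new prompt:

# Given @context == [{ user: 'Hi', bot: 'Hello!' }]:
#
#   add_context('How are you?')
#   #=> "User: Hi Bot: Hello!\nUser: How are you? Bot: "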

#batch_embed(inputs, batch_size: 100, **params) ⇒ Object



# File 'lib/ai_client/embed.rb', line 14

def batch_embed(inputs, batch_size: 100, **params)
  inputs.each_slice(batch_size).flat_map do |batch|
    sleep 1 # DEBUG rate limits being exceeded
    embed(batch, **params)
  end
end
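
Splits inputs into slices of batch_size, embeds each slice via #embed (sleeping between slices as a crude rate-limit guard), and flattens the results. A usage sketch, assuming client is an AiClient instance built for an embedding-capable model:

texts      = ['first document', 'second document', 'third document']
embeddings = client.batch_embed(texts, batch_size: 2)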

#call_with_middlewares(method, *args, **kwargs, &block) ⇒ Object

Calls the specified method with middlewares applied.

Parameters:

  • method (Symbol)

    the name of the method to be called

  • args (Array)

    additional arguments for the method

  • kwargs (Hash)

    named parameters for the method

  • block (Proc)

    optional block to be passed to the method

Returns:

  • (Object)

    result of the method call after applying middlewares



# File 'lib/ai_client/middleware.rb', line 24

def call_with_middlewares(method, *args, **kwargs, &block)
  stack = self.class.middlewares.reverse.reduce(-> { send(method, *args, **kwargs, &block) }) do |next_middleware, middleware|
    -> { middleware.call(self, next_middleware, *args, **kwargs, &block) }
  end
  stack.call
end

#chat(messages = '', **params, &block) ⇒ String

OmniAI Params

model:        @model    [String] optional
format:       @format   [Symbol] optional :text or :json
stream:       @stream   [Proc, nil] optional
tools:        @tools    [Array<OmniAI::Tool>] optional
temperature:  @temperature  [Float, nil] optional

Initiates a chat session.

Parameters:

  • messages (String, Array<String>) (defaults to: '')

    the messages to send.

  • params (Hash)

    optional parameters.

Options Hash (**params):

  • :tools (Array<OmniAI::Tool>)

    an array of tools to use.

Returns:

  • (String)

    the result from the chat.

Raises:

  • (RuntimeError)

    if tools parameter is invalid.



# File 'lib/ai_client/chat.rb', line 22

def chat(messages='', **params, &block)    
  if params.has_key? :tools
    tools = params[:tools]
    if tools.is_a? Array
      tools.map!{|function_name| AiClient::Function.registry[function_name]}
    elsif true == tools
      tools = AiClient::Function.registry.values
    else
      raise 'what is this'
    end
    params[:tools] = tools
  end

  @last_messages  = messages
  messages        = add_context(messages)
  result          = call_with_middlewares(
                      :chat_without_middlewares, 
                      messages, 
                      **params,
                      &block
                    )
  @last_response  = result
  result          = raw? ? result : content

  @context  = @context.push({
                user: @last_messages,
                bot:  result
              }).last(config.context_length)

  result
end
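
A usage sketch. The constructor call and model name are illustrative, since #initialize is not documented on this page; tools: true offers every registered AiClient::Function, as shown in the source above.

client = AiClient.new('gpt-4o-mini')            # illustrative constructor / model name
client.chat('Say hello in French.')             # plain text reply
client.chat('Summarize the context so far.', format: :json)
client.chat('What is the weather in Paris?', tools: true)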

#chat_without_middlewares(messages, **params, &block) ⇒ String

Chats with the client without middleware processing.

Parameters:

  • messages (Array<String>)

    the messages to send.

  • params (Hash)

    optional parameters.

Returns:

  • (String)

    the result from the chat.



# File 'lib/ai_client/chat.rb', line 90

def chat_without_middlewares(messages, **params, &block)
  @client.chat(messages, model: @model, **params, &block)
end

#chatbot(prompt = 'hello') ⇒ Object

Just an example …



# File 'lib/ai_client/chat.rb', line 96

def chatbot(prompt='hello')
  until prompt.empty? do
    response = chat prompt
    print "\n\n#{content}\n\nFollow Up: "
    prompt = gets.chomp
  end
end

#clear_context ⇒ void

This method returns an undefined value.

Clears the current context.



# File 'lib/ai_client/chat.rb', line 80

def clear_context
  @context = []
end

#embed(input, **params) ⇒ Object

OmniAI Params

model [String] required


# File 'lib/ai_client/embed.rb', line 10

def embed(input, **params)
  @client.embed(input, model: @model, **params)
end

#model_details ⇒ AiClient::LLM?

Retrieves details for the current model.

Returns:

  • (AiClient::LLM, nil)

    Details of the current model or nil if not found.



# File 'lib/ai_client/ollama_extensions.rb', line 33

def model_details
  id = "#{@provider}/#{@model}"
  LLM.find(id.downcase)
end

#models ⇒ Array<String>

Retrieves model names for the current provider.

Returns:

  • (Array<String>)

    List of model names for the current provider.



# File 'lib/ai_client/ollama_extensions.rb', line 41

def models = LLM.models(@provider)

#speak(text, **params) {|output| ... } ⇒ Tempfile

OmniAI Params

input   [String] required
model   [String] required
voice   [String] required
speed   [Float] optional
format  [String] optional (default "aac")
  aac mp3 flac opus pcm wav

Yields:

  • (output)

    optional

Returns:

  • (Tempfile)


# File 'lib/ai_client/speak.rb', line 18

def speak(text, **params)
  call_with_middlewares(:speak_without_middlewares, text, **params)
end
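
The result is a Tempfile containing the generated audio. A sketch (the output filename is illustrative; depending on the provider, additional OmniAI parameters such as voice: may be required):

audio = client.speak('Hello from AiClient')
audio.rewind
File.binwrite('hello.aac', audio.read)   # "aac" is the documented default format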

#speak_without_middlewares(text, **params) ⇒ Object



# File 'lib/ai_client/speak.rb', line 22

def speak_without_middlewares(text, **params)
  @client.speak(text, model: @model, **params)
end

#transcribe(audio, format: nil, **params) ⇒ Object

OmniAI Params

model    [String]
language [String, nil] optional
prompt   [String, nil] optional
format   [Symbol] :text, :srt, :vtt, or :json (default)
temperature [Float, nil] optional


# File 'lib/ai_client/transcribe.rb', line 13

def transcribe(audio, format: nil, **params)
  call_with_middlewares(:transcribe_without_middlewares, audio, format: format, **params)
end
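
A sketch (the file path is illustrative; the audio argument is handed straight to the underlying OmniAI client, which determines whether a path or an open file is expected):

text = client.transcribe('meeting.wav', format: :text)
puts text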

#transcribe_without_middlewares(audio, format: nil, **params) ⇒ Object



# File 'lib/ai_client/transcribe.rb', line 17

def transcribe_without_middlewares(audio, format: nil, **params)
  @client.transcribe(audio, model: @model, format: format, **params)
end

#version ⇒ Object



# File 'lib/ai_client/version.rb', line 6

def version       = VERSION