Module: Raix::PromptDeclarations

Extended by:
ActiveSupport::Concern
Defined in:
lib/raix/prompt_declarations.rb

Overview

The PromptDeclarations module provides a way to chain prompts and handle user responses in a serialized manner, with support for functions if the FunctionDispatch module is also included.
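The serialized chaining behavior can be sketched in plain Ruby. This is an illustrative simulation, not the gem's API: the `ChainRunner` class and the block standing in for the model call are invented for this example.

```ruby
# Illustrative sketch of serialized prompt chaining (not the gem's code).
# Each prompt runs in order; the previous response is available to the next
# prompt's text lambda, mirroring how last_response feeds the chain.
Prompt = Struct.new(:text, keyword_init: true)

class ChainRunner
  attr_reader :last_response

  def initialize(prompts, &model)
    @prompts = prompts
    @model = model # stands in for the LLM call
  end

  def run
    @prompts.each do |prompt|
      input = prompt.text.respond_to?(:call) ? prompt.text.call(last_response) : prompt.text
      @last_response = @model.call(input)
    end
    last_response
  end
end

runner = ChainRunner.new([
  Prompt.new(text: "summarize"),
  Prompt.new(text: ->(prev) { "refine: #{prev}" })
]) { |input| "[#{input}]" }

runner.run # => "[refine: [summarize]]"
```

The point of the sketch is the serialization: prompt two never runs until prompt one's response exists.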

Defined Under Namespace

Modules: ClassMethods

Constant Summary

MAX_LOOP_COUNT = 5

Instance Attribute Summary

Instance Method Summary

Instance Attribute Details

#current_prompt ⇒ Object (readonly)

Returns the prompt declaration currently being executed.



# File 'lib/raix/prompt_declarations.rb', line 42

def current_prompt
  @current_prompt
end

#last_response ⇒ Object (readonly)

Returns the response from the most recently executed prompt.



# File 'lib/raix/prompt_declarations.rb', line 42

def last_response
  @last_response
end

Instance Method Details

#chat_completion(prompt = nil, params: {}, raw: false, openai: false) ⇒ Object

Executes the chat completion process based on the class-level declared prompts. The response to each prompt is added to the transcript automatically and returned.

Raises an error if no prompts are defined.

Uses the system prompt in the following order of priority:

 - system lambda specified in the prompt declaration
 - system_prompt instance method if defined
 - system_prompt class-level declaration if defined
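The priority order above can be sketched as a simple fallback chain (hypothetical helper, not part of the gem):

```ruby
# Illustrative resolution of the system prompt: a prompt-level lambda wins,
# then an instance-level value, then the class-level default.
def resolve_system_prompt(prompt_lambda, instance_value, class_value)
  return prompt_lambda.call if prompt_lambda

  instance_value || class_value
end

resolve_system_prompt(-> { "from lambda" }, "from instance", "from class") # => "from lambda"
resolve_system_prompt(nil, "from instance", "from class") # => "from instance"
resolve_system_prompt(nil, nil, "from class") # => "from class"
```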

Prompts require a text value (lambda, string, or method name symbol) to be defined at minimum.
TODO: shortcut syntax passes just a string prompt if no other options are needed.

TODO: SHOULD NOT HAVE A DIFFERENT INTERFACE THAN PARENT

Parameters:

  • prompt (String) (defaults to: nil)

    The prompt to use for the chat completion.

  • params (Hash) (defaults to: {})

    Parameters for the chat completion.

  • raw (Boolean) (defaults to: false)

    Whether to return the raw response.

Raises:

  • (RuntimeError)

    If no prompts are defined.



# File 'lib/raix/prompt_declarations.rb', line 66

def chat_completion(prompt = nil, params: {}, raw: false, openai: false)
  raise "No prompts defined" unless self.class.prompts.present?

  loop_count = 0

  current_prompts = self.class.prompts.clone

  while (@current_prompt = current_prompts.shift)
    next if @current_prompt.if.present? && !instance_exec(&@current_prompt.if)
    next if @current_prompt.unless.present? && instance_exec(&@current_prompt.unless)

    input = case current_prompt.text
            when Proc
              instance_exec(&current_prompt.text)
            when String
              current_prompt.text
            when Symbol
              send(current_prompt.text)
            else
              last_response.presence || prompt
            end

    if current_prompt.call.present?
      current_prompt.call.new(self).call(input).tap do |response|
        if response.present?
          transcript << { assistant: response }
          @last_response = send(current_prompt.name, response)
        end
      end
    else
      __system_prompt = instance_exec(&current_prompt.system) if current_prompt.system.present? # rubocop:disable Lint/UnderscorePrefixedVariableName
      __system_prompt ||= system_prompt if respond_to?(:system_prompt)
      __system_prompt ||= self.class.system_prompt.presence
      transcript << { system: __system_prompt } if __system_prompt
      transcript << { user: input } # text is required at minimum; falls back to last_response/prompt

      params = current_prompt.params.merge(params)

      # set the stream if necessary
      self.stream = instance_exec(&current_prompt.stream) if current_prompt.stream.present?

      execute_ai_request(params:, raw:, openai:, transcript:, loop_count:)
    end

    next unless current_prompt.until.present? && !instance_exec(&current_prompt.until)

    if loop_count >= MAX_LOOP_COUNT
      warn "Max loop count reached in chat_completion. Forcing return."

      return last_response
    else
      current_prompts.unshift(@current_prompt) # put it back at the front
      loop_count += 1
    end
  end

  last_response
end
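The until/loop guard at the end of the method can be sketched independently: an unsatisfied until condition puts the prompt back at the front of the queue, bounded by MAX_LOOP_COUNT. This is an illustrative stand-in, not the gem's code; the block simulates the AI request.

```ruby
MAX_LOOP_COUNT = 5

# Re-run a step until the condition holds, bounded by MAX_LOOP_COUNT,
# mirroring how chat_completion re-queues a prompt whose :until is unsatisfied.
def run_until(condition)
  loop_count = 0
  queue = [:step]
  result = nil
  while (current = queue.shift)
    result = yield(loop_count) # stands in for the AI request
    next if condition.call(result) # until satisfied: move on

    if loop_count >= MAX_LOOP_COUNT
      warn "Max loop count reached. Forcing return."
      return result
    else
      queue.unshift(current) # put it back at the front
      loop_count += 1
    end
  end
  result
end

run_until(->(r) { r >= 3 }) { |i| i } # returns 3 (succeeds on the fourth pass)
```

Note the guard fires on the loop counter, not the condition, so a condition that never holds still terminates.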

#execute_ai_request(params:, raw:, openai:, transcript:, loop_count:) ⇒ Object



# File 'lib/raix/prompt_declarations.rb', line 125

def execute_ai_request(params:, raw:, openai:, transcript:, loop_count:)
  chat_completion_from_superclass(params:, raw:, openai:).then do |response|
    transcript << { assistant: response }
    @last_response = send(current_prompt.name, response)
    self.stream = nil # clear it again so it's not used for the next prompt
  end
rescue StandardError => e
  # Bubbles the error up the stack if no loops remain
  raise e if loop_count >= MAX_LOOP_COUNT

  sleep 1 # Wait before continuing
end

#max_tokens ⇒ Integer

Returns the max_tokens parameter of the current prompt or the default max_tokens.

Returns:

  • (Integer)

    The max_tokens parameter of the current prompt or the default max_tokens.



# File 'lib/raix/prompt_declarations.rb', line 155

def max_tokens
  @current_prompt.params[:max_tokens] || super
end

#model ⇒ Object

Returns the model parameter of the current prompt or the default model.

Returns:

  • (Object)

    The model parameter of the current prompt or the default model.



# File 'lib/raix/prompt_declarations.rb', line 141

def model
  @current_prompt.params[:model] || super
end

#temperature ⇒ Float

Returns the temperature parameter of the current prompt or the default temperature.

Returns:

  • (Float)

    The temperature parameter of the current prompt or the default temperature.



# File 'lib/raix/prompt_declarations.rb', line 148

def temperature
  @current_prompt.params[:temperature] || super
end
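The override pattern shared by #model, #temperature, and #max_tokens can be sketched in plain Ruby: a per-prompt value wins, otherwise super falls through to the default. The `Defaults` module and `Chain` class below are invented for illustration.

```ruby
# Default values, as a chat completion module might define them.
module Defaults
  def temperature
    0.7
  end
end

class Chain
  include Defaults

  attr_accessor :current_prompt_params

  # Per-prompt override falls back to the module default via super,
  # mirroring the pattern in Raix::PromptDeclarations#temperature.
  def temperature
    current_prompt_params&.dig(:temperature) || super
  end
end

chain = Chain.new
chain.temperature # => 0.7
chain.current_prompt_params = { temperature: 0.2 }
chain.temperature # => 0.2
```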