Class: Raif::Agent

Instance Attribute Summary collapse

Instance Method Summary collapse

Methods included from Concerns::HasAvailableModelTools

#available_model_tools_map

Methods included from Concerns::HasRequestedLanguage

#requested_language_name, #system_prompt_language_preference

Methods included from Concerns::HasLlm

#default_llm_model_key, #llm

Methods inherited from ApplicationRecord

table_name_prefix

Instance Attribute Details

#on_conversation_history_entry ⇒ Object

Returns the value of attribute on_conversation_history_entry.



# File 'app/models/raif/agent.rb', line 32

def on_conversation_history_entry
  @on_conversation_history_entry
end

Instance Method Details

#run!(&block) ⇒ Raif::Agent

Runs the agent and returns a Raif::Agent. If a block is given, it will be called each time a new entry is added to the agent's conversation history. The block will receive the Raif::Agent and the new entry as arguments:

  agent = Raif::Agent.new(
    task: task,
    tools: [Raif::ModelTools::WikipediaSearch, Raif::ModelTools::FetchUrl],
    creator: creator
  )

  agent.run! do |conversation_history_entry|
    Turbo::StreamsChannel.broadcast_append_to(
      :my_agent_channel,
      target: "agent-progress",
      partial: "my_partial_displaying_agent_progress",
      locals: { agent: agent, conversation_history_entry: conversation_history_entry }
    )
  end

The conversation_history_entry will be a hash with "role" and "content" keys: { "role" => "assistant", "content" => "a message here" }
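The callback mechanism can be sketched in isolation. The MiniAgent class below is a hypothetical stand-in (not part of Raif) that mirrors the documented behavior: run! stores the given block in on_conversation_history_entry, and the block fires once per new history entry:

```ruby
# Hypothetical stand-in illustrating the documented callback pattern.
# Not the Raif implementation; names mirror the public API for clarity.
class MiniAgent
  attr_accessor :on_conversation_history_entry
  attr_reader :conversation_history

  def initialize
    @conversation_history = []
  end

  # Append an entry and notify the stored callback, if any.
  def add_conversation_history_entry(entry)
    conversation_history << entry
    on_conversation_history_entry&.call(entry)
  end

  def run!(&block)
    self.on_conversation_history_entry = block_given? ? block : nil
    # Entries use the documented { "role" => ..., "content" => ... } shape.
    add_conversation_history_entry({ "role" => "user", "content" => "a task" })
    add_conversation_history_entry({ "role" => "assistant", "content" => "an answer" })
  end
end

roles = []
MiniAgent.new.run! { |entry| roles << entry["role"] }
roles # => ["user", "assistant"]
```

Because the callback runs once per entry, it is a natural hook for streaming progress to the UI, as in the Turbo::StreamsChannel example above.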

Parameters:

  • block (Proc)

    Optional block, called each time a new entry is added to the agent's conversation history

Returns:

  • (Raif::Agent)

# File 'app/models/raif/agent.rb', line 57

def run!(&block)
  self.on_conversation_history_entry = block_given? ? block : nil
  self.started_at = Time.current
  save!

  logger.debug <<~DEBUG
    --------------------------------
    Starting Agent Run
    --------------------------------
    System Prompt:
    #{system_prompt}

    Task: #{task}
  DEBUG

  add_conversation_history_entry({ role: "user", content: task })

  while iteration_count < max_iterations
    update_columns(iteration_count: iteration_count + 1)

    model_completion = llm.chat(
      messages: conversation_history,
      source: self,
      system_prompt: system_prompt,
      available_model_tools: native_model_tools
    )

    logger.debug <<~DEBUG
      --------------------------------
      Agent iteration #{iteration_count}
      Messages:
      #{JSON.pretty_generate(conversation_history)}

      Response:
      #{model_completion.raw_response}
      --------------------------------
    DEBUG

    process_iteration_model_completion(model_completion)
    break if final_answer.present?
  end

  completed!
  final_answer
rescue StandardError => e
  self.failed_at = Time.current
  self.failure_reason = e.message
  save!

  raise
end
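The rescue clause at the end of run! records the failure before re-raising, so the caller sees the original exception while the record keeps a durable audit trail. A minimal self-contained sketch of that pattern (MiniRun is hypothetical; Raif additionally persists via save! and uses Rails' Time.current):

```ruby
# Hypothetical sketch of run!'s failure-handling pattern: on error, stamp
# failed_at and failure_reason, then re-raise so callers can still rescue.
class MiniRun
  attr_accessor :failed_at, :failure_reason

  def run!
    yield
  rescue StandardError => e
    self.failed_at = Time.now
    self.failure_reason = e.message
    raise
  end
end

runner = MiniRun.new
begin
  runner.run! { raise "LLM request timed out" }
rescue StandardError
  # The original exception propagates; the failure is already recorded.
end
runner.failure_reason # => "LLM request timed out"
```

Re-raising after recording is deliberate: callers can still wrap run! in their own error handling, while the agent record explains what went wrong even after the process moves on.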