Class: LlmOrchestrator::Anthropic
- Defined in:
- lib/llm_orchestrator/llm.rb
Overview
Anthropic LLM provider implementation. Handles interactions with Anthropic’s Claude models.
Instance Method Summary
- #generate(prompt, context: nil, **options) ⇒ Object
rubocop:disable Metrics/MethodLength.
- #initialize(api_key: nil, model: nil, temperature: nil, max_tokens: nil) ⇒ Anthropic
constructor
A new instance of Anthropic.
Constructor Details
#initialize(api_key: nil, model: nil, temperature: nil, max_tokens: nil) ⇒ Anthropic
# File 'lib/llm_orchestrator/llm.rb', line 56

def initialize(api_key: nil, model: nil, temperature: nil, max_tokens: nil)
  super
  @api_key ||= LlmOrchestrator.configuration.claude.api_key
  @client = ::Anthropic::Client.new(access_token: @api_key)
  @model = model || LlmOrchestrator.configuration.claude.model
  @temperature = temperature || LlmOrchestrator.configuration.claude.temperature
  @max_tokens = max_tokens || LlmOrchestrator.configuration.claude.max_tokens
end
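The constructor above lets explicit keyword arguments take precedence, with nil values falling back to the global configuration. A minimal, self-contained sketch of that fallback chain (the Config struct and GLOBAL constant are hypothetical stand-ins for LlmOrchestrator.configuration.claude):

```ruby
# Hypothetical stand-in for the gem's global configuration object.
Config = Struct.new(:api_key, :model, :temperature, :max_tokens)
GLOBAL = Config.new("sk-global", "claude-3-haiku", 0.3, 512)

class Provider
  attr_reader :api_key, :model

  def initialize(api_key: nil, model: nil)
    # Explicit arguments win; nil falls back to the global configuration,
    # mirroring the `||` / `||=` pattern in Anthropic#initialize.
    @api_key = api_key || GLOBAL.api_key
    @model   = model   || GLOBAL.model
  end
end

provider = Provider.new(model: "claude-3-opus")
puts provider.api_key  # falls back to the configured key: "sk-global"
puts provider.model    # explicit argument wins: "claude-3-opus"
```

Because the fallback uses `||`, passing an explicit `false` or `nil` also triggers the configured default; only truthy arguments override it.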
Instance Method Details
#generate(prompt, context: nil, **options) ⇒ Object
rubocop:disable Metrics/MethodLength
# File 'lib/llm_orchestrator/llm.rb', line 66

def generate(prompt, context: nil, **options)
  response = @client.messages(
    parameters: {
      model: options[:model] || @model,
      system: context,
      messages: [
        { role: "user", content: prompt }
      ],
      temperature: options[:temperature] || @temperature,
      max_tokens: options[:max_tokens] || @max_tokens
    }
  )
  response.content.first.text
end
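The method builds a parameters hash in which per-call options override the instance defaults set in the constructor. A runnable sketch of just that payload-building step (build_parameters is a hypothetical helper; no API call is made):

```ruby
# Hypothetical helper mirroring how #generate assembles its request payload:
# per-call options take precedence, instance defaults fill the gaps.
def build_parameters(prompt, context: nil, defaults:, **options)
  {
    model: options[:model] || defaults[:model],
    system: context,
    messages: [
      { role: "user", content: prompt }
    ],
    temperature: options[:temperature] || defaults[:temperature],
    max_tokens: options[:max_tokens] || defaults[:max_tokens]
  }
end

params = build_parameters(
  "Summarize this paragraph.",
  context: "You are a concise assistant.",
  defaults: { model: "claude-3-haiku", temperature: 0.5, max_tokens: 256 },
  temperature: 0.9
)
puts params[:temperature]  # per-call override wins: 0.9
puts params[:model]        # falls back to the default: "claude-3-haiku"
```

Note that context maps to the top-level system parameter rather than a message with a system role, matching Anthropic's Messages API shape shown in the source above.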