Class: Durable::Llm::Providers::Anthropic

Inherits:
Base
  • Object
Defined in:
lib/durable/llm/providers/anthropic.rb

Overview

Anthropic provider for accessing Claude language models through their API.

This provider implements the Durable::Llm::Providers::Base interface to provide completion and streaming capabilities for Anthropic’s Claude models including Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku. It handles authentication via API keys, supports system messages, and provides comprehensive error handling for various Anthropic API error conditions.

Key features:

  • Message-based chat completions with multi-turn conversations

  • Real-time streaming responses for interactive applications

  • System message support for setting context

  • Automatic model listing from predefined supported models

  • Comprehensive error handling with specific exception types

Examples:

Basic completion

provider = Durable::Llm::Providers::Anthropic.new(api_key: 'your-api-key')
response = provider.completion(
  model: 'claude-3-5-sonnet-20240620',
  messages: [{ role: 'user', content: 'Hello, world!' }]
)
puts response.choices.first.to_s

Completion with system message

response = provider.completion(
  model: 'claude-3-5-sonnet-20240620',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' }
  ]
)

Streaming response

provider.stream(model: 'claude-3-5-sonnet-20240620', messages: messages) do |chunk|
  print chunk.to_s
end

Defined Under Namespace

Classes: AnthropicChoice, AnthropicMessage, AnthropicResponse, AnthropicStreamChoice, AnthropicStreamDelta, AnthropicStreamResponse

Constant Summary

BASE_URL =
'https://api.anthropic.com'

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Methods inherited from Base

options, #stream?

Constructor Details

#initialize(api_key: nil) ⇒ Anthropic

Initializes a new Anthropic provider instance.

Parameters:

  • api_key (String, nil) (defaults to: nil)

    The Anthropic API key. If nil, uses default_api_key.



# File 'lib/durable/llm/providers/anthropic.rb', line 68

def initialize(api_key: nil)
  super()
  @api_key = api_key || default_api_key

  @conn = Faraday.new(url: BASE_URL) do |faraday|
    faraday.request :json
    faraday.response :json
    faraday.adapter Faraday.default_adapter
  end
end
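
As a sketch of typical construction (the environment-variable fallback is described under #default_api_key below):

# Explicit API key
provider = Durable::Llm::Providers::Anthropic.new(api_key: 'your-api-key')

# Or rely on the default key resolution (library configuration or ANTHROPIC_API_KEY)
ENV['ANTHROPIC_API_KEY'] = 'your-api-key'
provider = Durable::Llm::Providers::Anthropic.new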

Instance Attribute Details

#api_key ⇒ String?

Returns The API key used for authentication with Anthropic.

Returns:

  • (String, nil)

    The API key used for authentication with Anthropic



# File 'lib/durable/llm/providers/anthropic.rb', line 62

def api_key
  @api_key
end

Class Method Details

.models ⇒ Array<String>

Retrieves the list of supported Claude models.

Returns:

  • (Array<String>)

    Array of supported Claude model identifiers



# File 'lib/durable/llm/providers/anthropic.rb', line 129

def self.models
  ['claude-3-5-sonnet-20240620', 'claude-3-opus-20240229', 'claude-3-haiku-20240307']
end
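
For example, the list can be queried without constructing a provider:

Durable::Llm::Providers::Anthropic.models
# => ["claude-3-5-sonnet-20240620", "claude-3-opus-20240229", "claude-3-haiku-20240307"]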

.stream? ⇒ Boolean

Returns true, indicating this provider supports streaming.

Returns:

  • (Boolean)

    true, indicating this provider supports streaming



# File 'lib/durable/llm/providers/anthropic.rb', line 134

def self.stream?
  true
end

Instance Method Details

#completion(options) ⇒ AnthropicResponse

Performs a completion request to Anthropic’s messages API.

Parameters:

  • options (Hash)

    The completion options

Options Hash (options):

  • :model (String)

    The Claude model to use

  • :messages (Array<Hash>)

    Array of message objects with role and content

  • :max_tokens (Integer)

    Maximum number of tokens to generate (default: 1024)

  • :temperature (Float)

    Sampling temperature between 0 and 1

  • :system (String)

    System message to set context

Returns:

  • (AnthropicResponse)

    The parsed response from Anthropic's messages API

Raises:



# File 'lib/durable/llm/providers/anthropic.rb', line 92

def completion(options)
  # Convert symbol keys to strings for consistency
  options = options.transform_keys(&:to_s)

  # Ensure max_tokens is set
  options['max_tokens'] ||= 1024

  # Handle system message separately as Anthropic expects it as a top-level parameter
  system_message = nil
  messages = options['messages']&.dup || []
  if messages.first && (messages.first['role'] || messages.first[:role]) == 'system'
    system_message = messages.first['content'] || messages.first[:content]
    messages = messages[1..] || []
  end

  request_body = options.merge('messages' => messages)
  request_body['system'] = system_message if system_message

  response = @conn.post('/v1/messages') do |req|
    req.headers['x-api-key'] = @api_key
    req.headers['anthropic-version'] = '2023-06-01'
    req.body = request_body
  end

  handle_response(response)
end
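
A sketch of a call combining these options; as the method body above shows, a leading system message is lifted out of messages and sent as Anthropic's top-level system parameter:

response = provider.completion(
  model: 'claude-3-5-sonnet-20240620',
  max_tokens: 512,
  temperature: 0.2,
  messages: [
    { role: 'system', content: 'Answer in one short sentence.' },
    { role: 'user', content: 'What is Ruby?' }
  ]
)
puts response.choices.first.to_s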

#default_api_key ⇒ String?

Returns The default API key for Anthropic, or nil if not configured.

Returns:

  • (String, nil)

    The default API key for Anthropic, or nil if not configured



# File 'lib/durable/llm/providers/anthropic.rb', line 56

def default_api_key
  Durable::Llm.configuration.anthropic&.api_key || ENV['ANTHROPIC_API_KEY']
end
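
For example, relying on the environment-variable fallback (a sketch; the configuration lookup shown in the method body takes precedence when set):

ENV['ANTHROPIC_API_KEY'] = 'your-api-key'
provider = Durable::Llm::Providers::Anthropic.new
provider.api_key # => "your-api-key"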

#embedding(model:, input:, **options) ⇒ Object

Performs an embedding request (not supported by Anthropic).

Parameters:

  • model (String)

    The model to use for generating embeddings

  • input (String, Array<String>)

    The input text(s) to embed

  • options (Hash)

    Additional options for the embedding request

Raises:

  • (NotImplementedError)

    Anthropic does not provide embedding APIs



# File 'lib/durable/llm/providers/anthropic.rb', line 144

def embedding(model:, input:, **options)
  raise NotImplementedError, 'Anthropic does not provide embedding APIs'
end
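
Callers probing provider capabilities at runtime can rescue the error, as in this sketch:

begin
  provider.embedding(model: 'claude-3-5-sonnet-20240620', input: 'Hello, world!')
rescue NotImplementedError => e
  warn e.message # "Anthropic does not provide embedding APIs"
end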

#models ⇒ Array<String>

Retrieves the list of available models for this provider instance.

Returns:

  • (Array<String>)

    The list of available Claude model names



# File 'lib/durable/llm/providers/anthropic.rb', line 122

def models
  self.class.models
end

#stream(options) {|AnthropicStreamResponse| ... } ⇒ Object

Performs a streaming completion request to Anthropic’s messages API.

Parameters:

  • options (Hash)

    The stream options (same as completion plus stream: true)

Yields:

  • (AnthropicStreamResponse)

    Each parsed chunk of the streaming response

Returns:

  • (Object)

    The final response object

Raises:



# File 'lib/durable/llm/providers/anthropic.rb', line 157

def stream(options)
  options = options.transform_keys(&:to_s)
  options['stream'] = true

  # Handle system message separately
  system_message = nil
  messages = options['messages']&.dup || []
  if messages.first && (messages.first['role'] || messages.first[:role]) == 'system'
    system_message = messages.first['content'] || messages.first[:content]
    messages = messages[1..] || []
  end

  request_body = options.merge('messages' => messages)
  request_body['system'] = system_message if system_message

  response = @conn.post('/v1/messages') do |req|
    req.headers['x-api-key'] = @api_key
    req.headers['anthropic-version'] = '2023-06-01'
    req.headers['Accept'] = 'text/event-stream'

    req.body = request_body

    user_proc = proc do |chunk, _size, _total|
      yield AnthropicStreamResponse.new(chunk)
    end

    req.options.on_data = to_json_stream(user_proc: user_proc)
  end

  handle_response(response)
end
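
A sketch of accumulating streamed chunks into a full reply, following the streaming example in the overview:

buffer = +''
provider.stream(
  model: 'claude-3-5-sonnet-20240620',
  messages: [{ role: 'user', content: 'Write a haiku about Ruby.' }]
) do |chunk|
  print chunk.to_s      # render incrementally
  buffer << chunk.to_s  # and keep the full text
end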