Class: Durable::Llm::Providers::OpenAI

Inherits:
Base
  • Object
Defined in:
lib/durable/llm/providers/openai.rb

Overview

OpenAI provider for accessing OpenAI’s language models through their API.

This provider implements the Durable::Llm::Providers::Base interface to provide completion, embedding, and streaming capabilities for OpenAI’s models including GPT-3.5, GPT-4, and their variants. It handles authentication via API keys, supports organization-based access, and provides comprehensive error handling for various OpenAI API error conditions.

Key features:

  • Chat completions with support for multi-turn conversations

  • Text embeddings for semantic similarity and retrieval tasks

  • Real-time streaming responses for interactive applications

  • Automatic model listing from OpenAI’s API

  • Organization support for enterprise accounts

  • Comprehensive error handling with specific exception types

Examples:

Basic completion

provider = Durable::Llm::Providers::OpenAI.new(api_key: 'your-api-key')
response = provider.completion(
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Hello, world!' }]
)
puts response.choices.first.to_s

Streaming response

provider.stream(model: 'gpt-4', messages: messages) do |chunk|
  print chunk.to_s
end

Text embedding

embedding = provider.embedding(
  model: 'text-embedding-ada-002',
  input: 'Some text to embed'
)


Defined Under Namespace

Classes: OpenAIChoice, OpenAIEmbeddingResponse, OpenAIMessage, OpenAIResponse, OpenAIStreamChoice, OpenAIStreamDelta, OpenAIStreamResponse

Constant Summary

BASE_URL =
'https://api.openai.com/v1'

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Methods inherited from Base

models, options, #stream?

Constructor Details

#initialize(api_key: nil, organization: nil) ⇒ OpenAI

Initializes a new OpenAI provider instance.

Parameters:

  • api_key (String, nil) (defaults to: nil)

    The OpenAI API key. If nil, uses default_api_key

  • organization (String, nil) (defaults to: nil)

The OpenAI organization ID. If nil, uses ENV['OPENAI_ORGANIZATION']



# File 'lib/durable/llm/providers/openai.rb', line 72

def initialize(api_key: nil, organization: nil)
  super(api_key: api_key)
  @organization = organization || ENV['OPENAI_ORGANIZATION']
  @conn = Faraday.new(url: BASE_URL) do |faraday|
    faraday.request :json
    faraday.response :json
    faraday.adapter Faraday.default_adapter
  end
end
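The organization fallback in the constructor (`organization || ENV['OPENAI_ORGANIZATION']`) can be sketched in isolation; the helper name below is only for illustration:

```ruby
# Sketch of the organization fallback used in #initialize:
# an explicit argument wins; otherwise OPENAI_ORGANIZATION is consulted.
def resolve_organization(explicit)
  explicit || ENV['OPENAI_ORGANIZATION']
end

ENV['OPENAI_ORGANIZATION'] = 'org-from-env'
resolve_organization('org-explicit') # => "org-explicit"
resolve_organization(nil)            # => "org-from-env"
```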

Instance Attribute Details

#api_key ⇒ String?

Returns The API key used for authentication with OpenAI.

Returns:

  • (String, nil)

    The API key used for authentication with OpenAI



# File 'lib/durable/llm/providers/openai.rb', line 65

def api_key
  @api_key
end

#organization ⇒ String?

Returns The OpenAI organization ID for enterprise accounts.

Returns:

  • (String, nil)

    The OpenAI organization ID for enterprise accounts



# File 'lib/durable/llm/providers/openai.rb', line 65

attr_accessor :api_key, :organization

Class Method Details

.stream? ⇒ Boolean

Returns True, indicating this provider supports streaming.

Returns:

  • (Boolean)

    True, indicating this provider supports streaming



# File 'lib/durable/llm/providers/openai.rb', line 141

def self.stream?
  true
end

Instance Method Details

#completion(options) ⇒ OpenAIResponse

Performs a chat completion request to OpenAI’s API.

Parameters:

  • options (Hash)

    The completion options

Options Hash (options):

  • :model (String)

The model to use (e.g., 'gpt-3.5-turbo', 'gpt-4')

  • :messages (Array<Hash>)

    Array of message objects with role and content

  • :temperature (Float)

    Sampling temperature between 0 and 2

  • :max_tokens (Integer)

    Maximum number of tokens to generate

  • :top_p (Float)

    Nucleus sampling parameter

Returns:

  • (OpenAIResponse)

    The parsed completion response

Raises:



# File 'lib/durable/llm/providers/openai.rb', line 95

def completion(options)
  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
    req.body = options
  end

  handle_response(response)
end
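For reference, an options hash combining the parameters documented above might look like the following; the model name and parameter values are illustrative, not defaults:

```ruby
# Illustrative options hash for #completion, covering the documented keys.
options = {
  model: 'gpt-3.5-turbo',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user',   content: 'Hello, world!' }
  ],
  temperature: 0.7, # sampling temperature, 0..2
  max_tokens: 128,  # cap on generated tokens
  top_p: 1.0        # nucleus sampling parameter
}

options[:messages].length # => 2
```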

#default_api_key ⇒ Object

Returns the default API key, taken from the Durable::Llm configuration when available, otherwise from ENV['OPENAI_API_KEY'].


# File 'lib/durable/llm/providers/openai.rb', line 53

def default_api_key
  begin
    Durable::Llm.configuration.openai&.api_key
  rescue NoMethodError
    nil
  end || ENV['OPENAI_API_KEY']
end
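The fallback chain above — configured key first, then the OPENAI_API_KEY environment variable — can be sketched without the configuration object; `configured` below stands in for `Durable::Llm.configuration.openai&.api_key`:

```ruby
# Sketch of #default_api_key's resolution order.
def resolve_api_key(configured)
  configured || ENV['OPENAI_API_KEY']
end

ENV['OPENAI_API_KEY'] = 'sk-from-env'
resolve_api_key('sk-configured') # => "sk-configured"
resolve_api_key(nil)             # => "sk-from-env"
```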

#embedding(model:, input:, **options) ⇒ OpenAIEmbeddingResponse

Performs an embedding request to OpenAI’s API.

Parameters:

  • model (String)

The embedding model to use (e.g., 'text-embedding-ada-002')

  • input (String, Array<String>)

    The text(s) to embed

  • options (Hash)

    Additional options for the embedding request

Returns:

  • (OpenAIEmbeddingResponse)

    The parsed embedding response

Raises:



# File 'lib/durable/llm/providers/openai.rb', line 115

def embedding(model:, input:, **options)
  response = @conn.post('embeddings') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
    req.body = { model: model, input: input, **options }
  end

  handle_response(response, OpenAIEmbeddingResponse)
end
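The request body assembled above merges the model, the input, and any extra options; passing an array batches multiple texts in one request. A sketch of the resulting body (the extra option shown is only an example):

```ruby
# Illustrative embedding request body, mirroring the merge in #embedding.
model   = 'text-embedding-ada-002'
input   = ['first text', 'second text'] # a single String also works
options = { user: 'example-user-id' }   # example extra option

body = { model: model, input: input, **options }
body[:input].size # => 2
```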

#models ⇒ Array<String>

Retrieves the list of available models from OpenAI’s API.

Returns:

  • (Array<String>)

    Array of model IDs available to the account

Raises:



# File 'lib/durable/llm/providers/openai.rb', line 131

def models
  response = @conn.get('models') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
  end

  handle_response(response).data.map { |model| model['id'] }
end
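Given a payload shaped like OpenAI's model-list response, the id extraction above reduces to a simple map; the sample data below is illustrative:

```ruby
# Illustrative /v1/models payload; #models maps each entry to its id.
payload = {
  'object' => 'list',
  'data' => [
    { 'id' => 'gpt-4',         'object' => 'model' },
    { 'id' => 'gpt-3.5-turbo', 'object' => 'model' }
  ]
}

ids = payload['data'].map { |model| model['id'] }
# ids == ["gpt-4", "gpt-3.5-turbo"]
```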

#stream(options) {|OpenAIStreamResponse| ... } ⇒ Object

Performs a streaming chat completion request to OpenAI’s API.

Parameters:

  • options (Hash)

    The stream options (same as completion plus stream: true)

Yields:

  • (OpenAIStreamResponse)

    Each parsed chunk of the streaming response

Returns:

  • (Object)

    The final response object

Raises:



# File 'lib/durable/llm/providers/openai.rb', line 154

def stream(options)
  options[:stream] = true

  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
    req.headers['Accept'] = 'text/event-stream'

    options['temperature'] = options['temperature'].to_f if options['temperature']

    req.body = options

    user_proc = proc do |chunk, _size, _total|
      yield OpenAIStreamResponse.new(chunk)
    end

    req.options.on_data = to_json_stream(user_proc: user_proc)
  end

  handle_response(response)
end
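Streaming responses arrive as server-sent events: each `data:` line carries a JSON chunk whose delta holds a fragment of content, and the stream ends with `data: [DONE]`. A minimal sketch of extracting delta content from raw SSE text (the provider's real parsing is done by `to_json_stream`; the payload below is illustrative):

```ruby
require 'json'

# Two illustrative SSE events followed by the stream terminator.
raw = "data: {\"choices\":[{\"delta\":{\"content\":\"Hel\"}}]}\n\n" \
      "data: {\"choices\":[{\"delta\":{\"content\":\"lo\"}}]}\n\n" \
      "data: [DONE]\n\n"

pieces = raw.split("\n\n")
            .map { |event| event.sub(/\Adata: /, '') }         # strip SSE prefix
            .reject { |payload| payload.empty? || payload == '[DONE]' }
            .map { |payload| JSON.parse(payload).dig('choices', 0, 'delta', 'content') }

pieces.join # => "Hello"
```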