Class: Durable::Llm::Providers::OpenAI
Defined in:
lib/durable/llm/providers/openai.rb
Overview
OpenAI provider for accessing OpenAI’s language models through their API.
This provider implements the Durable::Llm::Providers::Base interface to provide completion, embedding, and streaming capabilities for OpenAI’s models including GPT-3.5, GPT-4, and their variants. It handles authentication via API keys, supports organization-based access, and provides comprehensive error handling for various OpenAI API error conditions.
Key features:
- Chat completions with support for multi-turn conversations
- Text embeddings for semantic similarity and retrieval tasks
- Real-time streaming responses for interactive applications
- Automatic model listing from OpenAI’s API
- Organization support for enterprise accounts
- Comprehensive error handling with specific exception types
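As a sketch of what a caller passes in, a chat completion request follows OpenAI’s Chat Completions payload shape (the model name and messages below are illustrative examples, not requirements of this provider):

```ruby
# Illustrative Chat Completions payload of the kind #completion posts
# as the JSON request body. Model name and messages are examples only.
options = {
  model: 'gpt-4',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user',   content: 'Write a haiku about Ruby.' }
  ],
  temperature: 0.7
}

options[:messages].length # => 2
```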
Defined Under Namespace
Classes: OpenAIChoice, OpenAIEmbeddingResponse, OpenAIMessage, OpenAIResponse, OpenAIStreamChoice, OpenAIStreamDelta, OpenAIStreamResponse
Constant Summary

BASE_URL =
  'https://api.openai.com/v1'
Instance Attribute Summary

- #api_key ⇒ String?
  The API key used for authentication with OpenAI.
- #organization ⇒ String?
  The OpenAI organization ID for enterprise accounts.
Class Method Summary

- .stream? ⇒ Boolean
  True, indicating this provider supports streaming.
Instance Method Summary

- #completion(options) ⇒ OpenAIResponse
  Performs a chat completion request to OpenAI’s API.
- #default_api_key ⇒ Object
- #embedding(model:, input:, **options) ⇒ OpenAIEmbeddingResponse
  Performs an embedding request to OpenAI’s API.
- #initialize(api_key: nil, organization: nil) ⇒ OpenAI (constructor)
  Initializes a new OpenAI provider instance.
- #models ⇒ Array<String>
  Retrieves the list of available models from OpenAI’s API.
- #stream(options) {|OpenAIStreamResponse| ... } ⇒ Object
  Performs a streaming chat completion request to OpenAI’s API.
Methods inherited from Base
Constructor Details
#initialize(api_key: nil, organization: nil) ⇒ OpenAI
Initializes a new OpenAI provider instance.
# File 'lib/durable/llm/providers/openai.rb', line 72

def initialize(api_key: nil, organization: nil)
  super(api_key: api_key)
  @organization = organization || ENV['OPENAI_ORGANIZATION']
  @conn = Faraday.new(url: BASE_URL) do |faraday|
    faraday.request :json
    faraday.response :json
    faraday.adapter Faraday.default_adapter
  end
end
Instance Attribute Details
#api_key ⇒ String?
Returns the API key used for authentication with OpenAI.
# File 'lib/durable/llm/providers/openai.rb', line 65

def api_key
  @api_key
end
#organization ⇒ String?
Returns the OpenAI organization ID for enterprise accounts.
# File 'lib/durable/llm/providers/openai.rb', line 65

attr_accessor :api_key, :organization
Class Method Details
.stream? ⇒ Boolean
Returns true, indicating this provider supports streaming.
# File 'lib/durable/llm/providers/openai.rb', line 141

def self.stream?
  true
end
Instance Method Details
#completion(options) ⇒ OpenAIResponse
Performs a chat completion request to OpenAI’s API.
# File 'lib/durable/llm/providers/openai.rb', line 95

def completion(options)
  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
    req.body = options
  end

  handle_response(response)
end
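handle_response wraps the raw JSON in an OpenAIResponse. The underlying payload follows OpenAI’s documented shape, so the assistant reply sits under choices[0].message.content; a minimal sketch against a sample payload:

```ruby
require 'json'

# Sample Chat Completions response body (abbreviated); field names follow
# OpenAI's documented format. handle_response wraps this in OpenAIResponse.
raw = <<~JSON
  {
    "choices": [
      {
        "index": 0,
        "message": { "role": "assistant", "content": "Hello there!" },
        "finish_reason": "stop"
      }
    ]
  }
JSON

reply = JSON.parse(raw).dig('choices', 0, 'message', 'content')
reply # => "Hello there!"
```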
#default_api_key ⇒ Object
# File 'lib/durable/llm/providers/openai.rb', line 53

def default_api_key
  begin
    Durable::Llm.configuration.openai&.api_key
  rescue NoMethodError
    nil
  end || ENV['OPENAI_API_KEY']
end
#embedding(model:, input:, **options) ⇒ OpenAIEmbeddingResponse
Performs an embedding request to OpenAI’s API.
# File 'lib/durable/llm/providers/openai.rb', line 115

def embedding(model:, input:, **options)
  response = @conn.post('embeddings') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
    req.body = { model: model, input: input, **options }
  end

  handle_response(response, OpenAIEmbeddingResponse)
end
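Embedding vectors are typically compared with cosine similarity for the retrieval tasks mentioned in the overview. A self-contained sketch (the vectors below are stand-ins for values an OpenAIEmbeddingResponse would carry):

```ruby
# Cosine similarity between two embedding vectors. The vectors here are
# stand-in values; a real call would use vectors from the embedding response.
def cosine_similarity(a, b)
  dot    = a.zip(b).sum { |x, y| x * y }
  norm_a = Math.sqrt(a.sum { |x| x * x })
  norm_b = Math.sqrt(b.sum { |x| x * x })
  dot / (norm_a * norm_b)
end

v1 = [0.1, 0.3, 0.5]
v2 = [0.1, 0.3, 0.5]
cosine_similarity(v1, v2).round(6) # => 1.0
```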
#models ⇒ Array<String>
Retrieves the list of available models from OpenAI’s API.
# File 'lib/durable/llm/providers/openai.rb', line 131

def models
  response = @conn.get('models') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
  end

  handle_response(response).data.map { |model| model['id'] }
end
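Because #models returns plain id strings, callers can filter the result with ordinary Array methods; for example (the id list below is a stand-in for the API result):

```ruby
# Stand-in for the ids #models would return; keep only chat-style models.
ids = ['gpt-4', 'gpt-3.5-turbo', 'text-embedding-ada-002', 'whisper-1']
chat_models = ids.select { |id| id.start_with?('gpt-') }
chat_models # => ["gpt-4", "gpt-3.5-turbo"]
```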
#stream(options) {|OpenAIStreamResponse| ... } ⇒ Object
Performs a streaming chat completion request to OpenAI’s API.
# File 'lib/durable/llm/providers/openai.rb', line 154

def stream(options)
  options[:stream] = true

  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['OpenAI-Organization'] = @organization if @organization
    req.headers['Accept'] = 'text/event-stream'

    options['temperature'] = options['temperature'].to_f if options['temperature']

    req.body = options

    user_proc = proc do |chunk, _size, _total|
      yield OpenAIStreamResponse.new(chunk)
    end

    req.options.on_data = to_json_stream(user_proc: user_proc)
  end

  handle_response(response)
end
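The streaming endpoint delivers server-sent events, which to_json_stream decodes into JSON chunks before they are yielded as OpenAIStreamResponse objects. A minimal sketch of that decoding step (parse_sse_chunks is a hypothetical helper for illustration, not part of the provider):

```ruby
require 'json'

# Hypothetical helper sketching how an SSE buffer from the streaming
# endpoint splits into JSON chunks; to_json_stream handles this internally.
def parse_sse_chunks(buffer)
  buffer.split("\n\n").filter_map do |event|
    data = event[/\Adata: (.*)\z/m, 1]
    next if data.nil? || data == '[DONE]'
    JSON.parse(data)
  end
end

buffer = "data: {\"choices\":[{\"delta\":{\"content\":\"Hi\"}}]}\n\n" \
         "data: [DONE]\n\n"
chunks = parse_sse_chunks(buffer)
chunks.first['choices'][0]['delta']['content'] # => "Hi"
```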