Class: Durable::Llm::Providers::Xai
- Defined in:
- lib/durable/llm/providers/xai.rb
Overview
xAI provider for accessing xAI’s Grok language models.
This class provides methods to interact with xAI’s API for chat completions, embeddings, model listing, and streaming responses.
Defined Under Namespace
Classes: XaiChoice, XaiEmbeddingResponse, XaiMessage, XaiResponse, XaiStreamChoice, XaiStreamDelta, XaiStreamResponse
Constant Summary
- BASE_URL = 'https://api.x.ai/v1'
Instance Attribute Summary
- #api_key ⇒ Object
  Returns the value of attribute api_key.
Class Method Summary
- .stream? ⇒ Boolean
  Indicates whether this provider supports streaming responses.
Instance Method Summary
- #completion(options) ⇒ XaiResponse
  Performs a chat completion request to xAI’s API.
- #default_api_key ⇒ String?
  Returns the default API key for xAI, checking configuration and environment variables.
- #embedding(model:, input:, **options) ⇒ XaiEmbeddingResponse
  Performs an embedding request to xAI’s API.
- #initialize(api_key: nil) ⇒ Xai (constructor)
  Initializes the xAI provider with API key and HTTP connection.
- #models ⇒ Array<String>
  Retrieves the list of available models from xAI’s API.
- #stream(options) {|XaiStreamResponse| ... } ⇒ nil
  Performs a streaming chat completion request to xAI’s API.
Methods inherited from Base
Constructor Details
#initialize(api_key: nil) ⇒ Xai
Initializes the xAI provider with API key and HTTP connection.
# File 'lib/durable/llm/providers/xai.rb', line 41

def initialize(api_key: nil)
  super
  @conn = Faraday.new(url: BASE_URL) do |faraday|
    faraday.request :json
    faraday.response :json
    faraday.adapter Faraday.default_adapter
  end
end
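A brief construction sketch, assuming the gem's entry point is required as durable/llm. Passing no key relies on Base's constructor falling back to #default_api_key, which checks configuration and then the XAI_API_KEY environment variable:

require 'durable/llm'

# Explicit key:
provider = Durable::Llm::Providers::Xai.new(api_key: 'your-xai-api-key')

# Or rely on configuration / the XAI_API_KEY environment variable:
provider = Durable::Llm::Providers::Xai.new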
Instance Attribute Details
#api_key ⇒ Object
Returns the value of attribute api_key.
# File 'lib/durable/llm/providers/xai.rb', line 36

def api_key
  @api_key
end
Class Method Details
.stream? ⇒ Boolean
Indicates whether this provider supports streaming responses.
# File 'lib/durable/llm/providers/xai.rb', line 92

def self.stream?
  true
end
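For illustration only, a caller can branch on this flag before choosing between #stream and #completion; the model id, message schema, and print handling below are placeholders, not values from this file:

provider = Durable::Llm::Providers::Xai.new
options  = { model: 'grok-beta', messages: [{ role: 'user', content: 'Hi' }] } # assumed schema

if Durable::Llm::Providers::Xai.stream?
  provider.stream(options) { |chunk| print chunk } # incremental XaiStreamResponse chunks
else
  response = provider.completion(options)          # single blocking XaiResponse
end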
Instance Method Details
#completion(options) ⇒ XaiResponse
Performs a chat completion request to xAI’s API.
# File 'lib/durable/llm/providers/xai.rb', line 54

def completion(options)
  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.body = options
  end

  handle_response(response)
end
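A hedged usage sketch; the model id and the messages schema are assumptions based on xAI's OpenAI-compatible chat endpoint, not values taken from this file:

provider = Durable::Llm::Providers::Xai.new
options = {
  model: 'grok-beta', # assumed model id
  messages: [{ role: 'user', content: 'Hello, Grok!' }]
}

response = provider.completion(options) # => XaiResponse wrapping the parsed chat/completions payload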
#default_api_key ⇒ String?
Returns the default API key for xAI, checking configuration and environment variables.
# File 'lib/durable/llm/providers/xai.rb', line 28

def default_api_key
  begin
    Durable::Llm.configuration.xai&.api_key
  rescue NoMethodError
    nil
  end || ENV['XAI_API_KEY']
end
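A small sketch of the resolution order; the configuration accessor is the one used in the source, the environment fallback is XAI_API_KEY, and the hand-off from Base's constructor is assumed:

# Environment variable fallback (configuration is consulted first when present):
ENV['XAI_API_KEY'] = 'your-xai-api-key'

provider = Durable::Llm::Providers::Xai.new # no explicit key passed
provider.api_key                            # assumed to resolve through #default_api_key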
#embedding(model:, input:, **options) ⇒ XaiEmbeddingResponse
Performs an embedding request to xAI’s API.
# File 'lib/durable/llm/providers/xai.rb', line 69

def embedding(model:, input:, **options)
  response = @conn.post('embeddings') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.body = { model: model, input: input, **options }
  end

  handle_response(response, XaiEmbeddingResponse)
end
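A hedged call sketch; the embedding model id below is hypothetical, since xAI's actual embedding model ids are not listed in this file:

provider = Durable::Llm::Providers::Xai.new
embedding = provider.embedding(
  model: 'grok-embedding', # hypothetical model id
  input: 'The quick brown fox'
)
# => XaiEmbeddingResponse wrapping the embeddings payload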
#models ⇒ Array<String>
Retrieves the list of available models from xAI’s API.
# File 'lib/durable/llm/providers/xai.rb', line 81

def models
  response = @conn.get('models') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
  end

  handle_response(response).data.map { |model| model['id'] }
end
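Usage is a single call; the returned model ids can be fed back into #completion or #stream options:

provider = Durable::Llm::Providers::Xai.new
provider.models # => e.g. ["grok-beta", ...] (actual ids depend on the API response)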
#stream(options) {|XaiStreamResponse| ... } ⇒ nil
Performs a streaming chat completion request to xAI’s API.
# File 'lib/durable/llm/providers/xai.rb', line 101

def stream(options)
  options[:stream] = true

  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['Accept'] = 'text/event-stream'

    options['temperature'] = options['temperature'].to_f if options['temperature']

    req.body = options

    user_proc = proc do |chunk, _size, _total|
      yield XaiStreamResponse.new(chunk)
    end

    req.options.on_data = to_json_stream(user_proc: user_proc)
  end

  handle_response(response)
end
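A hedged streaming sketch; the options keys mirror the completion example above, and printing each XaiStreamResponse chunk relies on its to_s, which is not shown in this file:

provider = Durable::Llm::Providers::Xai.new
options = {
  model: 'grok-beta', # assumed model id
  messages: [{ role: 'user', content: 'Stream a haiku about Ruby.' }]
}

provider.stream(options) do |chunk|
  print chunk # chunk is an XaiStreamResponse
end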