Class: Durable::Llm::Providers::Mistral

Inherits:
  Base < Object
Defined in:
lib/durable/llm/providers/mistral.rb

Overview

Mistral AI provider for accessing Mistral AI’s language models

This class provides a complete interface to Mistral AI’s API, supporting text completions, embeddings, model listing, and streaming responses. It handles authentication, HTTP communication, error management, and response normalization to provide a consistent API experience.

Examples:

Basic usage

provider = Durable::Llm::Providers::Mistral.new(api_key: 'your_key')
response = provider.completion(model: 'mistral-medium', messages: [{role: 'user', content: 'Hello'}])
puts response.choices.first.to_s

Streaming responses

provider.stream(model: 'mistral-medium', messages: [{role: 'user', content: 'Tell a story'}]) do |chunk|
  print chunk.to_s
end

Defined Under Namespace

Classes: MistralChoice, MistralEmbeddingResponse, MistralMessage, MistralResponse, MistralStreamChoice, MistralStreamDelta, MistralStreamResponse

Constant Summary

BASE_URL =

Base URL for Mistral AI API

'https://api.mistral.ai/v1'

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Methods inherited from Base

models, options, #stream?

Constructor Details

#initialize(api_key: nil) ⇒ Mistral

Initializes a new Mistral provider instance

Parameters:

  • api_key (String, nil) (defaults to: nil)

    The API key for Mistral AI. If not provided, falls back to #default_api_key



# File 'lib/durable/llm/providers/mistral.rb', line 55

def initialize(api_key: nil)
  super()
  @api_key = api_key || default_api_key
  @conn = Faraday.new(url: BASE_URL) do |faraday|
    faraday.request :json
    faraday.response :json
    faraday.adapter Faraday.default_adapter
  end
end

Instance Attribute Details

#api_key ⇒ String?

Returns The API key used for Mistral AI authentication.

Returns:

  • (String, nil)

    The API key used for Mistral AI authentication



# File 'lib/durable/llm/providers/mistral.rb', line 50

def api_key
  @api_key
end

Class Method Details

.stream? ⇒ Boolean

Indicates whether this provider supports streaming responses

Returns:

  • (Boolean)

    Always returns true for the Mistral provider



# File 'lib/durable/llm/providers/mistral.rb', line 122

def self.stream?
  true
end

Instance Method Details

#completion(options) ⇒ MistralResponse

Performs a chat completion request to Mistral AI

Parameters:

  • options (Hash)

    The completion options

Options Hash (options):

  • :model (String)

    The model to use (e.g., 'mistral-medium', 'mistral-small')

  • :messages (Array<Hash>)

    Array of message objects with :role and :content

  • :temperature (Float) (optional)

    Controls randomness (0.0 to 1.0)

  • :max_tokens (Integer) (optional)

    Maximum tokens to generate

Returns:

  • (MistralResponse)

    The normalized completion response

Raises:



# File 'lib/durable/llm/providers/mistral.rb', line 77

def completion(options)
  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.body = options
  end

  handle_response(response)
end
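
A sketch of the options hash #completion expects. The model name, message contents, and tuning values below are illustrative examples, not library defaults:

```ruby
# Illustrative request options for #completion; the provider POSTs this
# hash to chat/completions as JSON with a Bearer Authorization header.
options = {
  model: 'mistral-medium',
  messages: [
    { role: 'system', content: 'Answer in one sentence.' },
    { role: 'user',   content: 'What is Mistral AI?' }
  ],
  temperature: 0.2,  # lower values make output more deterministic
  max_tokens: 256    # cap on generated tokens
}
# response = provider.completion(options)
```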

#default_api_key ⇒ String?

Returns the default API key for Mistral AI

Checks the configuration object first, then falls back to the MISTRAL_API_KEY environment variable.

Returns:

  • (String, nil)

    The default API key, or nil if not configured



# File 'lib/durable/llm/providers/mistral.rb', line 40

def default_api_key
  begin
    Durable::Llm.configuration.mistral&.api_key
  rescue NoMethodError
    nil
  end || ENV['MISTRAL_API_KEY']
end
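
The fallback order can be sketched in isolation. `Config` and `resolve_api_key` below are hypothetical stand-ins for the library's configuration object and method, used only to show the resolution logic:

```ruby
# Hypothetical stand-in for the provider's key resolution: a configured
# key wins; otherwise the MISTRAL_API_KEY environment variable is used,
# and nil is returned when neither is set.
Config = Struct.new(:api_key)

def resolve_api_key(config)
  begin
    config&.api_key
  rescue NoMethodError
    nil
  end || ENV['MISTRAL_API_KEY']
end

ENV['MISTRAL_API_KEY'] = 'env-key'
resolve_api_key(Config.new('configured-key')) # => "configured-key"
resolve_api_key(nil)                          # => "env-key"
```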

#embedding(model:, input:, **options) ⇒ MistralEmbeddingResponse

Generates embeddings for the given input text

Parameters:

  • model (String)

    The embedding model to use (e.g., 'mistral-embed')

  • input (String, Array<String>)

    The text(s) to embed

  • options (Hash)

    Additional options for the embedding request

Returns:

  • (MistralEmbeddingResponse)

    The normalized embedding response

Raises:



# File 'lib/durable/llm/providers/mistral.rb', line 96

def embedding(model:, input:, **options)
  response = @conn.post('embeddings') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.body = { model: model, input: input, **options }
  end

  handle_response(response, MistralEmbeddingResponse)
end
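
A sketch of the request body #embedding builds: model and input are required, and any extra keyword options are merged in. The `:encoding_format` key below is an illustrative extra option, not a documented library default:

```ruby
# Mirrors the body construction inside #embedding: required fields plus
# any additional options splatted into the hash.
def embedding_body(model:, input:, **options)
  { model: model, input: input, **options }
end

embedding_body(model: 'mistral-embed',
               input: ['first text', 'second text'],
               encoding_format: 'float') # illustrative extra option
# => {:model=>"mistral-embed", :input=>["first text", "second text"], :encoding_format=>"float"}
```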

#models ⇒ Array<String>

Retrieves the list of available models from Mistral AI

Returns:

  • (Array<String>)

    Array of available model identifiers

Raises:



# File 'lib/durable/llm/providers/mistral.rb', line 111

def models
  response = @conn.get('models') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
  end

  handle_response(response).data.map { |model| model['id'] }
end
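
The mapping step can be seen with a hand-built payload. The entries below are a hypothetical slice of the GET /models response data, shown only to illustrate the 'id' extraction:

```ruby
# Hypothetical model-list payload; #models keeps only each entry's 'id'.
data = [
  { 'id' => 'mistral-medium', 'object' => 'model' },
  { 'id' => 'mistral-embed',  'object' => 'model' }
]
data.map { |model| model['id'] }
# => ["mistral-medium", "mistral-embed"]
```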

#stream(options) {|MistralStreamResponse| ... } ⇒ nil

Performs a streaming chat completion request to Mistral AI

Yields response chunks as they arrive from the API.

Parameters:

  • options (Hash)

    The stream options (same as #completion; :stream is set to true by the method)

Yields:

  • (MistralStreamResponse)

    Each parsed chunk of the streaming response

Returns:

  • (nil)

    Returns nil after streaming is complete

Raises:



# File 'lib/durable/llm/providers/mistral.rb', line 137

def stream(options)
  options[:stream] = true

  response = @conn.post('chat/completions') do |req|
    req.headers['Authorization'] = "Bearer #{@api_key}"
    req.headers['Accept'] = 'text/event-stream'

    options['temperature'] = options['temperature'].to_f if options['temperature']

    req.body = options

    user_proc = proc do |chunk, _size, _total|
      yield MistralStreamResponse.new(chunk)
    end

    req.options.on_data = to_json_stream(user_proc: user_proc)
  end

  handle_response(response)
end
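
Each yielded chunk wraps one server-sent event. A hand-built hash in the same general shape shows how delta text is typically extracted; the exact field layout follows Mistral's chat-completions SSE format and is illustrative here, since real chunks arrive wrapped in MistralStreamResponse:

```ruby
# Hand-built hash shaped like one streaming event: each choice carries a
# delta with the incremental content for that chunk.
chunk = {
  'choices' => [
    { 'index' => 0, 'delta' => { 'content' => 'Once upon a time' } }
  ]
}
chunk.dig('choices', 0, 'delta', 'content')
# => "Once upon a time"
```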