Module: RubyLLM::Providers::OpenAIResponses::Capabilities
- Defined in:
- lib/ruby_llm/providers/openai_responses/capabilities.rb
Overview
Model capabilities for OpenAI Responses API models. Defines which models support which features.
Constant Summary
- RESPONSES_API_MODELS =
Models that support the Responses API
%w[
  gpt-4o gpt-4o-mini gpt-4o-2024-05-13 gpt-4o-2024-08-06 gpt-4o-2024-11-20
  gpt-4o-mini-2024-07-18 gpt-4.1 gpt-4.1-mini gpt-4.1-nano
  gpt-4-turbo gpt-4-turbo-2024-04-09 gpt-4-turbo-preview
  o1 o1-mini o1-preview o1-2024-12-17 o3 o3-mini o4-mini
  chatgpt-4o-latest
].freeze
- VISION_MODELS =
Models with vision capabilities
%w[
  gpt-4o gpt-4o-mini gpt-4o-2024-05-13 gpt-4o-2024-08-06 gpt-4o-2024-11-20
  gpt-4o-mini-2024-07-18 gpt-4.1 gpt-4.1-mini gpt-4.1-nano
  gpt-4-turbo gpt-4-turbo-2024-04-09
  o1 o3 o4-mini chatgpt-4o-latest
].freeze
- REASONING_MODELS =
Reasoning models (o-series)
%w[o1 o1-mini o1-preview o1-2024-12-17 o3 o3-mini o4-mini].freeze
- WEB_SEARCH_MODELS =
Models that support web search
%w[ gpt-4o gpt-4o-mini gpt-4.1 gpt-4.1-mini gpt-4.1-nano o1 o3 o3-mini o4-mini ].freeze
- CODE_INTERPRETER_MODELS =
Models that support code interpreter
%w[ gpt-4o gpt-4o-mini gpt-4.1 gpt-4.1-mini gpt-4.1-nano o1 o3 o3-mini o4-mini ].freeze
- CONTEXT_WINDOWS =
Context windows by model
{
  'gpt-4o' => 128_000,
  'gpt-4o-mini' => 128_000,
  'gpt-4o-2024-05-13' => 128_000,
  'gpt-4o-2024-08-06' => 128_000,
  'gpt-4o-2024-11-20' => 128_000,
  'gpt-4o-mini-2024-07-18' => 128_000,
  'gpt-4.1' => 1_000_000,
  'gpt-4.1-mini' => 1_000_000,
  'gpt-4.1-nano' => 1_000_000,
  'gpt-4-turbo' => 128_000,
  'gpt-4-turbo-2024-04-09' => 128_000,
  'o1' => 200_000,
  'o1-mini' => 128_000,
  'o1-preview' => 128_000,
  'o3' => 200_000,
  'o3-mini' => 200_000,
  'o4-mini' => 200_000
}.freeze
- MAX_OUTPUT_TOKENS =
Max output tokens by model
{
  'gpt-4o' => 16_384,
  'gpt-4o-mini' => 16_384,
  'gpt-4o-2024-05-13' => 4_096,
  'gpt-4o-2024-08-06' => 16_384,
  'gpt-4o-2024-11-20' => 16_384,
  'gpt-4o-mini-2024-07-18' => 16_384,
  'gpt-4.1' => 32_768,
  'gpt-4.1-mini' => 32_768,
  'gpt-4.1-nano' => 32_768,
  'gpt-4-turbo' => 4_096,
  'o1' => 100_000,
  'o1-mini' => 65_536,
  'o3' => 100_000,
  'o3-mini' => 100_000,
  'o4-mini' => 100_000
}.freeze
- PRICING =
Pricing per million tokens (as of late 2024)
{
  'gpt-4o' => { input: 2.50, output: 10.00, cached_input: 1.25 },
  'gpt-4o-mini' => { input: 0.15, output: 0.60, cached_input: 0.075 },
  'gpt-4.1' => { input: 2.00, output: 8.00, cached_input: 0.50 },
  'gpt-4.1-mini' => { input: 0.40, output: 1.60, cached_input: 0.10 },
  'gpt-4.1-nano' => { input: 0.10, output: 0.40, cached_input: 0.025 },
  'o1' => { input: 15.00, output: 60.00, cached_input: 7.50 },
  'o1-mini' => { input: 1.10, output: 4.40, cached_input: 0.55 },
  'o3' => { input: 10.00, output: 40.00, cached_input: 2.50 },
  'o3-mini' => { input: 1.10, output: 4.40, cached_input: 0.275 },
  'o4-mini' => { input: 1.10, output: 4.40, cached_input: 0.275 }
}.freeze
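Since these rates are expressed per million tokens, a request's cost is each token count multiplied by its rate and divided by 1,000,000. A minimal sketch; the `estimate_cost` helper is hypothetical (not part of the module), and the two `PRICING` entries are copied from the table above:

```ruby
# Estimate a request's USD cost from per-million-token rates.
# Entries copied from the PRICING constant; estimate_cost is illustrative only.
PRICING = {
  'gpt-4o'      => { input: 2.50, output: 10.00, cached_input: 1.25 },
  'gpt-4o-mini' => { input: 0.15, output: 0.60, cached_input: 0.075 }
}.freeze

def estimate_cost(model_id, input_tokens, output_tokens)
  rates = PRICING.fetch(model_id)
  (input_tokens * rates[:input] + output_tokens * rates[:output]) / 1_000_000.0
end

# 10k input + 2k output tokens on gpt-4o-mini:
# (10_000 * 0.15 + 2_000 * 0.60) / 1_000_000 ≈ 0.0027 USD
puts estimate_cost('gpt-4o-mini', 10_000, 2_000)
```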
Class Method Summary
- .capabilities_for(model_id) ⇒ Object
- .context_window_for(model_id) ⇒ Object
- .format_display_name(model_id) ⇒ Object
- .input_price_for(model_id) ⇒ Object
- .max_tokens_for(model_id) ⇒ Object
- .modalities_for(model_id) ⇒ Object
- .model_family(model_id) ⇒ Object
- .normalize_temperature(temperature, model_id) ⇒ Object
Temperature is not supported for reasoning models.
- .output_price_for(model_id) ⇒ Object
- .pricing_for(model_id) ⇒ Object
- .reasoning_model?(model_id) ⇒ Boolean
- .supports_code_interpreter?(model_id) ⇒ Boolean
- .supports_functions?(model_id) ⇒ Boolean
- .supports_responses_api?(model_id) ⇒ Boolean
- .supports_structured_output?(model_id) ⇒ Boolean
- .supports_vision?(model_id) ⇒ Boolean
- .supports_web_search?(model_id) ⇒ Boolean
Class Method Details
.capabilities_for(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 170

def capabilities_for(model_id)
  caps = %w[streaming function_calling structured_output]
  caps << 'vision' if supports_vision?(model_id)
  caps << 'web_search' if supports_web_search?(model_id)
  caps << 'code_interpreter' if supports_code_interpreter?(model_id)
  caps << 'reasoning' if reasoning_model?(model_id)
  caps
end
.context_window_for(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 129

def context_window_for(model_id)
  find_capability(model_id, CONTEXT_WINDOWS) || 128_000
end
.format_display_name(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 192

def format_display_name(model_id)
  model_id
    .gsub(/[-_]/, ' ')
    .split
    .map(&:capitalize)
    .join(' ')
end
.input_price_for(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 137

def input_price_for(model_id)
  pricing = find_capability(model_id, PRICING)
  pricing ? pricing[:input] : 0.0
end
.max_tokens_for(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 133

def max_tokens_for(model_id)
  find_capability(model_id, MAX_OUTPUT_TOKENS) || 16_384
end
.modalities_for(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 160

def modalities_for(model_id)
  input = ['text']
  input << 'image' if supports_vision?(model_id)

  {
    input: input,
    output: ['text']
  }
end
.model_family(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 179

def model_family(model_id)
  case model_id
  when /^gpt-4\.1/ then 'gpt-4.1'
  when /^gpt-4o-mini/ then 'gpt-4o-mini'
  when /^gpt-4o/ then 'gpt-4o'
  when /^gpt-4-turbo/ then 'gpt-4-turbo'
  when /^o1/ then 'o1'
  when /^o3/ then 'o3'
  when /^o4/ then 'o4'
  else 'other'
  end
end
.normalize_temperature(temperature, model_id) ⇒ Object
Temperature is not supported for reasoning models.
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 201

def normalize_temperature(temperature, model_id)
  return nil if reasoning_model?(model_id)

  temperature
end
.output_price_for(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 142

def output_price_for(model_id)
  pricing = find_capability(model_id, PRICING)
  pricing ? pricing[:output] : 0.0
end
.pricing_for(model_id) ⇒ Object
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 147

def pricing_for(model_id)
  pricing = find_capability(model_id, PRICING) || { input: 0.0, output: 0.0 }

  {
    text_tokens: {
      standard: {
        input_per_million: pricing[:input],
        output_per_million: pricing[:output],
        cached_input_per_million: pricing[:cached_input] || (pricing[:input] / 2)
      }
    }
  }
end
.reasoning_model?(model_id) ⇒ Boolean
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 125

def reasoning_model?(model_id)
  model_matches?(model_id, REASONING_MODELS)
end
.supports_code_interpreter?(model_id) ⇒ Boolean
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 121

def supports_code_interpreter?(model_id)
  model_matches?(model_id, CODE_INTERPRETER_MODELS)
end
.supports_functions?(model_id) ⇒ Boolean
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 109

def supports_functions?(model_id)
  supports_responses_api?(model_id)
end
.supports_responses_api?(model_id) ⇒ Boolean
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 101

def supports_responses_api?(model_id)
  model_matches?(model_id, RESPONSES_API_MODELS)
end
.supports_structured_output?(model_id) ⇒ Boolean
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 113

def supports_structured_output?(model_id)
  supports_responses_api?(model_id)
end
.supports_vision?(model_id) ⇒ Boolean
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 105

def supports_vision?(model_id)
  model_matches?(model_id, VISION_MODELS)
end
.supports_web_search?(model_id) ⇒ Boolean
# File 'lib/ruby_llm/providers/openai_responses/capabilities.rb', line 117

def supports_web_search?(model_id)
  model_matches?(model_id, WEB_SEARCH_MODELS)
end