Class: Anthropic::Models::Beta::Messages::BatchCreateParams::Request::Params
- Inherits: Internal::Type::BaseModel
  - Object
  - Internal::Type::BaseModel
  - Anthropic::Models::Beta::Messages::BatchCreateParams::Request::Params
- Defined in:
- lib/anthropic/models/beta/messages/batch_create_params.rb
Overview
Defined Under Namespace
Modules: Container, ServiceTier, System
Instance Attribute Summary
- #container ⇒ Anthropic::Models::Beta::BetaContainerParams, ...
  Container identifier for reuse across requests.
- #context_management ⇒ Anthropic::Models::Beta::BetaContextManagementConfig?
  Context management configuration.
- #max_tokens ⇒ Integer
  The maximum number of tokens to generate before stopping.
- #mcp_servers ⇒ Array<Anthropic::Models::Beta::BetaRequestMCPServerURLDefinition>?
  MCP servers to be utilized in this request.
- #messages ⇒ Array<Anthropic::Models::Beta::BetaMessageParam>
  Input messages.
- #metadata ⇒ Anthropic::Models::Beta::BetaMetadata?
  An object describing metadata about the request.
- #model ⇒ Symbol, ...
  The model that will complete your prompt. See models for additional details and options.
- #service_tier ⇒ Symbol, ...
  Determines whether to use priority capacity (if available) or standard capacity for this request.
- #stop_sequences ⇒ Array<String>?
  Custom text sequences that will cause the model to stop generating.
- #stream ⇒ Boolean?
  Whether to incrementally stream the response using server-sent events.
- #system_ ⇒ String, ...
  System prompt.
- #temperature ⇒ Float?
  Amount of randomness injected into the response.
- #thinking ⇒ Anthropic::Models::Beta::BetaThinkingConfigEnabled, ...
  Configuration for enabling Claude's extended thinking.
- #tool_choice ⇒ Anthropic::Models::Beta::BetaToolChoiceAuto, ...
  How the model should use the provided tools.
- #tools ⇒ Array<Anthropic::Models::Beta::BetaTool, Anthropic::Models::Beta::BetaToolBash20241022, Anthropic::Models::Beta::BetaToolBash20250124, Anthropic::Models::Beta::BetaCodeExecutionTool20250522, Anthropic::Models::Beta::BetaCodeExecutionTool20250825, Anthropic::Models::Beta::BetaToolComputerUse20241022, Anthropic::Models::Beta::BetaMemoryTool20250818, Anthropic::Models::Beta::BetaToolComputerUse20250124, Anthropic::Models::Beta::BetaToolTextEditor20241022, Anthropic::Models::Beta::BetaToolTextEditor20250124, Anthropic::Models::Beta::BetaToolTextEditor20250429, Anthropic::Models::Beta::BetaToolTextEditor20250728, Anthropic::Models::Beta::BetaWebSearchTool20250305, Anthropic::Models::Beta::BetaWebFetchTool20250910>?
  Definitions of tools that the model may use.
- #top_k ⇒ Integer?
  Only sample from the top K options for each subsequent token.
- #top_p ⇒ Float?
  Use nucleus sampling.
Class Method Summary
Instance Method Summary
- #initialize(max_tokens:, messages:, model:, container: nil, context_management: nil, mcp_servers: nil, metadata: nil, service_tier: nil, stop_sequences: nil, stream: nil, system_: nil, temperature: nil, thinking: nil, tool_choice: nil, tools: nil, top_k: nil, top_p: nil) ⇒ void
  constructor
  Some parameter documentation has been truncated; see Params for more details.
Methods inherited from Internal::Type::BaseModel
==, #==, #[], coerce, #deconstruct_keys, #deep_to_h, dump, fields, hash, #hash, inherited, inspect, #inspect, known_fields, optional, recursively_to_h, required, #to_h, #to_json, #to_s, to_sorbet_type, #to_yaml
Methods included from Internal::Type::Converter
#coerce, coerce, #dump, dump, inspect, #inspect, meta_info, new_coerce_state, type_info
Methods included from Internal::Util::SorbetRuntimeSupport
#const_missing, #define_sorbet_constant!, #sorbet_constant_defined?, #to_sorbet_type, to_sorbet_type
Constructor Details
#initialize(max_tokens:, messages:, model:, container: nil, context_management: nil, mcp_servers: nil, metadata: nil, service_tier: nil, stop_sequences: nil, stream: nil, system_: nil, temperature: nil, thinking: nil, tool_choice: nil, tools: nil, top_k: nil, top_p: nil) ⇒ void
Some parameter documentation has been truncated; see Anthropic::Models::Beta::Messages::BatchCreateParams::Request::Params for more details.
Messages API creation parameters for the individual request.
See the Messages API reference for full documentation on available parameters.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 371
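As a rough usage illustration, these params typically form the params portion of each request when creating a batch. The sketch below is an assumption-laden example, not taken from this page: it assumes a configured Anthropic::Client, assumes the batch endpoint is exposed as client.beta.messages.batches.create, and uses placeholder values for custom_id and the model name.

require "anthropic"

client = Anthropic::Client.new(api_key: ENV["ANTHROPIC_API_KEY"])

# Each batch request wraps a custom_id plus the Params documented on this page.
# max_tokens, messages, and model are the required fields; everything else is optional.
batch = client.beta.messages.batches.create(
  requests: [
    {
      custom_id: "my-first-request", # placeholder identifier
      params: {
        model: "claude-sonnet-4-20250514", # placeholder model name
        max_tokens: 1024,
        messages: [{role: "user", content: "Hello, Claude"}]
      }
    }
  ]
)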
Instance Attribute Details
#container ⇒ Anthropic::Models::Beta::BetaContainerParams, ...
Container identifier for reuse across requests.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 160
optional :container, union: -> { Anthropic::Beta::Messages::BatchCreateParams::Request::Params::Container }, nil?: true
#context_management ⇒ Anthropic::Models::Beta::BetaContextManagementConfig?
Context management configuration.
This allows you to control how Claude manages context across multiple requests, such as whether to clear function results or not.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 171
optional :context_management, -> { Anthropic::Beta::BetaContextManagementConfig }, nil?: true
#max_tokens ⇒ Integer
The maximum number of tokens to generate before stopping.
Note that our models may stop before reaching this maximum. This parameter only specifies the absolute maximum number of tokens to generate.
Different models have different maximum values for this parameter. See models for details.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 76
required :max_tokens, Integer
#mcp_servers ⇒ Array<Anthropic::Models::Beta::BetaRequestMCPServerURLDefinition>?
MCP servers to be utilized in this request.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 177
optional :mcp_servers, -> { Anthropic::Internal::Type::ArrayOf[Anthropic::Beta::BetaRequestMCPServerURLDefinition] }
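For illustration, an entry in this array might look like the following sketch, assuming BetaRequestMCPServerURLDefinition carries name, type, and url fields as in the public MCP connector documentation (the URL and token below are placeholders):

mcp_servers = [
  {
    type: "url",
    url: "https://example-server.modelcontextprotocol.io/sse", # placeholder URL
    name: "example-mcp",
    authorization_token: "YOUR_TOKEN" # placeholder; optional
  }
]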
#messages ⇒ Array<Anthropic::Models::Beta::BetaMessageParam>
Input messages.
Our models are trained to operate on alternating user and assistant
conversational turns. When creating a new Message, you specify the prior
conversational turns with the messages parameter, and the model then generates
the next Message in the conversation. Consecutive user or assistant turns
in your request will be combined into a single turn.
Each input message must be an object with a role and content. You can
specify a single user-role message, or you can include multiple user and
assistant messages.
If the final message uses the assistant role, the response content will
continue immediately from the content in that message. This can be used to
constrain part of the model's response.
Example with a single user message:
[{ "role": "user", "content": "Hello, Claude" }]
Example with multiple conversational turns:
[
{ "role": "user", "content": "Hello there." },
{ "role": "assistant", "content": "Hi, I'm Claude. How can I help you?" },
{ "role": "user", "content": "Can you explain LLMs in plain English?" }
]
Example with a partially-filled response from Claude:
[
{
"role": "user",
"content": "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun"
},
{ "role": "assistant", "content": "The best answer is (" }
]
Each input message content may be either a single string or an array of
content blocks, where each block has a specific type. Using a string for
content is shorthand for an array of one content block of type "text". The
following input messages are equivalent:
{ "role": "user", "content": "Hello, Claude" }
{ "role": "user", "content": [{ "type": "text", "text": "Hello, Claude" }] }
See input examples.
Note that if you want to include a
system prompt, you can use the
top-level system parameter — there is no "system" role for input messages in
the Messages API.
There is a limit of 100,000 messages in a single request.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 146
required :messages, -> { Anthropic::Internal::Type::ArrayOf[Anthropic::Beta::BetaMessageParam] }
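Expressed as Ruby hashes for the messages parameter, the prefill example above might look like this sketch (mirroring the JSON shown earlier):

messages = [
  {
    role: "user",
    content: "What's the Greek name for Sun? (A) Sol (B) Helios (C) Sun"
  },
  # Prefilled assistant turn: the response continues from this content.
  {role: "assistant", content: "The best answer is ("}
]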
#metadata ⇒ Anthropic::Models::Beta::BetaMetadata?
An object describing metadata about the request.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 184
optional :metadata, -> { Anthropic::Beta::BetaMetadata }
#model ⇒ Symbol, ...
The model that will complete your prompt. See models for additional details and options.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 154
required :model, union: -> { Anthropic::Model }
#service_tier ⇒ Symbol, ...
Determines whether to use priority capacity (if available) or standard capacity for this request.
Anthropic offers different levels of service for your API requests. See service-tiers for details.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 194
optional :service_tier, enum: -> { Anthropic::Beta::Messages::BatchCreateParams::Request::Params::ServiceTier }
#stop_sequences ⇒ Array<String>?
Custom text sequences that will cause the model to stop generating.
Our models will normally stop when they have naturally completed their turn,
which will result in a response stop_reason of "end_turn".
If you want the model to stop generating when it encounters custom strings of
text, you can use the stop_sequences parameter. If the model encounters one of
the custom sequences, the response stop_reason value will be "stop_sequence"
and the response stop_sequence value will contain the matched stop sequence.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 209
optional :stop_sequences, Anthropic::Internal::Type::ArrayOf[String]
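For example, a request might supply an arbitrary terminator string; if the model emits it, the response reports stop_reason "stop_sequence" and echoes the matched string in stop_sequence (a sketch; the sequence itself is arbitrary):

params = {
  stop_sequences: ["###END###"] # arbitrary example sequence
}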
#stream ⇒ Boolean?
Whether to incrementally stream the response using server-sent events.
See streaming for details.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 217
optional :stream, Anthropic::Internal::Type::Boolean
#system_ ⇒ String, ...
System prompt.
A system prompt is a way of providing context and instructions to Claude, such as specifying a particular goal or role. See our guide to system prompts.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 227
optional :system_, union: -> { Anthropic::Beta::Messages::BatchCreateParams::Request::Params::System }, api_name: :system
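Note that the Ruby attribute is named system_ while the serialized API field is system (via the api_name: :system mapping above). A sketch of setting it:

params = {
  system_: "You are a concise assistant. Answer in one sentence.", # example prompt text
  # ...model, max_tokens, and messages are still required
}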
#temperature ⇒ Float?
Amount of randomness injected into the response.
Defaults to 1.0. Ranges from 0.0 to 1.0. Use temperature closer to 0.0
for analytical / multiple choice, and closer to 1.0 for creative and
generative tasks.
Note that even with temperature of 0.0, the results will not be fully
deterministic.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 242
optional :temperature, Float
#thinking ⇒ Anthropic::Models::Beta::BetaThinkingConfigEnabled, ...
Configuration for enabling Claude's extended thinking.
When enabled, responses include thinking content blocks showing Claude's
thinking process before the final answer. Requires a minimum budget of 1,024
tokens and counts towards your max_tokens limit.
See extended thinking for details.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 256
optional :thinking, union: -> { Anthropic::Beta::BetaThinkingConfigParam }
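A minimal sketch of enabling extended thinking, assuming the enabled variant takes a budget_tokens field as in the public Messages API:

params = {
  thinking: {
    type: :enabled,
    budget_tokens: 2048 # must be at least 1,024 and counts toward max_tokens
  }
}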
#tool_choice ⇒ Anthropic::Models::Beta::BetaToolChoiceAuto, ...
How the model should use the provided tools. The model can use a specific tool, any available tool, decide by itself, or not use tools at all.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 263
optional :tool_choice, union: -> { Anthropic::Beta::BetaToolChoice }
#tools ⇒ Array<Anthropic::Models::Beta::BetaTool, Anthropic::Models::Beta::BetaToolBash20241022, Anthropic::Models::Beta::BetaToolBash20250124, Anthropic::Models::Beta::BetaCodeExecutionTool20250522, Anthropic::Models::Beta::BetaCodeExecutionTool20250825, Anthropic::Models::Beta::BetaToolComputerUse20241022, Anthropic::Models::Beta::BetaMemoryTool20250818, Anthropic::Models::Beta::BetaToolComputerUse20250124, Anthropic::Models::Beta::BetaToolTextEditor20241022, Anthropic::Models::Beta::BetaToolTextEditor20250124, Anthropic::Models::Beta::BetaToolTextEditor20250429, Anthropic::Models::Beta::BetaToolTextEditor20250728, Anthropic::Models::Beta::BetaWebSearchTool20250305, Anthropic::Models::Beta::BetaWebFetchTool20250910>?
Definitions of tools that the model may use.
If you include tools in your API request, the model may return tool_use
content blocks that represent the model's use of those tools. You can then run
those tools using the tool input generated by the model and then optionally
return results back to the model using tool_result content blocks.
There are two types of tools: client tools and server tools. The behavior described below applies to client tools. For server tools, see their individual documentation as each has its own behavior (e.g., the web search tool).
Each tool definition includes:
- name: Name of the tool.
- description: Optional, but strongly-recommended description of the tool.
- input_schema: JSON schema for the tool input shape that the model will produce in tool_use output content blocks.
For example, if you defined tools as:
[
{
"name": "get_stock_price",
"description": "Get the current stock price for a given ticker symbol.",
"input_schema": {
"type": "object",
"properties": {
"ticker": {
"type": "string",
"description": "The stock ticker symbol, e.g. AAPL for Apple Inc."
}
},
"required": ["ticker"]
}
}
]
And then asked the model "What's the S&P 500 at today?", the model might produce
tool_use content blocks in the response like this:
[
{
"type": "tool_use",
"id": "toolu_01D7FLrfh4GYq7yT1ULFeyMV",
"name": "get_stock_price",
"input": { "ticker": "^GSPC" }
}
]
You might then run your get_stock_price tool with {"ticker": "^GSPC"} as an
input, and return the following back to the model in a subsequent user
message:
[
{
"type": "tool_result",
"tool_use_id": "toolu_01D7FLrfh4GYq7yT1ULFeyMV",
"content": "259.75 USD"
}
]
Tools can be used for workflows that include running client-side tools and functions, or more generally whenever you want the model to produce a particular JSON structure of output.
See our guide for more details.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 343
optional :tools, -> { Anthropic::Internal::Type::ArrayOf[union: Anthropic::Beta::BetaToolUnion] }
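The get_stock_price definition above, expressed as Ruby hashes for the tools parameter (a sketch mirroring the JSON example):

tools = [
  {
    name: "get_stock_price",
    description: "Get the current stock price for a given ticker symbol.",
    input_schema: {
      type: "object",
      properties: {
        ticker: {
          type: "string",
          description: "The stock ticker symbol, e.g. AAPL for Apple Inc."
        }
      },
      required: ["ticker"]
    }
  }
]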
#top_k ⇒ Integer?
Only sample from the top K options for each subsequent token.
Used to remove "long tail" low probability responses. Learn more technical details here.
Recommended for advanced use cases only. You usually only need to use
temperature.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 355
optional :top_k, Integer
#top_p ⇒ Float?
Use nucleus sampling.
In nucleus sampling, we compute the cumulative distribution over all the options
for each subsequent token in decreasing probability order and cut it off once it
reaches a particular probability specified by top_p. You should either alter
temperature or top_p, but not both.
Recommended for advanced use cases only. You usually only need to use
temperature.
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 369
optional :top_p, Float
Class Method Details
.values ⇒ Array<Symbol>
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 443
.variants ⇒ Array(Anthropic::Models::Beta::BetaContainerParams, String)
# File 'lib/anthropic/models/beta/messages/batch_create_params.rb', line 426