Class: Aws::BedrockAgent::Types::PromptModelInferenceConfiguration

Inherits:
  Struct

Includes:
  Aws::Structure

Defined in:
  lib/aws-sdk-bedrockagent/types.rb

Overview

Contains inference configurations related to model inference for a prompt. For more information, see [Inference parameters][1].

[1]: https://docs.aws.amazon.com/bedrock/latest/userguide/inference-parameters.html
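
In practice this configuration is usually supplied as a plain hash on the :text member of a prompt variant's inference_configuration rather than constructed directly. A minimal sketch, assuming the prompt-management create_prompt operation; the region, prompt name, template text, and model ID below are placeholders:

  require 'aws-sdk-bedrockagent'

  # Minimal sketch: create a prompt whose variant carries a
  # PromptModelInferenceConfiguration as the :text member of
  # inference_configuration. Region, prompt name, template text,
  # and model ID are placeholder assumptions.
  client = Aws::BedrockAgent::Client.new(region: 'us-east-1')

  client.create_prompt(
    name: 'example-prompt',
    variants: [
      {
        name: 'variant-one',
        template_type: 'TEXT',
        template_configuration: {
          text: {
            text: 'Summarize the following text: {{input}}',
            input_variables: [{ name: 'input' }]
          }
        },
        model_id: 'anthropic.claude-3-haiku-20240307-v1:0',
        inference_configuration: {
          text: {                     # maps to PromptModelInferenceConfiguration
            temperature: 0.2,         # lower values give more predictable output
            top_p: 0.9,               # consider the top 90% most-likely candidates
            max_tokens: 512,          # cap on tokens returned in the response
            stop_sequences: ['###']   # stop generating after this sequence
          }
        }
      }
    ]
  )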

Constant Summary

SENSITIVE =
[]

Instance Attribute Summary

  #max_tokens ⇒ Integer
  #stop_sequences ⇒ Array<String>
  #temperature ⇒ Float
  #top_p ⇒ Float

Instance Attribute Details

#max_tokens ⇒ Integer

The maximum number of tokens to return in the response.



# File 'lib/aws-sdk-bedrockagent/types.rb', line 8605

class PromptModelInferenceConfiguration < Struct.new(
  :temperature,
  :top_p,
  :max_tokens,
  :stop_sequences)
  SENSITIVE = []
  include Aws::Structure
end
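
The struct shown above can also be built directly from a hash of member values, since Aws::Structure's initializer assigns each key to the matching member. A minimal sketch with placeholder values:

  require 'aws-sdk-bedrockagent'

  # Minimal sketch: construct the type directly (placeholder values).
  cfg = Aws::BedrockAgent::Types::PromptModelInferenceConfiguration.new(
    temperature: 0.2,
    top_p: 0.9,
    max_tokens: 512,
    stop_sequences: ['###']
  )

  cfg.max_tokens # => 512
  cfg.to_h       # => {temperature: 0.2, top_p: 0.9, max_tokens: 512, stop_sequences: ["###"]}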

#stop_sequences ⇒ Array<String>

A list of strings that define sequences after which the model will stop generating.

#temperature ⇒ Float

Controls the randomness of the response. Choose a lower value for more predictable outputs and a higher value for more surprising outputs.

#top_p ⇒ Float

The percentage of most-likely candidates that the model considers for the next token.
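
When a prompt is retrieved, these attributes can be read from the variant's inference configuration. A minimal sketch, assuming the get_prompt operation; the region and prompt identifier are placeholders:

  require 'aws-sdk-bedrockagent'

  # Minimal sketch: read the inference configuration back from an existing
  # prompt. The :text member of inference_configuration is a
  # PromptModelInferenceConfiguration.
  client = Aws::BedrockAgent::Client.new(region: 'us-east-1')
  resp   = client.get_prompt(prompt_identifier: 'PROMPT1234EX')
  cfg    = resp.variants.first.inference_configuration.text

  cfg.temperature    # e.g. 0.2
  cfg.top_p          # e.g. 0.9
  cfg.max_tokens     # e.g. 512
  cfg.stop_sequences # e.g. ["###"]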