Class: LastLLM::Providers::Anthropic
- Inherits: LastLLM::Provider
  - Object
  - LastLLM::Provider
  - LastLLM::Providers::Anthropic
- Defined in:
- lib/last_llm/providers/anthropic.rb
Overview
Anthropic provider implementation
Constant Summary
API Configuration:
- BASE_ENDPOINT = 'https://api.anthropic.com'
- DEFAULT_MODEL = 'claude-3-5-haiku-latest'
- API_VERSION = '2023-06-01'
LLM Default Parameters:
- DEFAULT_TEMPERATURE = 0.2
- DEFAULT_TOP_P = 0.8
- DEFAULT_MAX_TOKENS = 4096
- DEFAULT_MAX_TOKENS_OBJECT = 8192
Response Configuration:
- SUCCESS_STATUS = 200
Error Status Codes:
- UNAUTHORIZED_STATUS = 401
- BAD_REQUEST_STATUS = 400
Instance Attribute Summary
Attributes inherited from LastLLM::Provider
Class Method Summary
-
.execute_tool(tool, response) ⇒ Hash?
Execute a tool from an Anthropic response.
-
.format_tool(tool) ⇒ Hash
Format a tool for Anthropic tools format.
Instance Method Summary
- #generate_object(prompt, schema, options = {}) ⇒ Object
- #generate_text(prompt, options = {}) ⇒ Object
-
#initialize(config) ⇒ Anthropic
constructor
A new instance of Anthropic.
Methods inherited from LastLLM::Provider
Constructor Details
#initialize(config) ⇒ Anthropic
Returns a new instance of Anthropic.
# File 'lib/last_llm/providers/anthropic.rb', line 27

def initialize(config)
  super(:anthropic, config)
  @conn = connection(config[:base_url] || BASE_ENDPOINT)
  logger.debug("#{@name}: Initialized Anthropic provider with endpoint: #{config[:base_url] || BASE_ENDPOINT}")
end
Class Method Details
.execute_tool(tool, response) ⇒ Hash?
Execute a tool from an Anthropic response
# File 'lib/last_llm/providers/anthropic.rb', line 79

def self.execute_tool(tool, response)
  tool_use = response[:tool_use]
  return nil unless tool_use && tool_use[:name] == tool.name

  tool.call(tool_use[:input])
end
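The dispatch logic above can be sketched in isolation. The stub below is a hypothetical stand-in for a LastLLM tool (anything responding to `#name` and `#call`), not part of the library; it shows how the tool runs only when the response's `tool_use` block names it, and how a non-matching response yields `nil`.

```ruby
# Hypothetical stand-in for a LastLLM tool: responds to #name and #call.
StubTool = Struct.new(:name) do
  def call(input)
    { sum: input[:a] + input[:b] }
  end
end

# Mirrors Anthropic.execute_tool: run the tool only when the response's
# tool_use block names this tool; otherwise return nil.
def execute_tool(tool, response)
  tool_use = response[:tool_use]
  return nil unless tool_use && tool_use[:name] == tool.name

  tool.call(tool_use[:input])
end

calculator = StubTool.new('calculator')
matching = { tool_use: { name: 'calculator', input: { a: 2, b: 3 } } }
other    = { tool_use: { name: 'weather', input: {} } }

execute_tool(calculator, matching) # => {:sum=>5}
execute_tool(calculator, other)    # => nil
```

Returning `nil` rather than raising lets a caller probe several tools against one response and act on whichever one matched.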
.format_tool(tool) ⇒ Hash
Format a tool for Anthropic tools format
# File 'lib/last_llm/providers/anthropic.rb', line 67

def self.format_tool(tool)
  {
    name: tool.name,
    description: tool.description,
    input_schema: tool.parameters
  }
end
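A minimal sketch of the shape this produces, using a hypothetical tool object (the `ToolStub` struct and the weather schema below are illustrative, not part of the library). The notable detail is that Anthropic's Messages API expects the JSON Schema under the `input_schema` key:

```ruby
require 'json'

# Hypothetical stand-in for a LastLLM tool: anything responding to
# #name, #description, and #parameters (a JSON Schema hash) works here.
ToolStub = Struct.new(:name, :description, :parameters)

# Mirrors Anthropic.format_tool: map the tool onto the hash shape
# Anthropic's tools array expects.
def format_tool(tool)
  {
    name: tool.name,
    description: tool.description,
    input_schema: tool.parameters
  }
end

weather = ToolStub.new(
  'get_weather',
  'Look up current weather for a city',
  { type: 'object', properties: { city: { type: 'string' } }, required: ['city'] }
)

puts JSON.pretty_generate(format_tool(weather))
```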
Instance Method Details
#generate_object(prompt, schema, options = {}) ⇒ Object
# File 'lib/last_llm/providers/anthropic.rb', line 45

def generate_object(prompt, schema, options = {})
  model = get_model(options, DEFAULT_MODEL)
  logger.info("#{@name}: Generating object with model: #{model}")
  logger.debug("#{@name}: Object prompt: #{format_prompt_for_logging(prompt)}")

  options = options.dup
  system_prompt = 'You are a helpful assistant that responds with valid JSON.'
  formatted_prompt = LastLLM::StructuredOutput.format_prompt(prompt, schema)

  options[:system_prompt] = system_prompt
  options[:max_tokens] ||= DEFAULT_MAX_TOKENS_OBJECT

  make_request(formatted_prompt, options) do |result|
    content = result.dig(:content, 0, :text)
    logger.debug("#{@name}: Raw JSON response: #{content}")
    parse_json_response(content)
  end
end
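The extraction step inside the `make_request` block can be sketched on its own. The response hash below is a simplified stand-in for an Anthropic Messages API body (real responses carry extra fields such as `:id`, `:model`, and `:usage`), and plain `JSON.parse` stands in for the provider's `parse_json_response` helper:

```ruby
require 'json'

# Simplified response body shaped like Anthropic's Messages API output.
result = {
  content: [
    { type: 'text', text: '{"name":"Ada","age":36}' }
  ]
}

# The same extraction generate_object performs: dig out the first text
# block, then parse it as JSON.
content = result.dig(:content, 0, :text)
object = JSON.parse(content, symbolize_names: true)

object # => {:name=>"Ada", :age=>36}
```

Because `dig` returns `nil` on any missing key rather than raising, a malformed response surfaces as a JSON parse failure instead of a `NoMethodError` partway through the chain.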
#generate_text(prompt, options = {}) ⇒ Object
# File 'lib/last_llm/providers/anthropic.rb', line 33

def generate_text(prompt, options = {})
  model = get_model(options, DEFAULT_MODEL)
  logger.info("#{@name}: Generating text with model: #{model}")
  logger.debug("#{@name}: Text prompt: #{format_prompt_for_logging(prompt)}")

  make_request(prompt, options) do |result|
    response = result.dig(:content, 0, :text).to_s
    logger.debug("#{@name}: Generated response of #{response.length} characters")
    response
  end
end