Class: Langchain::LLM::AI21
Overview
Constant Summary collapse
- DEFAULTS =
{ temperature: 0.0, model: "j2-ultra" }.freeze
- LENGTH_VALIDATOR =
Langchain::Utils::TokenLength::AI21Validator
Instance Attribute Summary
Attributes inherited from Base
Instance Method Summary collapse
-
#complete(prompt:, **params) ⇒ Langchain::LLM::AI21Response
Generate a completion for a given prompt.
-
#initialize(api_key:, default_options: {}) ⇒ AI21
constructor
A new instance of AI21.
-
#summarize(text:, **params) ⇒ String
Generate a summary for a given text.
Methods inherited from Base
#chat, #default_dimensions, #embed
Methods included from DependencyHelper
Constructor Details
Instance Method Details
#complete(prompt:, **params) ⇒ Langchain::LLM::AI21Response
Generate a completion for a given prompt.
# File 'lib/langchain/llm/ai21.rb', line 35

def complete(prompt:, **params)
  parameters = complete_parameters params
  parameters[:maxTokens] = LENGTH_VALIDATOR.validate_max_tokens!(prompt, parameters[:model], {llm: client})

  response = client.complete(prompt, parameters)
  Langchain::LLM::AI21Response.new response, model: parameters[:model]
end
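The flow of #complete can be sketched with a stand-in for the AI21 SDK client. StubClient and the simplified complete_parameters below are assumptions for illustration, not the library's actual internals; the real method also runs the token-length validator before calling the API.

```ruby
DEFAULTS = { temperature: 0.0, model: "j2-ultra" }.freeze

# Stand-in for the AI21 SDK client; mimics the assumed
# complete(prompt, parameters) call shape and response hash.
class StubClient
  def complete(prompt, parameters)
    { completions: [{ data: { text: "stubbed completion for: #{prompt}" } }] }
  end
end

# Simplified version of the (private) parameter-merging helper:
# per-call params override the class-level DEFAULTS.
def complete_parameters(params)
  DEFAULTS.merge(params)
end

client = StubClient.new
parameters = complete_parameters(temperature: 0.7)
response = client.complete("Say hello", parameters)
text = response.dig(:completions, 0, :data, :text)

puts parameters.inspect
puts text
```

In the real class the raw response hash is wrapped in a Langchain::LLM::AI21Response rather than returned directly.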
#summarize(text:, **params) ⇒ String
Generate a summary for a given text.
# File 'lib/langchain/llm/ai21.rb', line 51

def summarize(text:, **params)
  response = client.summarize(text, "TEXT", params)
  response.dig(:summary)
  # Should we update this to also return a Langchain::LLM::AI21Response?
end
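The flow of #summarize can be sketched the same way with a stand-in client (StubClient and the summarize_sketch wrapper are assumptions for illustration). Note that, unlike #complete, this method returns only the :summary string dug out of the raw response hash, not a response object.

```ruby
# Stand-in for the AI21 SDK client; mimics the assumed
# summarize(text, type, params) call shape and response hash.
class StubClient
  def summarize(text, type, params)
    { id: "123", summary: "Ruby is a dynamic language." }
  end
end

# Illustrative wrapper mirroring the documented method body.
def summarize_sketch(client, text:, **params)
  response = client.summarize(text, "TEXT", params)
  response.dig(:summary)
end

summary = summarize_sketch(
  StubClient.new,
  text: "Ruby is a dynamic, open source programming language with a focus on simplicity."
)
puts summary
```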