Class: LlmService
- Inherits: Object
- Includes: LangsmithrbRails::TracedService
- Defined in: lib/generators/langsmithrb_rails/demo/templates/llm_service.rb
Overview
Service for interacting with LLM providers.
Instance Method Summary
- #generate(prompt, context = [], options = {}) ⇒ String
  Generate a response to a prompt.
- #initialize(provider = ENV.fetch("LLM_PROVIDER", "<%= options[:provider] %>")) ⇒ LlmService (constructor)
  Initialize the service.
Constructor Details
#initialize(provider = ENV.fetch("LLM_PROVIDER", "<%= options[:provider] %>")) ⇒ LlmService
Initialize the service.

```ruby
# File 'lib/generators/langsmithrb_rails/demo/templates/llm_service.rb', line 9

def initialize(provider = ENV.fetch("LLM_PROVIDER", "<%= options[:provider] %>"))
  @provider = provider.to_s.downcase
end
```
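The provider-selection logic in the constructor can be sketched in isolation. Note that `"<%= options[:provider] %>"` is an ERB placeholder filled in when the generator runs; the `"mock"` fallback below stands in for that generated default and is an assumption of this sketch, not part of the source.

```ruby
# Minimal sketch of how the constructor resolves the provider name.
# The env hash defaults to ENV; "mock" stands in for the generator-supplied
# default (an assumption for this sketch).
def resolve_provider(env = ENV)
  env.fetch("LLM_PROVIDER", "mock").to_s.downcase
end
```

Because the result is lowercased, values such as `"OpenAI"` and `"openai"` select the same branch in `#generate`.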
Instance Method Details
#generate(prompt, context = [], options = {}) ⇒ String
Generate a response to a prompt.

```ruby
# File 'lib/generators/langsmithrb_rails/demo/templates/llm_service.rb', line 18

def generate(prompt, context = [], options = {})
  # Create trace with LangSmith
  langsmith_trace("llm_generate", inputs: { prompt: prompt, context: context }) do |run|
    response = case @provider
               when "openai"
                 generate_with_openai(prompt, context, options)
               when "anthropic"
                 generate_with_anthropic(prompt, context, options)
               else
                 generate_with_mock(prompt, context, options)
               end

    # Record the output
    run.outputs = { response: response }
    response
  end
end
```
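The dispatch pattern above can be exercised standalone. This is a hedged sketch: the private `generate_with_*` helpers are not shown in the source, so they are stubbed here with placeholder strings, and the `langsmith_trace` wrapper (provided by `LangsmithrbRails::TracedService`) is omitted. `MiniLlmService` is a hypothetical name for illustration only.

```ruby
# Standalone sketch of the provider dispatch in #generate.
# The real service wraps this in langsmith_trace and calls actual
# provider APIs; here the helpers are stubs (an assumption).
class MiniLlmService
  def initialize(provider = "mock")
    @provider = provider.to_s.downcase
  end

  def generate(prompt, context = [], options = {})
    case @provider
    when "openai"
      generate_with_openai(prompt, context, options)
    when "anthropic"
      generate_with_anthropic(prompt, context, options)
    else
      # Unknown providers fall back to the mock implementation,
      # mirroring the else branch in the source.
      generate_with_mock(prompt, context, options)
    end
  end

  private

  # Stub helpers; the real ones would call each provider's API.
  def generate_with_openai(prompt, _context, _options)
    "[openai] #{prompt}"
  end

  def generate_with_anthropic(prompt, _context, _options)
    "[anthropic] #{prompt}"
  end

  def generate_with_mock(prompt, _context, _options)
    "[mock] #{prompt}"
  end
end
```

Usage: `MiniLlmService.new("openai").generate("Hello")` routes to the OpenAI stub, while an unrecognized provider name falls through to the mock branch.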