Class: AsktiveRecord::LlmService
Inherits: Object
Defined in: lib/asktive_record/llm_service.rb
Overview
Service class for interacting with the LLM API to generate SQL queries and answer questions based on the generated queries and database responses.
Instance Attribute Summary
- #configuration ⇒ Object (readonly)
  Returns the value of attribute configuration.
Instance Method Summary
- #answer(question, query, response) ⇒ Object
- #generate_sql(natural_language_query, schema_string, table_name) ⇒ Object
  Original method for model-specific queries.
- #generate_sql_for_service(natural_language_query, schema_string, _target_table = "any") ⇒ Object
  New method for service-class-based queries that can target any table.
- #initialize(configuration) ⇒ LlmService (constructor)
  A new instance of LlmService.
- #upload_schema(_schema_string) ⇒ Object
  Placeholder for schema upload or management with the LLM, for more advanced scenarios such as the OpenAI Assistants API or fine-tuning.
Constructor Details
#initialize(configuration) ⇒ LlmService
Returns a new instance of LlmService.
    # File 'lib/asktive_record/llm_service.rb', line 12

    def initialize(configuration)
      @configuration = configuration
      return if @configuration&.llm_api_key

      raise ConfigurationError,
            "LLM API key is not configured. Please set it in config/initializers/asktive_record.rb or via environment variable."
    end
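The constructor fails fast when no API key is present. That guard can be sketched in isolation; the snippet below is a minimal, self-contained version, using a `Struct` as a hypothetical stand-in for `AsktiveRecord::Configuration` and `ArgumentError` in place of the gem's `ConfigurationError`:

```ruby
# Hypothetical stand-in for AsktiveRecord::Configuration; the real class
# lives in the gem and carries more settings than just the key.
Config = Struct.new(:llm_api_key)

# Mirrors the guard in LlmService#initialize: do nothing when a key is
# present, raise otherwise.
def check_api_key!(configuration)
  return if configuration&.llm_api_key

  raise ArgumentError, "LLM API key is not configured"
end

check_api_key!(Config.new("sk-test")) # no-op when a key is set
begin
  check_api_key!(Config.new(nil))
rescue ArgumentError => e
  puts e.message # => "LLM API key is not configured"
end
```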
Instance Attribute Details
#configuration ⇒ Object (readonly)
Returns the value of attribute configuration.
    # File 'lib/asktive_record/llm_service.rb', line 10

    def configuration
      @configuration
    end
Instance Method Details
#answer(question, query, response) ⇒ Object
    # File 'lib/asktive_record/llm_service.rb', line 30

    def answer(question, query, response)
      puts "Answering question: #{question}"
      puts "Generated SQL query: #{query}"
      puts "Response from database: #{response.inspect}"
      answer_as_human(question, query, response)
    end
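#answer logs its inputs and then delegates to answer_as_human, which hands the question, the generated SQL, and the raw database rows back to the LLM for a natural-language reply. A stubbed sketch of that round trip (the prompt shape and the lambda LLM are hypothetical; the gem's real call goes through the OpenAI client):

```ruby
# Hypothetical reimplementation of the answer step with the LLM stubbed
# out as a lambda, so no network call is made.
def answer_with(question, query, rows, llm:)
  prompt = "Question: #{question}\n" \
           "SQL: #{query}\n" \
           "Rows: #{rows.inspect}"
  llm.call(prompt)
end

# Stand-in for the LLM: a real client would summarize the rows itself.
fake_llm = ->(_prompt) { "One user signed up today: Ada." }

reply = answer_with(
  "Who signed up today?",
  "SELECT name FROM users WHERE created_at >= CURRENT_DATE",
  [{ name: "Ada" }],
  llm: fake_llm
)
puts reply # => "One user signed up today: Ada."
```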
#generate_sql(natural_language_query, schema_string, table_name) ⇒ Object
Original method for model-specific queries.

    # File 'lib/asktive_record/llm_service.rb', line 38

    def generate_sql(natural_language_query, schema_string, table_name)
      client = OpenAI::Client.new(access_token: configuration.llm_api_key)
      prompt = Prompt.as_sql_generator_for_model(
        natural_language_query,
        schema_string,
        table_name
      )
      generate_and_validate_sql(client, prompt)
    end
#generate_sql_for_service(natural_language_query, schema_string, _target_table = "any") ⇒ Object
New method for service-class-based queries that can target any table.

    # File 'lib/asktive_record/llm_service.rb', line 51

    def generate_sql_for_service(natural_language_query, schema_string, _target_table = "any")
      client = OpenAI::Client.new(access_token: configuration.llm_api_key)
      prompt = Prompt.as_sql_generator(natural_language_query, schema_string)
      generate_and_validate_sql(client, prompt)
    end
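The practical difference between the two generators is prompt scope: #generate_sql builds a prompt pinned to a single model's table (via Prompt.as_sql_generator_for_model), while #generate_sql_for_service builds one over the whole schema (via Prompt.as_sql_generator). The gem's real templates live in the Prompt class; the strings below are hypothetical illustrations of that distinction:

```ruby
# Illustrative prompt builders; the real templates are in AsktiveRecord's
# Prompt class and are more elaborate than this.
def model_scoped_prompt(question, schema, table_name)
  "Answer using ONLY the #{table_name} table.\nSchema:\n#{schema}\nQuestion: #{question}"
end

def schema_wide_prompt(question, schema)
  "Answer using ANY table in the schema.\nSchema:\n#{schema}\nQuestion: #{question}"
end

schema = "CREATE TABLE users (id INT, name TEXT);\n" \
         "CREATE TABLE orders (id INT, user_id INT);"

model_scoped = model_scoped_prompt("How many users?", schema, "users")
schema_wide  = schema_wide_prompt("How many orders per user?", schema)
```

A model-scoped prompt keeps the LLM from reaching for other tables, while the schema-wide variant lets it generate joins across any of them.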
#upload_schema(_schema_string) ⇒ Object
Placeholder for schema upload or management with the LLM, for more advanced scenarios such as the OpenAI Assistants API or fine-tuning. For now, the schema is passed with each query.

    # File 'lib/asktive_record/llm_service.rb', line 24

    def upload_schema(_schema_string)
      # This could be used to upload schema to a vector store or a fine-tuning dataset in the future.
      puts "Schema upload functionality is a placeholder for now."
      true
    end