RubyLLM::Sequel
Sequel ORM integration for RubyLLM, providing the same ActiveRecord-style acts_as_* methods for Sequel models to interact with LLM providers.
This gem enables your Sequel models to serve as:
- Chat interfaces - Direct LLM conversations with persistent message history
- Message storage - User, assistant, system, and tool messages
- Tool call tracking - Function/tool invocations and results
- Model registry - Track available models, pricing, and capabilities
Installation
Add this line to your application's Gemfile:
gem 'ruby_llm-sequel'
And then execute:
bundle install
Or install it yourself as:
gem install ruby_llm-sequel
Usage
Setup
First, require the gem in your application:
require 'ruby_llm/sequel'
Database Schema
Your database needs tables for models, chats, messages, and tool calls. Here's an example schema:
DB.create_table :models do
  primary_key :id
  String :model_id, null: false
  String :name, null: false
  String :provider, null: false
  String :family
  Integer :context_window
  Integer :max_output_tokens
  String :capabilities, text: true
  String :pricing, text: true
  # or, on PostgreSQL, use jsonb instead of the String columns above:
  # jsonb :capabilities
  # jsonb :pricing
  DateTime :created_at
  DateTime :updated_at
  index [:provider, :model_id], unique: true
end
DB.create_table :chats do
  primary_key :id
  foreign_key :model_id, :models
  TrueClass :active, default: true
  DateTime :created_at
  DateTime :updated_at
end
DB.create_table :messages do
  primary_key :id
  foreign_key :chat_id, :chats, null: false
  String :role, null: false
  String :content, text: true
  Integer :input_tokens
  Integer :output_tokens
  String :content_raw, text: true
  # or, on PostgreSQL, use jsonb instead of the String column above:
  # jsonb :content_raw
  foreign_key :model_id, :models
  # tool_calls is created below, so the REFERENCES clause is omitted
  # here to avoid a circular dependency between the two tables
  foreign_key :tool_call_id
  DateTime :created_at
  DateTime :updated_at
end
DB.create_table :tool_calls do
  primary_key :id
  foreign_key :message_id, :messages, null: false
  String :tool_call_id, null: false, unique: true
  String :name, null: false
  String :arguments, text: true
  DateTime :created_at
  DateTime :updated_at
end
Model Setup
Define your Sequel models with the RubyLLM plugin and appropriate acts_as_* methods:
class Model < Sequel::Model
  plugin ::Sequel::Plugins::RubyLLM
  acts_as_model
end
class Chat < Sequel::Model
  plugin ::Sequel::Plugins::RubyLLM
  acts_as_chat(model: :llm_model, model_class: 'Model')
end
class Message < Sequel::Model
  plugin ::Sequel::Plugins::RubyLLM
  acts_as_message(model: :llm_model, model_class: 'Model')
end
class ToolCall < Sequel::Model
  plugin ::Sequel::Plugins::RubyLLM
  acts_as_tool_call
end
Basic Chat Usage
# Configure RubyLLM
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
  # For custom Model class names (defaults to 'Model')
  # config.model_registry_class = 'AIModel'
end
# Opt-in behaviour to use models registry from database
RubyLLM::Sequel.use_model_registry!
# Create a chat
chat = Chat.create

# Set instructions
chat.with_instructions("You are a helpful assistant")

# Send a message and get a response
response = chat.ask("Hello!") # Calls the LLM and stores the response
puts response.content
# => "Hello! How can I help you today?"
# Access message history
chat.messages.each do |message|
  puts "#{message.role}: #{message.content}"
end
Tool Calls
# Define a tool
class Weather < RubyLLM::Tool
  description "Gets current weather for a location"

  # The params DSL is only available in v1.9+; older versions should
  # use the param helper instead.
  params do
    string :latitude, description: "Latitude (e.g., 52.5200)"
    string :longitude, description: "Longitude (e.g., 13.4050)"
  end

  def execute(latitude:, longitude:)
    url = "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}&current=temperature_2m,wind_speed_10m"
    response = Faraday.get(url)
    JSON.parse(response.body)
  rescue => e
    { error: e.message }
  end
end
# Use the tool in a chat
chat.with_tool(Weather.new)
response = chat.ask("What's the weather in San Francisco?")
# Tool calls are automatically tracked
response.tool_calls.each do |tool_call|
  puts "Called: #{tool_call.name}"
  puts "Arguments: #{tool_call.arguments}"
end
Model Registry
# Refresh model information from providers
Model.refresh! # Fetches latest models from all configured providers
# Query models
gpt4 = Model.first(provider: 'openai', model_id: 'gpt-4')
# Check capabilities
gpt4.function_calling? # => true
gpt4.streaming? # => true
gpt4.supports?('vision') # => false
# Get pricing
gpt4.input_price_per_million # => 30.0
gpt4.output_price_per_million # => 60.0
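As a sketch of how these pricing accessors combine with the token counts stored on messages (estimate_cost is a hypothetical helper for illustration, not part of the gem):

```ruby
# Hypothetical helper (not part of the gem): estimates the dollar cost
# of one chat turn from token counts and per-million-token prices.
def estimate_cost(input_tokens:, output_tokens:, input_price:, output_price:)
  (input_tokens * input_price + output_tokens * output_price) / 1_000_000.0
end

# 1,000 input tokens and 500 output tokens at the prices shown above:
estimate_cost(input_tokens: 1_000, output_tokens: 500,
              input_price: 30.0, output_price: 60.0)
# => 0.06
```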
Important Notes
Database Compatibility
The gem supports both PostgreSQL (with jsonb) and other databases (JSON as text):
- PostgreSQL: uses jsonb columns automatically
- SQLite/MySQL: stores JSON as text strings
- JSON parsing/serialization is handled transparently
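On databases without jsonb, the transparent handling boils down to a JSON round-trip through the text column, roughly like this (a sketch of the idea, not the gem's actual internals):

```ruby
require 'json'

# Conceptually what happens on SQLite/MySQL: serialize on write,
# parse on read.
capabilities = { "function_calling" => true, "streaming" => true }
stored = JSON.generate(capabilities)   # written to the String/text column
loaded = JSON.parse(stored)            # returned when the column is read

loaded["function_calling"] # => true
```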
The model association
The Sequel gem already defines a model instance method on subclasses of Sequel::Model (it returns the model class itself), so it is recommended to use a different association name, such as llm_model, to avoid the name conflict.
Development
After checking out the repo, run bin/setup to install dependencies. Then, run rake test to run the tests. You can also run bin/console for an interactive prompt that will allow you to experiment.
Running Tests
The test suite uses Minitest and runs against an in-memory SQLite database. To run the tests:
bundle exec rake test
To run a specific test file:
ruby -Ilib:spec spec/ruby_llm/sequel/chat_methods_spec.rb
Test Configuration
The tests use:
- Minitest for the test framework
- SQLite in-memory database for fast test execution
- SimpleCov for code coverage reporting
- Transaction rollback to isolate tests and maintain a clean database state
All test configuration is in spec/spec_helper.rb.
Contributing to Tests
When adding new features, please include corresponding tests. Test files are located in spec/ruby_llm/sequel/ and follow the naming convention *_methods_spec.rb.
To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and the created tag, and push the .gem file to rubygems.org.
Contributing
Bug reports and pull requests are welcome on GitHub at https://github.com/[USERNAME]/ruby_llm-sequel.
License
The gem is available as open source under the terms of the MIT License.