# RubyConversations

A Rails engine for managing AI conversations and storing them in a prompt studio. Built on top of RubyLLM for AI interactions.
## Features

- **Built-in Prompt Management**: Version-controlled prompts with placeholder validation
- **Conversation History**: Track and manage conversation threads
- **Input/Output Storage**: Structured storage for message inputs and responses
- **JWT Authentication**: Secure remote mode backed by a shared JWT secret
## Installation

Add these lines to your application's Gemfile:
```ruby
# Either
gem 'ruby_llm_community'
# Or
gem 'ruby_llm'

# Then
gem 'ruby_conversations'
```
And then execute:
```bash
bundle install
```
Or install it yourself as:
```bash
gem install ruby_conversations
```
## Configuration

Configure the engine in `config/initializers/ai_conversation_engine.rb`:
```ruby
RubyConversations.configure do |config|
  # API settings
  config.api_url = ENV['AI_CONVERSATION_API_URL']
  config.jwt_secret = ENV['AI_CONVERSATION_JWT_SECRET']

  # Default LLM settings
  config.default_llm_model = 'gpt-4'
end
```
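
If you prefer Rails encrypted credentials over environment variables, the same settings can be populated from `Rails.application.credentials`. This is a minimal sketch; the `:ai_conversation` credential keys are illustrative placeholders, not part of the gem.

```ruby
# Sketch: populating the same settings from Rails encrypted credentials.
# The :ai_conversation credential keys are illustrative placeholders.
RubyConversations.configure do |config|
  config.api_url    = Rails.application.credentials.dig(:ai_conversation, :api_url)
  config.jwt_secret = Rails.application.credentials.dig(:ai_conversation, :jwt_secret)
  config.default_llm_model = 'gpt-4'
end
```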
## Usage

### Basic Conversation
```ruby
# Create a new conversation
conversation = RubyConversations::Conversation.new

# Ask a question using a predefined prompt
conversation.with_prompt("explain_code", inputs: { code: "def hello; end" })
result = conversation.call_llm

puts result[:content]
```
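
The same conversation object can also be asked a follow-up question. The sketch below assumes `with_prompt` and `call_llm` can be invoked repeatedly on an existing conversation, in line with the conversation-history feature described above; treat it as an illustration rather than documented behavior.

```ruby
# Sketch: a second turn on the same conversation. Assumes with_prompt /
# call_llm may be called again on an existing conversation object.
conversation.with_prompt("explain_code", inputs: { code: "def goodbye; end" })
follow_up = conversation.call_llm

puts follow_up[:content]
```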
## Development

After checking out the repo, run `bin/setup` to install dependencies. Then run `rake spec` to run the tests. You can also run `bin/console` for an interactive prompt that lets you experiment.

To install this gem onto your local machine, run `bundle exec rake install`. To release a new version, update the version number in `version.rb` and run `bundle exec rake release`, which will create a git tag for the version, push git commits and the created tag, and push the `.gem` file to rubygems.org.