Class: OllamaChat::FollowChat
- Inherits: Object
- Includes: Ollama, Ollama::Handlers::Concern, MessageFormat, Term::ANSIColor
- Defined in: lib/ollama_chat/follow_chat.rb
Overview
A class that handles chat responses and manages the flow of conversation between the user and Ollama models.
This class is responsible for processing Ollama API responses, updating message history, displaying formatted output to the terminal, and managing voice synthesis for spoken responses. It acts as a handler for streaming responses and ensures proper formatting and display of both regular content and thinking annotations.
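The streaming-handler contract described above can be illustrated with a minimal, self-contained sketch: a handler receives each parsed response via #call and returns self so calls can be chained. EchoHandler below is a hypothetical stand-in for illustration, not part of ollama_chat.

```ruby
# Minimal sketch of the streaming-handler pattern FollowChat follows.
# EchoHandler is a hypothetical stand-in; it is not the gem's code.
class EchoHandler
  attr_reader :messages

  def initialize(messages: [])
    @messages = messages
  end

  # Called once per streamed response chunk; returns self for chaining.
  def call(response)
    message = response[:message] || {}
    if message[:role] == 'assistant'
      if @messages.last && @messages.last[:role] == 'assistant'
        # Grow the existing assistant message with the new chunk.
        @messages.last[:content] << message[:content]
      else
        # Start a new assistant message in the history.
        @messages << { role: 'assistant', content: message[:content].dup }
      end
    end
    self
  end
end

handler = EchoHandler.new
handler.call(message: { role: 'assistant', content: 'Hel' })
       .call(message: { role: 'assistant', content: 'lo' })
p handler.messages.last[:content]  # => "Hello"
```

Returning self from #call is what allows a handler to be passed directly wherever the Ollama client expects a callable and to be invoked repeatedly over a stream.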
Instance Attribute Summary
-
#messages ⇒ OllamaChat::MessageList<Ollama::Message>
readonly
Returns the conversation history (an array of message objects).
Instance Method Summary
-
#call(response) ⇒ OllamaChat::FollowChat
Invokes the chat flow based on the provided Ollama server response.
-
#initialize(chat:, messages:, voice: nil, output: STDOUT) ⇒ OllamaChat::FollowChat
constructor
Initializes a new instance of OllamaChat::FollowChat.
Methods included from MessageFormat
#message_type, #talk_annotate, #think_annotate
Constructor Details
#initialize(chat:, messages:, voice: nil, output: STDOUT) ⇒ OllamaChat::FollowChat
Initializes a new instance of OllamaChat::FollowChat.
Parameters:
- chat — the chat object that provides the conversation context
- messages — the message list holding the conversation history
- voice — the voice to use for spoken responses (nil disables speech)
- output — the IO to which output should be printed. Defaults to STDOUT.
# File 'lib/ollama_chat/follow_chat.rb', line 30

def initialize(chat:, messages:, voice: nil, output: STDOUT)
  super(output:)
  @chat        = chat
  @output.sync = true
  @say         = voice ? Handlers::Say.new(voice:) : NOP
  @messages    = messages
  @user        = nil
end
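The voice handling in the constructor follows a null-object style: when no voice is configured, @say becomes NOP, an object whose #call does nothing, so the rest of the code can invoke it unconditionally. A hedged sketch of that idea (Speaker, NullSpeaker, and build_say are illustrative names, not the gem's):

```ruby
# Hypothetical sketch of the null-object pattern used for voice output:
# with no voice configured, the "say" object accepts #call but does
# nothing, so callers never need a nil check.
class Speaker
  def initialize(voice:)
    @voice = voice
  end

  def call(response)
    "speaking with #{@voice}: #{response}"
  end
end

class NullSpeaker
  # Same interface, no effect.
  def call(_response)
    nil
  end
end

def build_say(voice)
  voice ? Speaker.new(voice:) : NullSpeaker.new
end

build_say('Samantha').call('Hi')  # => "speaking with Samantha: Hi"
build_say(nil).call('Hi')        # => nil
```

The design choice keeps conditional logic out of the hot path: #call sites stay branch-free, and the decision about whether speech happens is made once, at construction time.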
Instance Attribute Details
#messages ⇒ OllamaChat::MessageList<Ollama::Message> (readonly)
Returns the conversation history (an array of message objects).
Returns: the list of messages exchanged during the conversation.
# File 'lib/ollama_chat/follow_chat.rb', line 43

def messages
  @messages
end
Instance Method Details
#call(response) ⇒ OllamaChat::FollowChat
Invokes the chat flow based on the provided Ollama server response.
The response is expected to be a parsed JSON object containing information about the user input and the assistant’s response.
If the response indicates an assistant message, this method:
1. Ensures that an assistant response exists in the message history (if not already present).
2. Updates the last message with the new content and thinking (if applicable).
3. Displays the formatted terminal output for the user.
4. Outputs the voice response (if configured).
Regardless of whether an assistant message is present, this method also outputs evaluation statistics (if applicable).
Parameters:
- response — the parsed response object received from the Ollama server

Returns: self, allowing handler calls to be chained.
# File 'lib/ollama_chat/follow_chat.rb', line 65

def call(response)
  debug_output(response)

  if response&.message&.role == 'assistant'
    ensure_assistant_response_exists(response)
    display_formatted_terminal_output
    @say.call(response)
  end

  output_eval_stats(response)

  self
end
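Put together, the flow of #call over a stream of chunks can be sketched in plain Ruby. The Chunk and Msg shapes below are assumptions for illustration only; real Ollama responses carry more fields, and the lambda is a stand-in for the handler, not the gem's implementation.

```ruby
# Hypothetical sketch of the #call flow over a stream of chunks:
# accumulate assistant content, then report stats on the final chunk.
Msg   = Struct.new(:role, :content, keyword_init: true)
Chunk = Struct.new(:message, :done, :eval_count, keyword_init: true)

history = []

follow = lambda do |response|
  if response.message&.role == 'assistant'
    # Start an assistant entry unless the last one is still being built.
    history << { role: 'assistant', content: +'' } unless history.last&.fetch(:role, nil) == 'assistant'
    history.last[:content] << response.message.content
  end
  # Evaluation statistics arrive with the final chunk of the stream.
  puts "eval_count: #{response.eval_count}" if response.done
end

chunks = [
  Chunk.new(message: Msg.new(role: 'assistant', content: 'Hi '), done: false),
  Chunk.new(message: Msg.new(role: 'assistant', content: 'there'), done: true, eval_count: 7),
]
chunks.each { |c| follow.call(c) }
p history.first[:content]  # => "Hi there"
```

This mirrors the documented steps: an assistant entry is ensured once, each chunk extends the last message, and statistics are emitted only when the server marks the response as done.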