Class: OllamaChat::FollowChat

Inherits:
Object
Includes:
Ollama, Ollama::Handlers::Concern, MessageFormat, Term::ANSIColor
Defined in:
lib/ollama_chat/follow_chat.rb

Overview

A class that handles chat responses and manages the flow of conversation between the user and Ollama models.

This class is responsible for processing Ollama API responses, updating message history, displaying formatted output to the terminal, and managing voice synthesis for spoken responses. It acts as a handler for streaming responses and ensures proper formatting and display of both regular content and thinking annotations.

Examples:

Processing a chat response

follow_chat = OllamaChat::FollowChat.new(chat: chat_instance, messages: message_list)
follow_chat.call(response)

Instance Attribute Summary

Instance Method Summary

Methods included from MessageFormat

#message_type, #talk_annotate, #think_annotate

Constructor Details

#initialize(chat:, messages:, voice: nil, output: STDOUT) ⇒ OllamaChat::FollowChat

Initializes a new instance of OllamaChat::FollowChat.

Parameters:

  • chat (OllamaChat::Chat)

    The chat object, which represents the conversation context.

  • messages (#to_a)

    A collection of message objects, representing the conversation history.

  • voice (String) (defaults to: nil)

    (optional) The voice to speak with, if any.

  • output (IO) (defaults to: STDOUT)

    (optional) The output stream where terminal output should be printed. Defaults to STDOUT.



# File 'lib/ollama_chat/follow_chat.rb', line 30

def initialize(chat:, messages:, voice: nil, output: STDOUT)
  super(output:)
  @chat        = chat
  @output.sync = true
  @say         = voice ? Handlers::Say.new(voice:) : NOP
  @messages    = messages
  @user        = nil
end
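The `voice ? Handlers::Say.new(voice:) : NOP` line in the constructor is a null-object pattern: when no voice is configured, a no-op handler absorbs `call` invocations so the rest of the class never has to check for `nil`. The following is a minimal, self-contained sketch of that idea; `NopHandler` and `SayHandler` are hypothetical stand-ins for Ollama's `NOP` and `Handlers::Say`, not the library's actual classes.

```ruby
# Null-object stand-in: accepts and ignores every response.
class NopHandler
  def call(_response)
    self
  end
end

# Simplified stand-in for a speech handler bound to a voice.
class SayHandler
  def initialize(voice:)
    @voice = voice
  end

  def call(response)
    "#{@voice} speaks: #{response}"
  end
end

# Mirrors the constructor's conditional: a nil voice yields the no-op.
voice = nil
say = voice ? SayHandler.new(voice: voice) : NopHandler.new
say.call("Hello")   # safely does nothing when no voice is configured
```

Because `@say` always responds to `call`, the `#call` method below can invoke `@say.call(response)` unconditionally.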

Instance Attribute Details

#messages ⇒ OllamaChat::MessageList<Ollama::Message> (readonly)

Returns the conversation history (an array of message objects).

Returns:

  • (OllamaChat::MessageList<Ollama::Message>)

    The messages in the conversation.



# File 'lib/ollama_chat/follow_chat.rb', line 43

def messages
  @messages
end

Instance Method Details

#call(response) ⇒ OllamaChat::FollowChat

Invokes the chat flow based on the provided Ollama server response.

The response is expected to be a parsed JSON object containing information about the user input and the assistant’s response.

If the response indicates an assistant message, this method:

1. Ensures that an assistant response exists in the message history (if
   not already present).
2. Updates the last message with the new content and thinking (if
   applicable).
3. Displays the formatted terminal output for the user.
4. Outputs the voice response (if configured).

Regardless of whether an assistant message is present, this method also outputs evaluation statistics (if applicable).

Parameters:

  • response (Ollama::Response)

    The parsed JSON response from the Ollama server.

Returns:

  • (OllamaChat::FollowChat)

    Returns self, to allow method chaining.



# File 'lib/ollama_chat/follow_chat.rb', line 65

def call(response)
  debug_output(response)

  if response&.message&.role == 'assistant'
    ensure_assistant_response_exists
    update_last_message(response)
    display_formatted_terminal_output
    @say.call(response)
  end

  output_eval_stats(response)

  self
end
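The guard `response&.message&.role == 'assistant'` and the trailing `self` drive the method's behavior: non-assistant responses skip the history update, and the return value lets handlers be chained. The following is a minimal sketch of that flow under stated assumptions; `MiniFollowChat`, `Message`, and `Response` are hypothetical stand-ins, not the real `FollowChat`, which additionally formats terminal output and speaks responses.

```ruby
# Hypothetical stand-ins for the parsed server response.
Message  = Struct.new(:role, :content)
Response = Struct.new(:message)

class MiniFollowChat
  attr_reader :messages

  def initialize
    @messages = []
  end

  def call(response)
    # Only assistant messages update the conversation history;
    # the safe-navigation chain also tolerates a nil response.
    if response&.message&.role == 'assistant'
      @messages << response.message.content
    end
    self   # returned for chaining, as in FollowChat#call
  end
end

fc = MiniFollowChat.new
fc.call(Response.new(Message.new('assistant', 'Hi!')))
  .call(Response.new(Message.new('system', 'ignored')))
fc.messages  # => ["Hi!"]
```

Returning `self` is what makes streaming-handler pipelines composable: each chunk of the response can be fed through the same handler object without intermediate bookkeeping.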