Class: OllamaChat::FollowChat

Inherits:
Object
Includes:
Ollama, Ollama::Handlers::Concern, MessageFormat, Term::ANSIColor
Defined in:
lib/ollama_chat/follow_chat.rb

Instance Attribute Summary

Instance Method Summary

Methods included from MessageFormat

#message_type, #talk_annotate, #think_annotate

Constructor Details

#initialize(chat:, messages:, voice: nil, output: STDOUT) ⇒ OllamaChat::FollowChat

Initializes a new instance of OllamaChat::FollowChat.

Parameters:

  • chat (OllamaChat::Chat)

    The chat object, which represents the conversation context.

  • messages (#to_a)

    A collection of message objects, representing the conversation history.

  • voice (String) (defaults to: nil)

    (optional) The voice to speak with, if any.

  • output (IO) (defaults to: STDOUT)

    (optional) The output stream where terminal output should be printed. Defaults to STDOUT.



# File 'lib/ollama_chat/follow_chat.rb', line 16

def initialize(chat:, messages:, voice: nil, output: STDOUT)
  super(output:)
  @chat        = chat
  @output.sync = true
  @say         = voice ? Handlers::Say.new(voice:) : NOP
  @messages    = messages
  @user        = nil
end
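The `voice ? Handlers::Say.new(voice:) : NOP` assignment uses a null-object handler: when no voice is configured, a do-nothing stand-in is used so that `@say.call(response)` can always be invoked without a nil check. A minimal self-contained sketch of that pattern (the `NopHandler`, `SayHandler`, and `build_say_handler` names are illustrative, not the actual Ollama handler classes):

```ruby
# Illustrative stand-ins for the real handler classes.
class NopHandler
  # Silently ignores every response.
  def call(_response); end
end

class SayHandler
  def initialize(voice:)
    @voice = voice
  end

  def call(response)
    "speaking #{response} with #{@voice}"
  end
end

# Mirrors the `voice ? Handlers::Say.new(voice:) : NOP` idiom.
def build_say_handler(voice)
  voice ? SayHandler.new(voice: voice) : NopHandler.new
end
```

With this in place, calling code never has to branch on whether a voice was configured; the nil case simply does nothing.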

Instance Attribute Details

#messages ⇒ OllamaChat::MessageList&lt;Ollama::Message&gt; (readonly)

Returns the conversation history as a list of message objects.

Returns:

  • (OllamaChat::MessageList&lt;Ollama::Message&gt;)

# File 'lib/ollama_chat/follow_chat.rb', line 28

def messages
  @messages
end

Instance Method Details

#call(response) ⇒ OllamaChat::FollowChat

Invokes the chat flow based on the provided Ollama server response.

The response is expected to be a parsed JSON object containing information about the user input and the assistant’s response.

If the response indicates an assistant message, this method:

1. Ensures that an assistant response exists in the message history (if not already present).
2. Updates the last message with the new content and thinking (if applicable).
3. Displays the formatted terminal output for the user.
4. Outputs the voice response (if configured).

Regardless of whether an assistant message is present, this method also outputs evaluation statistics (if applicable).

Parameters:

  • response (Ollama::Response)

    The parsed JSON response from the Ollama server.

Returns:

  • (OllamaChat::FollowChat)

    Returns self, allowing method chaining.

# File 'lib/ollama_chat/follow_chat.rb', line 47

def call(response)
  debug_output(response)

  if response&.message&.role == 'assistant'
    ensure_assistant_response_exists
    update_last_message(response)
    display_formatted_terminal_output
    @say.call(response)
  end

  output_eval_stats(response)

  self
end
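The `response&.message&.role == 'assistant'` guard relies on Ruby's safe navigation operator: if a response carries no message (for example, a final stats-only chunk in a streamed reply), `&.` yields nil instead of raising, and the assistant branch is skipped while the statistics output still runs. A simplified sketch of that guard, using Structs as stand-ins for `Ollama::Response`:

```ruby
# Simplified stand-ins for the real response objects.
Message  = Struct.new(:role, :content, keyword_init: true)
Response = Struct.new(:message, keyword_init: true)

def assistant_message?(response)
  # &. short-circuits to nil when message (or response) is absent,
  # so no NoMethodError is raised for stats-only responses.
  response&.message&.role == 'assistant'
end
```

For example, `assistant_message?(Response.new(message: nil))` and `assistant_message?(nil)` are both falsy, while a response whose message has the `'assistant'` role passes the guard.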