Class: LLM::LlamaCpp
- Inherits: OpenAI
- Defined in: lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb
Overview
The LlamaCpp class implements a provider for [llama.cpp](https://github.com/ggml-org/llama.cpp) through the OpenAI-compatible API served by the llama-server binary. Like the ollama provider, it supports a wide range of models and is straightforward to run on your own hardware.
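The sketch below shows how the provider might be used once a llama-server instance is listening on localhost:8080. The constructor keywords and #complete come from the API documented on this page; the require path and the response accessors are assumptions for illustration.

require "llm"

# Point the provider at a running llama-server instance
# (assumes llama-server is already listening on localhost:8080).
llm = LLM::LlamaCpp.new(host: "localhost", port: 8080, ssl: false)

# #complete is inherited from the OpenAI provider; passing a bare prompt
# string and reading the first choice are assumptions for illustration.
response = llm.complete("Say hello in one sentence.")
puts response.choices.first.content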
Constant Summary
Constants inherited from OpenAI
OpenAI::HOST
Instance Method Summary
Methods inherited from OpenAI
#assistant_role, #complete, #embed, #models, #server_tools, #web_search
#format
Methods inherited from Provider
#assistant_role, #chat, clients, #complete, #embed, #inspect, #models, #respond, #schema, #server_tool, #server_tools, #web_search, #with
Constructor Details
#initialize(host: "localhost", port: 8080, ssl: false) ⇒ LLM::LlamaCpp
# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 26

def initialize(host: "localhost", port: 8080, ssl: false, **)
  super
end
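The keyword arguments can be overridden to reach a llama-server instance elsewhere; the hostname and port below are hypothetical placeholders.

# Connect to a remote llama-server over TLS.
# "llama.internal.example" and 8443 are placeholder values.
llm = LLM::LlamaCpp.new(host: "llama.internal.example", port: 8443, ssl: true)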
Instance Method Details
#audio ⇒ Object

# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 44

def audio
  raise NotImplementedError
end
#default_model ⇒ String
Returns the default model for chat completions.

# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 70

def default_model
  "qwen3"
end
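When no model is passed explicitly, chat completions fall back to this value. A minimal sketch; the model: keyword is an assumption borrowed from the inherited OpenAI #complete method.

llm = LLM::LlamaCpp.new(host: "localhost", port: 8080, ssl: false)
llm.default_model  # => "qwen3"

# Roughly equivalent to llm.complete("Hi") with no model given
# (the model: keyword here is an assumption for illustration).
llm.complete("Hi", model: llm.default_model)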
#files ⇒ Object

# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 32

def files
  raise NotImplementedError
end
#images ⇒ Object

# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 38

def images
  raise NotImplementedError
end
#moderations ⇒ Object
# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 50

def moderations
  raise NotImplementedError
end
#responses ⇒ Object
# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 56

def responses
  raise NotImplementedError
end
#vector_stores ⇒ Object
# File 'lib/llm/shell/internal/llm.rb/lib/llm/providers/llamacpp.rb', line 62

def vector_stores
  raise NotImplementedError
end
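Because the llama.cpp provider does not implement these endpoints, code that works across providers may want to guard such calls; a minimal sketch:

llm = LLM::LlamaCpp.new(host: "localhost", port: 8080, ssl: false)

begin
  llm.images
rescue NotImplementedError
  warn "image generation is not supported by the llama.cpp provider"
end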