Class: Langchain::LLM::HuggingFace
- Defined in:
- lib/langchain/llm/hugging_face.rb
Overview
Wrapper around the HuggingFace Inference API: huggingface.co/inference-api
Gem requirements:
gem "hugging-face", "~> 0.3.4"
Usage:
hf = Langchain::LLM::HuggingFace.new(api_key: ENV["HUGGING_FACE_API_KEY"])
Constant Summary collapse
- DEFAULTS =
The gem does not currently accept other models: github.com/alchaplinsky/hugging-face/blob/main/lib/hugging_face/inference_api.rb#L32-L34
{
  temperature: 0.0,
  embeddings_model_name: "sentence-transformers/all-MiniLM-L6-v2",
  dimensions: 384 # Vector size generated by the above model
}.freeze
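The `dimensions: 384` default reflects the vector size produced by the all-MiniLM-L6-v2 model. Downstream, such vectors are typically compared with cosine similarity; a minimal plain-Ruby sketch (the helper name and sample vectors are illustrative, not part of this gem's API):

```ruby
# Cosine similarity between two equal-length embedding vectors:
# dot(a, b) / (|a| * |b|). Returns 1.0 for identical directions,
# 0.0 for orthogonal vectors.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

# Illustrative 384-dimensional vectors, as all-MiniLM-L6-v2 would produce.
v1 = Array.new(384) { |i| Math.sin(i) }
v2 = Array.new(384) { |i| Math.sin(i) } # identical direction

cosine_similarity(v1, v2) # => 1.0 (within floating-point tolerance)
```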
Instance Attribute Summary
Attributes inherited from Base
Instance Method Summary collapse
-
#embed(text:) ⇒ Langchain::LLM::HuggingFaceResponse
Generate an embedding for a given text.
-
#initialize(api_key:) ⇒ HuggingFace
constructor
Initialize the HuggingFace LLM.
Methods inherited from Base
#chat, #complete, #default_dimensions, #summarize
Methods included from DependencyHelper
Constructor Details
#initialize(api_key:) ⇒ HuggingFace
Initialize the HuggingFace LLM
# File 'lib/langchain/llm/hugging_face.rb', line 27

def initialize(api_key:)
  depends_on "hugging-face", req: "hugging_face"
  @client = ::HuggingFace::InferenceApi.new(api_token: api_key)
end
Instance Method Details
#embed(text:) ⇒ Langchain::LLM::HuggingFaceResponse
Generate an embedding for a given text
# File 'lib/langchain/llm/hugging_face.rb', line 39

def embed(text:)
  response = client.embedding(
    input: text,
    model: DEFAULTS[:embeddings_model_name]
  )
  Langchain::LLM::HuggingFaceResponse.new(response, model: DEFAULTS[:embeddings_model_name])
end
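The method above forwards the text and the default model name to the underlying hugging-face gem client. The call shape can be sketched with a stand-in client (`FakeClient` is a hypothetical stub, not the real `HuggingFace::InferenceApi`; the keyword arguments mirror the wrapper source above):

```ruby
# Hypothetical stub mimicking the client interface #embed relies on:
# an #embedding method taking input: and model: keywords and returning
# a vector (384 floats for all-MiniLM-L6-v2).
FakeClient = Struct.new(:last_call) do
  def embedding(input:, model:)
    self.last_call = { input: input, model: model }
    Array.new(384) { 0.0 } # placeholder vector of the documented size
  end
end

client = FakeClient.new
vector = client.embedding(
  input: "Ruby is a joy to write",
  model: "sentence-transformers/all-MiniLM-L6-v2"
)
vector.length # => 384
```

With the real client, the returned vector is then wrapped in a `Langchain::LLM::HuggingFaceResponse`, which exposes the embedding and the model name to callers.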