Class: Langchain::LLM::HuggingFace

Inherits:
Base
  • Object
Defined in:
lib/langchain/llm/hugging_face.rb

Overview

Wrapper around the HuggingFace Inference API: huggingface.co/inference-api

Gem requirements:

gem "hugging-face", "~> 0.3.4"

Usage:

hf = Langchain::LLM::HuggingFace.new(api_key: ENV["HUGGING_FACE_API_KEY"])
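Once constructed, the client can be used to generate embeddings (a minimal sketch, assuming HUGGING_FACE_API_KEY is set; see #embed below):

hf.embed(text: "Ruby is a programmer's best friend")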

Constant Summary

DEFAULTS =
{
  temperature: 0.0,
  embeddings_model_name: "sentence-transformers/all-MiniLM-L6-v2",
  dimensions: 384 # Vector size generated by the above model
}.freeze

Instance Attribute Summary

Attributes inherited from Base

#client

Instance Method Summary

Methods inherited from Base

#chat, #complete, #default_dimensions, #summarize

Methods included from DependencyHelper

#depends_on

Constructor Details

#initialize(api_key:) ⇒ HuggingFace

Initialize the HuggingFace LLM

Parameters:

  • api_key (String)

    The API key to use



# File 'lib/langchain/llm/hugging_face.rb', line 27

def initialize(api_key:)
  depends_on "hugging-face", req: "hugging_face"

  @client = ::HuggingFace::InferenceApi.new(api_token: api_key)
end

Instance Method Details

#embed(text:) ⇒ Langchain::LLM::HuggingFaceResponse

Generate an embedding for a given text

Parameters:

  • text (String)

    The text to embed

Returns:

  • (Langchain::LLM::HuggingFaceResponse)

    Response object wrapping the generated embedding

# File 'lib/langchain/llm/hugging_face.rb', line 39

def embed(text:)
  response = client.embedding(
    input: text,
    model: DEFAULTS[:embeddings_model_name]
  )
  Langchain::LLM::HuggingFaceResponse.new(response, model: DEFAULTS[:embeddings_model_name])
end
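
A short usage sketch, assuming Langchain::LLM::HuggingFaceResponse exposes an #embedding reader as other Langchain::LLM response classes do; the model and vector size come from DEFAULTS above:

response = hf.embed(text: "Hello, world!")
vector = response.embedding # Array of Floats from sentence-transformers/all-MiniLM-L6-v2
vector.size                 # => 384, per DEFAULTS[:dimensions]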