Module: DiscourseAi::Tokenizer
Defined in:
- lib/discourse_ai/tokenizer/bert_tokenizer.rb
- lib/discourse_ai/tokenizer/qwen_tokenizer.rb
- lib/discourse_ai/tokenizer/basic_tokenizer.rb
- lib/discourse_ai/tokenizer/bge_m3_tokenizer.rb
- lib/discourse_ai/tokenizer/gemini_tokenizer.rb
- lib/discourse_ai/tokenizer/llama3_tokenizer.rb
- lib/discourse_ai/tokenizer/mistral_tokenizer.rb
- lib/discourse_ai/tokenizer/open_ai_tokenizer.rb
- lib/discourse_ai/tokenizer/anthropic_tokenizer.rb
- lib/discourse_ai/tokenizer/bge_large_en_tokenizer.rb
- lib/discourse_ai/tokenizer/open_ai_cl100k_tokenizer.rb
- lib/discourse_ai/tokenizer/all_mpnet_base_v2_tokenizer.rb
- lib/discourse_ai/tokenizer/multilingual_e5_large_tokenizer.rb
Defined Under Namespace
Classes: AllMpnetBaseV2Tokenizer, AnthropicTokenizer, BasicTokenizer, BertTokenizer, BgeLargeEnTokenizer, BgeM3Tokenizer, GeminiTokenizer, Llama3Tokenizer, MistralTokenizer, MultilingualE5LargeTokenizer, OpenAiCl100kTokenizer, OpenAiTokenizer, QwenTokenizer
Constant Summary
- OpenAiO200kTokenizer = OpenAiTokenizer
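The `OpenAiO200kTokenizer` constant is a plain Ruby constant alias: it is assigned the `OpenAiTokenizer` class itself, so both names refer to the same class object rather than to a copy or subclass. A minimal self-contained sketch of this aliasing pattern, using stand-in classes and a hypothetical `size` method for illustration (not the gem's real implementation):

```ruby
# Stand-in module mirroring the constant-aliasing pattern used in
# DiscourseAi::Tokenizer; the class body here is illustrative only.
module Tokenizer
  class OpenAiTokenizer
    # Hypothetical token-counting method, for illustration.
    def self.size(text)
      text.split.length
    end
  end

  # Constant alias: both names now point at the very same Class object.
  OpenAiO200kTokenizer = OpenAiTokenizer
end

# The alias is identity, not a copy:
puts Tokenizer::OpenAiO200kTokenizer.equal?(Tokenizer::OpenAiTokenizer) # prints "true"
puts Tokenizer::OpenAiO200kTokenizer.size("hello world")                # prints "2"
```

Because the alias is the same object, any method added to `OpenAiTokenizer` later is immediately visible through `OpenAiO200kTokenizer` as well.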