Merged
14 changes: 13 additions & 1 deletion examples/basic_modules/embedder.py
@@ -3,7 +3,10 @@


# Scenario 1: Using EmbedderFactory

# Prerequisites:
# 1. Install Ollama: https://ollama.com/
# 2. Start Ollama server: `ollama serve`
# 3. Pull the model: `ollama pull nomic-embed-text`
config = EmbedderConfigFactory.model_validate(
{
"backend": "ollama",
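The Ollama prerequisites above can be checked programmatically before building the config. A minimal sketch using only the standard library; the default Ollama address `http://localhost:11434` is an assumption (adjust it if your server binds elsewhere):

```python
import urllib.error
import urllib.request


def ollama_is_running(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Calling `ollama_is_running()` before `EmbedderConfigFactory.model_validate(...)` gives a clearer error message than a connection failure raised deep inside the embedder.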
@@ -33,6 +36,9 @@


# Scenario 3: Using SenTranEmbedder
# Prerequisites:
# 1. Ensure `einops` is installed: `pip install einops` (required by some HF models, such as nomic-bert)
# 2. The model `nomic-ai/nomic-embed-text-v1.5` will be downloaded automatically from HuggingFace.

config_hf = EmbedderConfigFactory.model_validate(
{
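A quick preflight for the `einops` requirement above. This sketch uses `importlib.util.find_spec`, so nothing is actually imported; the default package list (`einops` plus `sentence_transformers`) is an assumption about what this scenario needs:

```python
import importlib.util


def missing_prereqs(packages=("einops", "sentence_transformers")) -> list:
    """Return the subset of packages that are not importable in this environment."""
    return [pkg for pkg in packages if importlib.util.find_spec(pkg) is None]
```

If the returned list is non-empty, `pip install` the named packages before running the SenTranEmbedder example.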
@@ -49,6 +55,9 @@
print("==" * 20)

# === Scenario 4: Using UniversalAPIEmbedder (OpenAI) ===
# Prerequisites:
# 1. Set a valid OPENAI_API_KEY
# 2. Ensure the base_url is reachable

config_api = EmbedderConfigFactory.model_validate(
{
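One way to satisfy prerequisite 1 without hard-coding a secret: read the key from the environment and reject obvious placeholders. The variable name `OPENAI_API_KEY` matches the comment above; the `sk-xxxx` placeholder check is an assumption about how the example config is shipped:

```python
import os


def require_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Fetch the key from the environment and fail fast on missing or placeholder values."""
    key = os.environ.get(var, "")
    if not key or key.startswith("sk-xxxx"):
        raise RuntimeError(f"Set a real API key in ${var} before running this scenario.")
    return key
```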
@@ -68,6 +77,9 @@
print("Embedding preview:", embedding_api[0][:10])

# === Scenario 5: Using UniversalAPIEmbedder (Azure) ===
# Prerequisites:
# 1. Set a valid AZURE_API_KEY
# 2. Ensure the base_url is reachable

config_api = EmbedderConfigFactory.model_validate(
{
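Prerequisite 2 (a reachable `base_url`) can be sanity-checked before any request is sent. A minimal sketch that validates only the URL's shape, not actual reachability; the Azure hostname in the test is hypothetical:

```python
from urllib.parse import urlparse


def looks_like_valid_base_url(url: str) -> bool:
    """True if url has an http(s) scheme and a host part."""
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.netloc)
```

Catching a malformed `base_url` here avoids a confusing connection error from the underlying HTTP client.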
8 changes: 8 additions & 0 deletions examples/basic_modules/llm.py
@@ -5,6 +5,10 @@

# Scenario 1: Using LLMFactory with Ollama Backend
# This is the recommended way! 🌟
# Prerequisites:
# 1. Install Ollama: https://ollama.com/
# 2. Start Ollama server: `ollama serve`
# 3. Install the Python `ollama` package: `pip install "ollama>=0.5.0,<0.6.0"`

config = LLMConfigFactory.model_validate(
{
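The version pin in prerequisite 3 can be verified at runtime. A sketch using `importlib.metadata`; it assumes standard `X.Y.Z` version strings and checks only that the installed `ollama` release falls in `>=0.5.0,<0.6.0`:

```python
from importlib import metadata


def ollama_version_ok() -> bool:
    """True if the installed ollama package satisfies >=0.5.0,<0.6.0."""
    try:
        version = metadata.version("ollama")
    except metadata.PackageNotFoundError:
        return False
    try:
        major, minor = (int(part) for part in version.split(".")[:2])
    except ValueError:  # non-numeric version component
        return False
    return (major, minor) == (0, 5)
```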
@@ -46,6 +50,10 @@


# Scenario 3: Using LLMFactory with OpenAI Backend
# Prerequisites:
# 1. You need a valid OpenAI API key to run this scenario.
# 2. Replace 'sk-xxxx' with your actual API key below.


config = LLMConfigFactory.model_validate(
{
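Rather than editing the file to replace 'sk-xxxx', the key can be read from the environment at runtime. A hedged sketch; the `backend`/`config`/`api_key` field names mirror the Ollama scenario and are assumptions about LLMConfigFactory's schema:

```python
import os

# Fall back to the placeholder so the dict still shows the expected shape
# even when no key is set; require_api_key-style validation could go here.
openai_settings = {
    "backend": "openai",  # assumed backend name
    "config": {
        "api_key": os.environ.get("OPENAI_API_KEY", "sk-xxxx"),  # real key comes from the env
    },
}
```

This dict would then be passed to `LLMConfigFactory.model_validate(...)` exactly as in the diff above.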