Copied from LeapLabTHU/Absolute-Zero-Reasoner#7 (comment)
Today I have been trying to use a Groq LLM (namely `llama-3.3-70b-versatile`, since `mixtral-8x7b-32768` is deprecated) for all three roles: proposer, solver, and evaluator.
In `config/llm_config.yaml`:

```yaml
proposer: llama-3.3-70b-versatile    # instead of OpenAI's smaller model for proposal generation
solver: llama-3.3-70b-versatile      # instead of Qwen3 for solving complex tasks
evaluator: llama-3.3-70b-versatile   # instead of DeepSeek for evaluation
```
I also followed the instructions in the README.md file to get started:
Clone the repository
Copy `config/llm_config.yaml.example` to `config/llm_config.yaml`
In `llm_config.yaml` I changed these things:
```yaml
default_provider: groq   # was: default_provider: openai

proposer: llama-3.3-70b-versatile    # instead of OpenAI's smaller model for proposal generation
solver: llama-3.3-70b-versatile      # instead of Qwen3 for solving complex tasks
evaluator: llama-3.3-70b-versatile   # instead of DeepSeek for evaluation

# Model-specific settings
model_settings:
  "llama-3.3-70b-versatile":
    temperature: 0.5
    max_tokens: 4000
    top_p: 0.9
```
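For reference, a standalone check like the following (assuming PyYAML is available; this is not repo code) can confirm that the edited file parses and that `model_settings` nests under the quoted model name as intended:

```python
# Standalone sanity check (not part of the repo): parse the edited YAML
# and confirm the keys nest the way the config above intends.
import yaml  # pip install pyyaml

with open("config/llm_config.yaml") as f:
    cfg = yaml.safe_load(f)

print(cfg["default_provider"])                           # expect: groq
print(cfg["model_settings"]["llama-3.3-70b-versatile"])  # expect: {'temperature': 0.5, ...}
```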
Set your API keys as environment variables. Here I made a `.env` file:

```
GROQ_API_KEY="gsk_......"
```
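Note that a `.env` file only takes effect if something loads it. A minimal sketch to check whether the key is actually visible to the process (the python-dotenv call is an assumption; the repo may or may not load `.env` itself):

```python
# Standalone check: is GROQ_API_KEY visible to Python at all?
# A .env file is NOT read automatically; either export the variable in the
# shell, or load the file explicitly, e.g. with python-dotenv (assumption --
# the repo may handle this on its own).
import os

# from dotenv import load_dotenv   # pip install python-dotenv
# load_dotenv()                    # uncomment if nothing else loads .env

print("GROQ_API_KEY set:", bool(os.environ.get("GROQ_API_KEY")))
```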
Run `python azr_ukg.py`
But no matter what I tried, I keep getting this traceback (I use Cursor as my IDE):
```
F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support>python azr_ukg.py
Traceback (most recent call last):
  File "F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support\azr_ukg.py", line 18, in <module>
    from modules.config import (
  File "F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support\modules\config\__init__.py", line 7, in <module>
    from .loader import (
  File "F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support\modules\config\loader.py", line 12, in <module>
    from ..llm.rate_limit import RateLimitConfig
  File "F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support\modules\llm\__init__.py", line 12, in <module>
    from .providers.base import (
  File "F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support\modules\llm\providers\__init__.py", line 24, in <module>
    from . import openai
  File "F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support\modules\llm\providers\openai.py", line 67, in <module>
    from .. import LLMFactory
ImportError: cannot import name 'LLMFactory' from partially initialized module 'modules.llm' (most likely due to a circular import) (F:\Downloads\absolute-zero-universal-knowledge-main\absolute-zero-universal-knowledge-features-multi-llm-support\modules\llm\__init__.py)
```
Could this be caused by a circular import?
Have I forgotten something?
And most importantly: what would the solution be?
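For reference, the traceback itself answers the first question: `modules/llm/__init__.py` starts importing the providers package, `providers/openai.py` then runs `from .. import LLMFactory`, but `modules/llm/__init__.py` has only reached line 12 at that point, so `LLMFactory` is not defined yet. Below is a minimal sketch of the pattern and of the usual fix, deferring the import to call time; the file contents are hypothetical reconstructions from the traceback, not the actual repo code:

```python
# modules/llm/__init__.py  (hypothetical reconstruction)
from .providers.base import BaseProvider   # line 12: triggers providers/__init__.py,
                                           # which in turn imports providers/openai.py
class LLMFactory:                          # ...but this is only defined later, so the
    ...                                    # nested `from .. import LLMFactory` fails


# modules/llm/providers/openai.py  (hypothetical fix)
# Failing version, at module level:
#   from .. import LLMFactory   # runs while modules.llm is still initializing

def _get_factory():
    # Deferred import: by the time this function is first called,
    # modules/llm/__init__.py has finished executing.
    from modules.llm import LLMFactory
    return LLMFactory
```

Two equivalent fixes, if editing the package is acceptable: define (or import) `LLMFactory` in `modules/llm/__init__.py` *before* the `from .providers.base import ...` line, or have `openai.py` import `LLMFactory` from the module that actually defines it rather than from the package `__init__`.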