LCORE-1107: Bumped up llama-stack to RHOAI 3.2 and consuming downstream image #906
Changes from all commits
Containerfile:

```diff
@@ -75,7 +75,7 @@ USER root
 RUN microdnf install -y --nodocs --setopt=keepcache=0 --setopt=tsflags=nodocs jq patch

 # Create llama-stack directories for library mode
-RUN mkdir -p /opt/app-root/src/.llama/distributions/ollama /opt/app-root/src/.llama/providers.d && \
+RUN mkdir -p /opt/app-root/src/.llama/storage /opt/app-root/src/.llama/providers.d && \
     chown -R 1001:1001 /opt/app-root/src/.llama
```
Comment on lines +78 to 79

Contributor
🧩 Analysis chain

🏁 Script executed:

```bash
#!/bin/bash
# Search for any remaining references to the old storage path structure
rg -n "distributions/ollama" --type yaml --type py
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 7007

Storage path migration is incomplete and will cause runtime failures. The change in the Containerfile creates the new `storage` directory, but the search above shows that references to the old `distributions/ollama` path still remain elsewhere in the repository.
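The stale references the script surfaces could be cleaned up mechanically. A minimal sketch, assuming GNU `sed` and assuming every remaining `distributions/ollama` reference should map to the new `storage` directory, demonstrated on a throwaway file with hypothetical content (the repo-wide version would feed `grep -rl` matches into the same `sed` expression, and each hit should still be reviewed by hand):

```shell
#!/bin/bash
set -eu
# Demo of the path rewrite on a temporary file (hypothetical content).
# Repo-wide sketch: grep -rl 'distributions/ollama' --include='*.yaml' --include='*.py' . \
#                     | xargs sed -i 's|/.llama/distributions/ollama|/.llama/storage|g'
demo=$(mktemp)
echo "db_path: /opt/app-root/src/.llama/distributions/ollama/agents.db" > "$demo"
# Rewrite the old layout to the new one introduced in the Containerfile
sed -i 's|/.llama/distributions/ollama|/.llama/storage|g' "$demo"
cat "$demo"   # -> db_path: /opt/app-root/src/.llama/storage/agents.db
rm -f "$demo"
```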
```diff
 # Add executables from .venv to system PATH
```
docker-compose.yaml:

```diff
@@ -23,6 +23,9 @@ services:
       - RHEL_AI_PORT=${RHEL_AI_PORT}
       - RHEL_AI_API_KEY=${RHEL_AI_API_KEY}
       - RHEL_AI_MODEL=${RHEL_AI_MODEL}
+      # Enable debug logging if needed
+      - LLAMA_STACK_LOGGING=${LLAMA_STACK_LOGGING:-}
```
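The `${LLAMA_STACK_LOGGING:-}` reference added here uses POSIX default-value expansion, which Compose honors as well: the variable is passed through when set and collapses to an empty string when not, so the stack starts cleanly either way. A quick shell illustration (the `all=debug` value is a hypothetical placeholder, not a documented setting):

```shell
# ${VAR:-} expands to VAR's value when set, else to an empty string,
# so referencing an undefined variable does not break the compose file.
unset LLAMA_STACK_LOGGING
echo "unset: [${LLAMA_STACK_LOGGING:-}]"   # -> unset: []
export LLAMA_STACK_LOGGING="all=debug"     # hypothetical value
echo "set:   [${LLAMA_STACK_LOGGING:-}]"   # -> set:   [all=debug]
```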
Comment on lines +26 to +28

Contributor
🧩 Analysis chain

🏁 Script executed:

```bash
# Find references to LLAMA_STACK_LOGGING and locate the containerfile
rg "LLAMA_STACK_LOGGING" -i -A 3 -B 3
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 939

🏁 Script executed:

```bash
# Locate and examine the test.containerfile referenced in docker-compose.yaml
fd "test.containerfile" -x cat {}
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 574

🏁 Script executed:

```bash
# Check for any documentation or README about environment variables
fd -e "md" -e "txt" | head -20 | xargs grep -l "LLAMA_STACK_LOGGING" 2>/dev/null || echo "No docs found with LLAMA_STACK_LOGGING"
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 114

🏁 Script executed:

```bash
# Find and examine run.yaml
fd "run.yaml" -x cat {}
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 34362

🏁 Script executed:

```bash
# Check for any logging-related configuration or comments in the repo
rg "logging|LOGGING" -i -l | head -10 | xargs cat 2>/dev/null | grep -i "logging\|debug" | head -20
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 794

🏁 Script executed:

```bash
# Check README, docs, or scripts that might explain LLAMA_STACK_LOGGING
fd -e "md" -e "sh" -e "py" -e "txt" | xargs grep -l "LLAMA_STACK" 2>/dev/null | head -10
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 212

🏁 Script executed:

```bash
# Look for any CI/CD or entrypoint scripts that might use or document this
fd -e "sh" -e "py" | xargs grep -B2 -A2 "LLAMA_STACK_LOGGING" 2>/dev/null
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 60

🏁 Script executed:

```bash
# Check if there's a Makefile or setup guide
fd "Makefile\|makefile\|setup\|SETUP\|README" | head -10 | xargs cat 2>/dev/null | grep -i "llama\|logging" | head -20
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 58

🏁 Script executed:

```bash
# Read README to understand LLAMA_STACK_LOGGING context
cat README.md | grep -i "logging\|environment\|variable" -A 3 -B 3 | head -50
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 1228

🏁 Script executed:

```bash
# Check the tutorial script for LLAMA_STACK usage patterns
cat scripts/llama_stack_tutorial.sh | head -100
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 2830

🏁 Script executed:

```bash
# Check constants.py for any logging-related constants
cat src/constants.py
```

Repository: lightspeed-core/lightspeed-stack
Length of output: 4784

Document expected values for `LLAMA_STACK_LOGGING`.
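One way to satisfy the review is to extend the compose comment with the expected value format. The syntax below reflects the upstream llama-stack convention as I understand it (semicolon-separated `category=level` pairs, with `all` as a catch-all category); this is an assumption and should be verified against the pinned llama-stack version before documenting:

```yaml
# docker-compose.yaml (sketch of an expanded comment, unverified value format)
environment:
  # LLAMA_STACK_LOGGING tunes llama-stack log verbosity.
  # Assumed format: semicolon-separated category=level pairs, for example:
  #   LLAMA_STACK_LOGGING=all=debug
  #   LLAMA_STACK_LOGGING=server=debug;core=info
  # Leave unset to keep the default level.
  - LLAMA_STACK_LOGGING=${LLAMA_STACK_LOGGING:-}
```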
```diff
   networks:
     - lightspeednet
   healthcheck:
```
Config selection is correct; fix hard-coded source path in summary

The new `CONFIG_ENVIRONMENT` / `run-$ENVIRONMENT.yaml` selection logic is sound and will work for the `ci` and `azure` environments. However, the summary step always prints the `ci` config path regardless of the matrix value, so azure runs will be mis-reported.

Consider deriving the printed path from the matrix value instead. This keeps logs accurate if more environments are added later.

Also applies to: 133-133
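One way to derive the reported path, sketched in plain shell (the `azure` assignment stands in for the workflow's matrix value, e.g. `${{ matrix.environment }}`; the `run-$ENVIRONMENT.yaml` pattern comes from the selection logic above):

```shell
# In CI the environment name would come from the job matrix rather than
# being assigned inline; "azure" here is only a demo value.
ENVIRONMENT=azure
echo "Using config run-${ENVIRONMENT}.yaml"   # -> Using config run-azure.yaml
```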