feat: Add GPU-specific Ray worker images with CI/CD

- Add Dockerfiles for nvidia, rdna2, strixhalo, and intel GPU targets
- Add ray-serve modules (embeddings, whisper, tts, llm, reranker)
- Add Gitea Actions workflow for automated builds
- Add Makefile for local development
- Update README with comprehensive documentation
2026-02-01 15:04:31 -05:00
parent e68d5c1f0e
commit a16ffff73f
16 changed files with 1311 additions and 2 deletions


@@ -0,0 +1,27 @@
#!/bin/bash
# Ray Worker Entrypoint
# Connects to Ray head node and registers custom resources

set -e

# Ensure Ray is in PATH (works across all base images)
export PATH="/home/ray/.local/bin:/home/ray/anaconda3/bin:${PATH}"

# Get Ray head address from environment or default
RAY_HEAD_ADDRESS="${RAY_HEAD_SVC:-ray-head-svc}:6379"

# Get custom resources from environment
GPU_RESOURCE="${GPU_RESOURCE:-gpu_amd}"
NUM_GPUS="${NUM_GPUS:-1}"

echo "Starting Ray worker..."
echo " Head address: $RAY_HEAD_ADDRESS"
echo " GPU resource: $GPU_RESOURCE"
echo " Num GPUs: $NUM_GPUS"

# Start Ray worker with custom resources
exec ray start \
    --address="$RAY_HEAD_ADDRESS" \
    --num-gpus="$NUM_GPUS" \
    --resources="{\"$GPU_RESOURCE\": 1}" \
    --block