ray-serve/ray_serve/serve_whisper.py
Billy D. 15e4b8afa3
Build and Publish ray-serve-apps / build-and-publish (push) Successful in 11s
fix: make mlflow_logger import optional with no-op fallback
The strixhalo LLM worker uses py_executable pointing at the Docker
image's venv, which doesn't have the updated ray-serve-apps package.
Wrap all InferenceLogger imports in try/except and guard usage with
None checks so apps degrade gracefully without MLflow logging.
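The guarded-import pattern the commit describes can be sketched as below. The module path `ray_serve.mlflow_logger` and the `log` method are assumptions for illustration; the commit only confirms the `InferenceLogger` name and the try/except-plus-None-check approach.

```python
# Optional import with a no-op fallback: if the worker venv lacks the
# updated ray-serve-apps package, InferenceLogger stays None and all
# logging calls become silent no-ops instead of crashing the app.
try:
    from ray_serve.mlflow_logger import InferenceLogger  # assumed module path
except ImportError:
    InferenceLogger = None  # MLflow logging unavailable in this environment


def log_inference(record: dict) -> None:
    """Log an inference record to MLflow only when the logger is present."""
    if InferenceLogger is not None:
        InferenceLogger().log(record)  # hypothetical API for illustration
    # else: degrade gracefully — skip logging entirely


# Safe to call whether or not the logging package is installed:
log_inference({"model": "whisper", "latency_ms": 120})
```

Guarding every call site with a `None` check (rather than stubbing a fake logger class) keeps the fallback explicit and avoids masking genuine attribute errors inside the real logger.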
2026-02-12 07:01:17 -05:00
