llama-stack-mirror/llama_stack/providers/remote/inference/openai
Matthew Farrellee 79c9e46582 feat(openai): add configurable base_url support with OPENAI_BASE_URL env var
- Add a base_url field to OpenAIConfig with the default "https://api.openai.com/v1"
- Update sample_run_config to support the OPENAI_BASE_URL environment variable
- Modify get_base_url() to return the configured base_url instead of the hardcoded value (see the sketch after this list)
- Add a comprehensive test suite (sketched after the summary below) covering:
  - Default base URL behavior
  - Custom base URL from config
  - Environment variable override
  - Config precedence over environment variables
  - Client initialization with configured URL
  - Model availability checks using configured URL
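A minimal sketch of what the configuration change might look like, assuming the project's usual Pydantic-based config pattern. The field descriptions, the env-substitution syntax inside sample_run_config, and the simplified OpenAIInferenceAdapter stub are illustrative assumptions, not the repository's actual code:

```python
from typing import Any

from pydantic import BaseModel, Field


class OpenAIConfig(BaseModel):
    # API key is read from the run config or the OPENAI_API_KEY environment variable.
    api_key: str | None = Field(default=None, description="API key for the OpenAI API")
    # New field: defaults to the public OpenAI endpoint, but can point at any
    # OpenAI-compatible server (e.g. a proxy or a self-hosted gateway).
    base_url: str = Field(
        default="https://api.openai.com/v1",
        description="Base URL for an OpenAI-compatible API endpoint",
    )

    @classmethod
    def sample_run_config(cls, api_key: str = "${env.OPENAI_API_KEY}", **kwargs: Any) -> dict[str, Any]:
        # OPENAI_BASE_URL is substituted when the run config is loaded;
        # the hardcoded URL acts as the fallback (substitution syntax assumed).
        return {
            "api_key": api_key,
            "base_url": "${env.OPENAI_BASE_URL:=https://api.openai.com/v1}",
        }


class OpenAIInferenceAdapter:
    """Simplified stand-in for the provider adapter in openai.py."""

    def __init__(self, config: OpenAIConfig) -> None:
        self.config = config

    def get_base_url(self) -> str:
        # Previously a hardcoded URL; now defers to the configured value.
        return self.config.base_url
```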

This enables users to configure custom OpenAI-compatible API endpoints
via environment variables or configuration files.
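Two of the listed test cases might look roughly like this sketch, built against the hypothetical classes above; the real suite exercises the actual provider classes, including client initialization and model availability checks against the configured URL:

```python
import os
from unittest.mock import patch


def test_default_base_url():
    # With no explicit value, the field falls back to the public OpenAI endpoint.
    config = OpenAIConfig(api_key="test-key")
    assert config.base_url == "https://api.openai.com/v1"


def test_config_takes_precedence_over_env_var():
    # An explicitly configured base_url wins over OPENAI_BASE_URL.
    with patch.dict(os.environ, {"OPENAI_BASE_URL": "https://proxy.example.com/v1"}):
        config = OpenAIConfig(api_key="test-key", base_url="https://custom.example.com/v1")
        adapter = OpenAIInferenceAdapter(config)
        assert adapter.get_base_url() == "https://custom.example.com/v1"
```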
2025-07-26 06:04:00 -04:00
__init__.py chore: enable pyupgrade fixes (#1806) 2025-05-01 14:23:50 -07:00
config.py feat(openai): add configurable base_url support with OPENAI_BASE_URL env var 2025-07-26 06:04:00 -04:00
models.py fix: starter template and litellm backward compat conflict for openai (#2885) 2025-07-24 17:28:37 +02:00
openai.py feat(openai): add configurable base_url support with OPENAI_BASE_URL env var 2025-07-26 06:04:00 -04:00