Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-10-18 15:27:16 +00:00)
As indicated in the title. Our `starter` distribution enables all remote providers _very intentionally_, because we believe it creates an easier, more welcoming experience for folks new to the software. If we do that and then slam the logs with errors that make them question their life choices, it is not so good :)

Note that this fix is limited in scope: if you ever try to actually instantiate the OpenAI client from a code path without an API key present, you deserve to fail hard.

## Test Plan

Run `llama stack run starter` with `OPENAI_API_KEY` set. No more wall of text, just one message saying "listed 96 models".
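The behavior described above can be sketched as follows. This is a hypothetical illustration, not the actual llama-stack implementation: the helper name `refresh_models`, its signature, and the logger name are assumptions. The idea is that when a provider's API key is absent we skip model listing quietly (a debug line at most), and when it is present we emit a single summary line instead of one error per model.

```python
import logging
import os

logger = logging.getLogger("starter")


def refresh_models(provider_name: str, env_var: str, list_fn):
    """Hypothetical sketch of the log-quieting behavior.

    If the provider's API key env var is unset, return an empty list and
    only log at DEBUG level -- no wall of errors. If it is set, call the
    provider's listing function and log one INFO summary line.
    """
    if not os.environ.get(env_var):
        logger.debug("%s: %s not set; skipping model listing", provider_name, env_var)
        return []
    models = list_fn()
    logger.info("%s: listed %d models", provider_name, len(models))
    return models
```

Actually constructing the OpenAI client without a key would still fail hard; only the passive model-refresh path is silenced in this sketch.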