llama-stack-mirror/llama_stack/distribution/server
Charlie Doern dffacae2de fix: uvicorn respect log_config
Currently, if a user sets something like `LLAMA_STACK_LOG_FILE=foobar.log`, the uvicorn logs do not end up in that file.

During testing this makes it look like the server receives no requests, which is problematic: it can lead users to believe they have misconfigured their server when in reality only the logging config is incorrect.
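A minimal sketch of the kind of wiring this fix implies: build a `dictConfig`-style log config that routes uvicorn's loggers to the file named by `LLAMA_STACK_LOG_FILE`, and hand it to `uvicorn.run(..., log_config=...)`. The function name and config layout below are illustrative assumptions, not the actual contents of `server.py`.

```python
import os


def build_uvicorn_log_config(default_level: str = "INFO") -> dict:
    """Hypothetical helper: route uvicorn's loggers to the file named by
    LLAMA_STACK_LOG_FILE, falling back to stderr when it is unset."""
    log_file = os.environ.get("LLAMA_STACK_LOG_FILE")
    if log_file:
        handler = {
            "class": "logging.FileHandler",
            "filename": log_file,
            "formatter": "default",
        }
    else:
        handler = {"class": "logging.StreamHandler", "formatter": "default"}
    return {
        "version": 1,
        "disable_existing_loggers": False,
        "formatters": {
            "default": {"format": "%(asctime)s %(levelname)s %(name)s %(message)s"},
        },
        "handlers": {"default": handler},
        "loggers": {
            # Send uvicorn's error and access logs through the same handler
            # so incoming requests show up in the configured log file.
            "uvicorn": {"handlers": ["default"], "level": default_level},
            "uvicorn.access": {
                "handlers": ["default"],
                "level": default_level,
                "propagate": False,
            },
        },
    }


# The server would then pass this to uvicorn, for example:
# uvicorn.run(app, host="0.0.0.0", port=8321, log_config=build_uvicorn_log_config())
```

Without an explicit `log_config`, uvicorn installs its own default logging configuration, which writes to the console regardless of where the application's handlers point.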

Signed-off-by: Charlie Doern <cdoern@redhat.com>
2025-07-21 15:07:24 -04:00
__init__.py API Updates (#73) 2024-09-17 19:51:35 -07:00
auth.py feat(auth): support github tokens (#2509) 2025-07-08 11:02:36 -07:00
auth_providers.py feat(auth): support github tokens (#2509) 2025-07-08 11:02:36 -07:00
quota.py feat: drop python 3.10 support (#2469) 2025-06-19 12:07:14 +05:30
routes.py chore: use starlette built-in Route class (#2267) 2025-05-28 09:53:33 -07:00
server.py fix: uvicorn respect log_config 2025-07-21 15:07:24 -04:00