llama-stack/llama_stack
ehhuang 1166afdf76
fix: some telemetry APIs don't currently work (#1188)
Summary:

This bug is surfaced when using the HTTP LS client. The issue is that
non-scalar parameters on `GET` methods become `body` params in FastAPI,
but our spec generation script doesn't respect that. We fix this by
making those endpoints `POST` instead.

Test Plan:
Test API call with newly sync'd client
(https://github.com/meta-llama/llama-stack-client-python/pull/149)

<img width="1114" alt="image"
src="https://github.com/user-attachments/assets/7710aca5-d163-4e00-a465-14e6fcaac2b2"
/>
2025-02-20 14:09:25 -08:00
| Name | Last commit | Date |
| --- | --- | --- |
| apis | fix: some telemetry APIs don't currently work (#1188) | 2025-02-20 14:09:25 -08:00 |
| cli | feat: add a option to list the downloaded models (#1127) | 2025-02-19 22:17:39 -08:00 |
| distribution | fix: some telemetry APIs don't currently work (#1188) | 2025-02-20 14:09:25 -08:00 |
| models/llama | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |
| providers | chore!: deprecate eval/tasks (#1186) | 2025-02-20 14:06:21 -08:00 |
| scripts | precommit again | 2025-02-19 22:40:45 -08:00 |
| strong_typing | Ensure that deprecations for fields follow through to OpenAPI | 2025-02-19 13:54:04 -08:00 |
| templates | ModelAlias -> ProviderModelEntry | 2025-02-20 14:02:36 -08:00 |
| __init__.py | export LibraryClient | 2024-12-13 12:08:00 -08:00 |
| schema_utils.py | feat: adding endpoints for files and uploads (#1070) | 2025-02-20 13:09:00 -08:00 |