llama-stack-mirror/llama_stack/providers/impls/meta_reference
Commit 4fa467731e by Ashwin Bharambe (2024-10-08 17:23:02 -07:00):
Fix a bug in meta-reference inference when stream=False

Also introduce a gross hack (to cover a grosser(?) hack) to ensure
non-stream requests don't send back responses in SSE format. Not sure
which of these hacks is grosser.
agents       | Fix ValueError in case chunks are empty (#206)          | 2024-10-07 08:55:06 -07:00
inference    | Fix a bug in meta-reference inference when stream=False | 2024-10-08 17:23:02 -07:00
memory       | Introduce model_store, shield_store, memory_bank_store  | 2024-10-08 17:23:02 -07:00
safety       | Introduce model_store, shield_store, memory_bank_store  | 2024-10-08 17:23:02 -07:00
telemetry    | API Updates (#73)                                        | 2024-09-17 19:51:35 -07:00
__init__.py  | API Updates (#73)                                        | 2024-09-17 19:51:35 -07:00