llama-stack-mirror/llama_stack/distribution/server
Latest commit: 4fa467731e by Ashwin Bharambe (2024-10-08 17:23:02 -07:00)

    Fix a bug in meta-reference inference when stream=False

    Also introduce a gross hack (to cover a grosser(?) hack) to ensure
    non-stream requests don't send back responses in SSE format. Not sure
    which of these hacks is grosser.
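The commit above is about making sure that when a client passes stream=False, the server returns a plain JSON body rather than Server-Sent Events framing. A minimal illustrative sketch of that branching (hypothetical function names and payload shape, not the actual llama-stack server code):

```python
import json

def sse_event(payload: dict) -> str:
    # SSE framing: each event is a "data: <json>" line followed by a blank line.
    return f"data: {json.dumps(payload)}\n\n"

def respond(chunks: list[dict], stream: bool):
    # Hypothetical response path: 'chunks' stands in for incremental
    # inference deltas produced by the model.
    if stream:
        # Streaming request: each chunk goes out as an SSE event.
        return [sse_event(c) for c in chunks]
    # Non-streaming request: merge the chunks and return one plain JSON
    # body with no SSE framing -- the behavior the commit fixes.
    merged = {"completion": "".join(c["delta"] for c in chunks)}
    return json.dumps(merged)

print(respond([{"delta": "Hel"}, {"delta": "lo"}], stream=False))
# → {"completion": "Hello"}
```

The point of the branch is that SSE framing (`data: ...\n\n`) is only meaningful to clients that opened a streaming connection; a stream=False caller expects a single JSON document.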
__init__.py    API Updates (#73)                                        2024-09-17 19:51:35 -07:00
endpoints.py   Add an introspection "Api.inspect" API                   2024-10-02 15:41:14 -07:00
server.py      Fix a bug in meta-reference inference when stream=False  2024-10-08 17:23:02 -07:00