llama-stack/llama_stack/providers
Ashwin Bharambe · 14f973a64f · Make LlamaStackLibraryClient work correctly (#581) · 2024-12-07 14:59:36 -08:00
This PR does a few things:

- moves the "direct client" into the llama-stack repo instead of the
llama-stack-client-python repo
- renames it to `LlamaStackLibraryClient`
- makes synchronous generators actually work
- makes both streaming and non-streaming responses work properly

In many ways, this PR finally makes things "work".
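
For reference, here is a rough sketch of what library-client usage looks like. The import path, constructor argument, `initialize()` call, and the `inference.chat_completion(...)` signature below are illustrative assumptions rather than exact code from this PR; see `library_client_test.py` for the real usage.

```python
# Hypothetical usage sketch -- the import path, constructor argument, and
# chat_completion() signature are assumptions, not copied from this PR.
from llama_stack.distribution.library_client import LlamaStackLibraryClient

client = LlamaStackLibraryClient("ollama")  # assumed: pass a distribution/template name
client.initialize()                         # assumed: sets up providers in-process

# Non-streaming: returns a complete response object
response = client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=False,
)
print(response)

# Streaming: iterating the synchronous generator yields chunks as they arrive
for chunk in client.inference.chat_completion(
    model_id="meta-llama/Llama-3.2-3B-Instruct",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
):
    print(chunk)
```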

## Test Plan

See the `library_client_test.py` I added. It isn't quite a real test yet, but it
demonstrates that this mode now works. Here's the invocation and the response:

```
INFERENCE_MODEL=meta-llama/Llama-3.2-3B-Instruct python llama_stack/distribution/tests/library_client_test.py ollama
```


![image](https://github.com/user-attachments/assets/17d4e116-4457-4755-a14e-d9a668801fe0)
| Name | Last commit | Date |
| --- | --- | --- |
| `inline` | Console span processor improvements (#577) | 2024-12-06 11:46:16 -08:00 |
| `registry` | Add ability to query and export spans to dataset (#574) | 2024-12-05 21:07:30 -08:00 |
| `remote` | Make LlamaStackLibraryClient work correctly (#581) | 2024-12-07 14:59:36 -08:00 |
| `tests` | unregister API for dataset (#507) | 2024-12-03 21:18:30 -08:00 |
| `utils` | Console span processor improvements (#577) | 2024-12-06 11:46:16 -08:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `datatypes.py` | unregister API for dataset (#507) | 2024-12-03 21:18:30 -08:00 |