Botao Chen 90ca4d94de
fix: fix passthrough inference provider to make it work for agent (#1577)
## What does this PR do?
We noticed that the passthrough inference provider doesn't work with agents
due to a type mismatch between the client and the server. We manually cast
the Llama Stack client types to the Llama Stack server types to fix the issue.
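The cast described above can be sketched as follows. In llama-stack the message types are pydantic models; this dependency-free sketch uses dataclasses, and the class and function names (`ClientUserMessage`, `ServerUserMessage`, `cast_to_server_type`) are illustrative assumptions, not the actual llama-stack identifiers.

```python
from dataclasses import dataclass, asdict

# Hypothetical stand-ins for the client-side and server-side message
# types; in llama-stack these are distinct pydantic models with the
# same fields, which is exactly why isinstance checks fail.
@dataclass
class ClientUserMessage:
    role: str
    content: str

@dataclass
class ServerUserMessage:
    role: str
    content: str

def cast_to_server_type(msg: ClientUserMessage) -> ServerUserMessage:
    # Rebuild the object from its fields against the server-side type,
    # so downstream server code that checks the concrete type succeeds.
    return ServerUserMessage(**asdict(msg))
```

With pydantic models the same idea would use `ServerType(**client_obj.model_dump())` instead of `asdict`.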

## Test
Run `python -m examples.agents.hello localhost 8321` within
llama-stack-apps.

<img width="1073" alt="Screenshot 2025-03-11 at 8 43 44 PM"
src="https://github.com/user-attachments/assets/bd1bdd31-606a-420c-a249-95f6184cc0b1"
/>

Fixes https://github.com/meta-llama/llama-stack/issues/1560
2025-03-12 11:16:17 -07:00
| Name | Latest commit | Date |
|------|---------------|------|
| inline | fix: Fixed bad file name in inline::localfs (#1358) | 2025-03-11 12:46:11 -07:00 |
| registry | fix: revert to using faiss for ollama distro (#1530) | 2025-03-10 16:15:17 -07:00 |
| remote | fix: fix passthrough inference provider to make it work for agent (#1577) | 2025-03-12 11:16:17 -07:00 |
| tests | refactor(test): introduce --stack-config and simplify options (#1404) | 2025-03-05 17:02:02 -08:00 |
| utils | feat: convert typehints from client_tool to litellm format (#1565) | 2025-03-11 20:02:11 -07:00 |
| __init__.py | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| datatypes.py | chore: move all Llama Stack types from llama-models to llama-stack (#1098) | 2025-02-14 09:10:59 -08:00 |