llama-stack/llama_stack/providers
Hardik Shah b2ac29b9da
fix provider model list test (#800)
Fixes the provider model list test for the together, fireworks, and ollama inference providers. Verified with:

```
pytest -v -s -k "together or fireworks or ollama" --inference-model="meta-llama/Llama-3.1-8B-Instruct" ./llama_stack/providers/tests/inference/test_text_inference.py 
```
```
...
llama_stack/providers/tests/inference/test_text_inference.py::TestInference::test_chat_completion_streaming[-together] PASSED
llama_stack/providers/tests/inference/test_text_inference.py::TestInference::test_chat_completion_with_tool_calling[-together] PASSED
llama_stack/providers/tests/inference/test_text_inference.py::TestInference::test_chat_completion_with_tool_calling_streaming[-together] PASSED

================ 21 passed, 6 skipped, 81 deselected, 5 warnings in 32.11s =================
```
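
The test IDs above (e.g. `test_chat_completion_streaming[-together]`) come from pytest parametrizing each test over the configured providers, which is what lets the `-k "together or fireworks or ollama"` expression select only those providers. Below is a minimal, self-contained sketch of that mechanism; the `PROVIDERS` list and `fake_chat_completion` helper are illustrative placeholders, not the actual llama_stack fixtures or APIs.

```python
# Illustrative sketch only -- not the llama_stack test suite.
# Parametrizing over provider names yields test IDs such as
# "test_chat_completion[together]", which pytest's -k expression
# ("together or fireworks or ollama") can select or deselect.
import pytest

PROVIDERS = ["together", "fireworks", "ollama", "tgi"]


def fake_chat_completion(provider: str, model: str) -> str:
    # Stand-in for a real inference call; always "succeeds" here.
    return f"{provider}:{model}:ok"


@pytest.mark.parametrize("provider", PROVIDERS)
def test_chat_completion(provider: str) -> None:
    result = fake_chat_completion(provider, "meta-llama/Llama-3.1-8B-Instruct")
    assert result.endswith("ok")
```

Running `pytest -v -k "together or fireworks or ollama"` against a file containing this sketch selects three of the four parametrized cases and deselects the `tgi` one, mirroring the "deselected" count in the real run.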

Co-authored-by: Hardik Shah <hjshah@fb.com>
2025-01-16 19:27:29 -08:00
| Name | Latest commit | Date |
| --- | --- | --- |
| `inline` | meta reference inference fixes (#797) | 2025-01-16 18:17:46 -08:00 |
| `registry` | Pin torchtune pkg version (#791) | 2025-01-16 16:31:13 -08:00 |
| `remote` | Fix tgi adapter (#796) | 2025-01-16 17:44:12 -08:00 |
| `tests` | fix provider model list test (#800) | 2025-01-16 19:27:29 -08:00 |
| `utils` | meta reference inference fixes (#797) | 2025-01-16 18:17:46 -08:00 |
| `__init__.py` | API Updates (#73) | 2024-09-17 19:51:35 -07:00 |
| `datatypes.py` | Tools API with brave and MCP providers (#639) | 2024-12-19 21:25:17 -08:00 |