Mirror of https://github.com/meta-llama/llama-stack.git (synced 2025-12-30 19:23:52 +00:00)
This updates test_openai_completion.py to allow chunks with null text in streaming responses, as that's a valid chunk and the test was originally written without accounting for it. This is required to get this test passing with gpt-4o models and the openai provider. Signed-off-by: Ben Browning <bbrownin@redhat.com>
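As a rough illustration of the kind of change described (not the actual diff), the sketch below shows a streaming-completion test that tolerates chunks whose text is null. The client setup, model id, prompt, and assertions are assumptions for illustration, not the repository's real test code.

```python
# Minimal sketch, assuming a test that streams a completion and collects text.
# The client, model id, and prompt below are placeholders, not the real test.
from openai import OpenAI


def test_openai_completion_streaming():
    client = OpenAI()  # assumed to be pointed at the provider under test
    response = client.completions.create(
        model="gpt-4o",        # placeholder model id
        prompt="Say hello",
        stream=True,
    )

    streamed_content = []
    for chunk in response:
        if not chunk.choices:
            continue
        choice = chunk.choices[0]
        # Some providers emit chunks with no text (e.g. a final chunk that only
        # carries finish_reason); skip those instead of failing the test.
        if choice.text:
            streamed_content.append(choice.text)

    assert len(streamed_content) > 0
    assert all(isinstance(piece, str) for piece in streamed_content)
```

The key point is the `if choice.text:` guard: a chunk with null (or empty) text is treated as valid and simply skipped, rather than breaking the accumulated output.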
| Name |
|------|
| client-sdk/post_training |
| external-provider/llama-stack-provider-ollama |
| integration |
| unit |
| verifications |
| __init__.py |