fix: remove inference.completion from docs

Matthew Farrellee 2025-09-28 07:30:19 -04:00
parent 65f7b81e98
commit e1b750e4e1
6 changed files with 26 additions and 64 deletions

@@ -178,10 +178,10 @@ Note that when re-recording tests, you must use a Stack pointing to a server (i.
 ### Basic Test Pattern
 ```python
-def test_basic_completion(llama_stack_client, text_model_id):
-    response = llama_stack_client.inference.completion(
+def test_basic_chat_completion(llama_stack_client, text_model_id):
+    response = llama_stack_client.inference.chat_completion(
         model_id=text_model_id,
-        content=CompletionMessage(role="user", content="Hello"),
+        messages=[{"role": "user", "content": "Hello"}],
     )
     # Test structure, not AI output quality
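
For reference, a complete version of the updated example might look like the sketch below. The assertions on `response.completion_message` are illustrative assumptions about the client response shape and are not part of the changed file.

```python
# Sketch of the full updated test pattern, assuming the chat_completion
# response exposes a `completion_message` with string `content`.
def test_basic_chat_completion(llama_stack_client, text_model_id):
    response = llama_stack_client.inference.chat_completion(
        model_id=text_model_id,
        messages=[{"role": "user", "content": "Hello"}],
    )
    # Test structure, not AI output quality
    assert response.completion_message is not None
    assert isinstance(response.completion_message.content, str)
    assert len(response.completion_message.content) > 0
```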