llama-stack-mirror/llama_stack/providers/remote/inference/nvidia
Latest commit: e9eb004bf8 by Matthew Farrellee (2025-09-29 13:14:41 -07:00)
fix: remove inference.completion from docs (#3589)

# What does this PR do?

Now that /v1/inference/completion has been removed, no docs should refer to it. This PR cleans up the remaining references.

## Test Plan

CI

Co-authored-by: Ashwin Bharambe <ashwin.bharambe@gmail.com>
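With the legacy completion route removed, inference calls go through the OpenAI-compatible chat-completions API instead. A minimal sketch of the replacement request shape (the helper name and model identifier are illustrative assumptions, not code from this repository):

```python
# Hypothetical helper illustrating the OpenAI-compatible
# chat-completions payload that replaces a legacy
# /v1/inference/completion call. The model name below is an
# example, not one mandated by this provider.
def build_chat_request(prompt: str, model: str = "meta/llama-3.1-8b-instruct") -> dict:
    """Return a chat-completions request body for a single user prompt."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

req = build_chat_request("Hello")
```

The key difference from the removed endpoint is that the input is a list of role-tagged messages rather than a bare prompt string.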
__init__.py      add NVIDIA NIM inference adapter (#355)                           2024-11-23 15:59:00 -08:00
config.py        fix: allow default empty vars for conditionals (#2570)            2025-07-01 14:42:05 +02:00
NVIDIA.md        fix: remove inference.completion from docs (#3589)                2025-09-29 13:14:41 -07:00
nvidia.py        chore(api): remove deprecated embeddings impls (#3301)            2025-09-29 14:45:09 -04:00
openai_utils.py  chore: enable pyupgrade fixes (#1806)                             2025-05-01 14:23:50 -07:00
utils.py         refactor(logging): rename llama_stack logger categories (#3065)   2025-08-21 17:31:04 -07:00