forked from phoenix/litellm-mirror
docs(vllm.md): update docs to tell people to check openai-compatible endpoint docs for vllm
parent 1f6c342e94
commit f74a43aa78

1 changed file with 7 additions and 0 deletions
@@ -4,6 +4,13 @@ LiteLLM supports all models on VLLM.

 🚀[Code Tutorial](https://github.com/BerriAI/litellm/blob/main/cookbook/VLLM_Model_Testing.ipynb)
+
+:::info
+
+To call a HOSTED VLLM Endpoint use [these docs](./openai_compatible.md)
+
+:::
+
 ### Quick Start
 ```
 pip install litellm vllm
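For context, the added admonition points hosted-vLLM users at LiteLLM's OpenAI-compatible provider rather than the local `vllm` integration. A minimal sketch of that pattern follows; the `api_base` URL, model name, and the `build_request` helper are illustrative assumptions, not part of the commit — LiteLLM routes OpenAI-compatible servers by prefixing the model with `openai/` and passing the server URL as `api_base`.

```python
# Hypothetical sketch: preparing a litellm.completion() call against a
# hosted vLLM server exposed as an OpenAI-compatible endpoint.
# The URL and model name below are placeholders.

def build_request(api_base: str, model: str, prompt: str) -> dict:
    """Assemble kwargs for litellm.completion(): OpenAI-compatible
    endpoints use an "openai/"-prefixed model plus an explicit api_base."""
    return {
        "model": f"openai/{model}",
        "api_base": api_base,
        "messages": [{"role": "user", "content": prompt}],
    }

kwargs = build_request("http://localhost:8000/v1", "facebook/opt-125m", "Say hello")
# from litellm import completion
# response = completion(**kwargs)  # requires a running vLLM server
```

The actual call is left commented out since it needs a live endpoint; see the linked `openai_compatible.md` docs for the supported options.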