docs(docs): cleanup

Krrish Dholakia, 2023-10-13 13:03:57 -07:00
parent 9eb8524c09
commit c09104e577
2 changed files with 4 additions and 2 deletions

@@ -53,6 +53,9 @@ By default, LiteLLM raises an exception if the openai param being passed in isn't supported.
 To drop the param instead, set `litellm.drop_params = True`.
+
+**For function calling:**
+To add the function to the prompt for non-OpenAI models, set `litellm.add_function_to_prompt = True`.
 :::
 ## Provider-specific Params
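
For context, a minimal sketch of how the two settings documented in this hunk fit together. The model name and the `logit_bias` value are illustrative assumptions, not part of the commit:

```python
import litellm
from litellm import completion

# Drop unsupported OpenAI params instead of raising an exception.
litellm.drop_params = True

# For function calling on models without native support, inject the
# function definition into the prompt text.
litellm.add_function_to_prompt = True

# `logit_bias` is an OpenAI param that Anthropic models don't accept;
# with drop_params set, it is removed before the request is sent.
response = completion(
    model="claude-instant-1",  # illustrative non-OpenAI model
    messages=[{"role": "user", "content": "Hello"}],
    logit_bias={1234: 5},
)
```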

@@ -30,10 +30,9 @@ In order to use litellm to call a hosted vllm server add the following to your code:
 import litellm
 response = completion(
-    model="facebook/opt-125m", # pass the vllm model name
+    model="openai/facebook/opt-125m", # pass the vllm model name
     messages=messages,
     api_base="https://hosted-vllm-api.co",
-    custom_llm_provider="openai",
     temperature=0.2,
     max_tokens=80)
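
Put together, a call against a hosted vLLM server under the new convention would look roughly like this; the `api_base` URL is the docs' placeholder, not a live endpoint:

```python
from litellm import completion

messages = [{"role": "user", "content": "Hello, how are you?"}]

# The "openai/" prefix tells LiteLLM to route the request through its
# OpenAI-compatible client, so custom_llm_provider is no longer needed.
response = completion(
    model="openai/facebook/opt-125m",  # pass the vllm model name
    messages=messages,
    api_base="https://hosted-vllm-api.co",
    temperature=0.2,
    max_tokens=80,
)
print(response)
```

Encoding the provider in the model-name prefix keeps the call signature identical across providers, which is why the explicit `custom_llm_provider` argument could be dropped.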