From 2913c9525b0d334656c958d9631f5427e083c2e4 Mon Sep 17 00:00:00 2001
From: Krish Dholakia
Date: Mon, 26 Feb 2024 15:23:52 -0800
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index deef7ddf2..a7ce7b5ba 100644
--- a/README.md
+++ b/README.md
@@ -68,7 +68,7 @@ response = completion(model="command-nightly", messages=messages)
 print(response)
 ```
 
-To call any model supported by a provider, just use `model=<provider_name>/<model_name>`. This way, LiteLLM will know which provider to route it to. There might be provider-specific details here (e.g. for vertex ai, any unmapped model is assumed to be a model garden endpoint). So refer to [provider docs for more information](https://docs.litellm.ai/docs/providers)
+Call any model supported by a provider, with `model=<provider_name>/<model_name>`. There might be provider-specific details here, so refer to [provider docs for more information](https://docs.litellm.ai/docs/providers)
 
 ## Async ([Docs](https://docs.litellm.ai/docs/completion/stream#async-completion))
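The README text changed by this patch describes LiteLLM's `provider/model` naming convention: prefixing the model string with a provider name tells LiteLLM where to route the request. As an illustrative sketch only (this is not LiteLLM's internal routing code, and the provider/model strings below are just examples), the convention amounts to splitting on the first `/`:

```python
# Illustrative sketch of the "<provider_name>/<model_name>" convention
# described in the patched README text. Not LiteLLM's actual implementation.

def split_model_string(model: str) -> tuple[str, str]:
    """Split a model string into (provider, model_name).

    A string with no '/' has no explicit provider prefix, so the
    provider part comes back empty and routing falls back to
    provider inference (as the original README sentence implies).
    """
    provider, sep, model_name = model.partition("/")
    if not sep:
        # No prefix given, e.g. "command-nightly" from the diff context.
        return "", model

    # Only the first '/' separates provider from model; the model name
    # itself may contain further slashes (e.g. Hugging Face repo ids).
    return provider, model_name


# Example usage with a hypothetical Hugging Face-style model id:
print(split_model_string("huggingface/org/some-model"))
print(split_model_string("command-nightly"))
```

The key design point the README sentence hinges on is that only the first slash is significant, which is why `str.partition` (rather than `str.split`) models it cleanly.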