From 70a36740bc2d8728801b6b78783f0d3717eb9305 Mon Sep 17 00:00:00 2001
From: Krrish Dholakia
Date: Thu, 14 Sep 2023 10:47:38 -0700
Subject: [PATCH] update docs

---
 docs/my-website/docs/getting_started.md        | 6 +++---
 docs/my-website/docs/tutorials/azure_openai.md | 2 +-
 2 files changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/my-website/docs/getting_started.md b/docs/my-website/docs/getting_started.md
index 14f097c5d..f0cef0313 100644
--- a/docs/my-website/docs/getting_started.md
+++ b/docs/my-website/docs/getting_started.md
@@ -52,15 +52,15 @@ More details 👉
 LiteLLM maps exceptions across all supported providers to the OpenAI exceptions. All our exceptions inherit from OpenAI's exception types, so any error-handling you have for that, should work out of the box with LiteLLM.
 
 ```python
-from openai.errors import AuthenticationError
+from openai.errors import OpenAIError
 from litellm import completion
 
 os.environ["ANTHROPIC_API_KEY"] = "bad-key"
 try:
     # some code
     completion(model="claude-instant-1", messages=[{"role": "user", "content": "Hey, how's it going?"}])
-except AuthenticationError as e:
-    print(e.llm_provider)
+except OpenAIError as e:
+    print(e)
 ```
 
 More details 👉
diff --git a/docs/my-website/docs/tutorials/azure_openai.md b/docs/my-website/docs/tutorials/azure_openai.md
index 9b05fd861..16436550a 100644
--- a/docs/my-website/docs/tutorials/azure_openai.md
+++ b/docs/my-website/docs/tutorials/azure_openai.md
@@ -1,4 +1,4 @@
-# Use Completion() for OpenAI, Azure
+# Replacing OpenAI ChatCompletion with Completion()
 
 * [Supported OpenAI LLMs](https://docs.litellm.ai/docs/providers/openai)
 * [Supported Azure OpenAI LLMs](https://docs.litellm.ai/docs/providers/azure)
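
For reference, the tutorial retitled in the second hunk covers swapping `openai.ChatCompletion.create()` for LiteLLM's `completion()`. A minimal sketch of that swap is below; the API keys, endpoint, and deployment name are placeholders and are not taken from this patch.

```python
import os
from litellm import completion

# OpenAI call — completion() accepts the same model/messages arguments
# that openai.ChatCompletion.create() did.
os.environ["OPENAI_API_KEY"] = "your-openai-key"  # placeholder
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
)

# Azure OpenAI call — point `model` at your deployment using the `azure/` prefix.
os.environ["AZURE_API_KEY"] = "your-azure-key"  # placeholder
os.environ["AZURE_API_BASE"] = "https://<your-endpoint>.openai.azure.com"  # placeholder
os.environ["AZURE_API_VERSION"] = "2023-07-01-preview"  # placeholder
response = completion(
    model="azure/<your-deployment-name>",  # placeholder deployment name
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
)
```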