forked from phoenix/litellm-mirror
update docs for openai mapping
parent f7955d52b5
commit f0ccfecb0e
1 changed file with 5 additions and 0 deletions
@@ -34,12 +34,17 @@ To see how it's implemented - [check out the code](https://github.com/BerriAI/li
**Note:** For OpenAI and Azure we return the original exception (since it is already of the OpenAI error type), but we add the `llm_provider` attribute to it. [See code](https://github.com/BerriAI/litellm/blob/a42c197e5a6de56ea576c73715e6c7c6b19fa249/litellm/utils.py#L1221)
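As a rough sketch of that behavior (class and function names here are hypothetical stand-ins, not litellm's actual implementation), the original exception object is tagged with `llm_provider` and re-raised unchanged:

```python
# Hypothetical sketch: re-raise the original OpenAI-style exception
# after attaching an `llm_provider` attribute to it.
class MockOpenAIError(Exception):
    """Stand-in for the OpenAI error type; not litellm's real class."""
    pass

def add_provider_and_reraise(exc, provider):
    # Tag the exception so callers can tell which provider produced it,
    # then re-raise the *original* exception object unchanged.
    exc.llm_provider = provider
    raise exc

err = None
try:
    try:
        raise MockOpenAIError("Incorrect API key provided")
    except MockOpenAIError as e:
        add_provider_and_reraise(e, "azure")
except MockOpenAIError as caught:
    err = caught

print(type(err).__name__, err.llm_provider)  # MockOpenAIError azure
```

The caller still catches the familiar OpenAI error type; the extra attribute just identifies the provider.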
## custom mapping list
Base case - we return the original exception.
| LLM Provider | Initial Status Code / Initial Error Message | Returned Exception | Returned Status Code |
|----------------------|------------------------|-----------------|-----------------|
| Anthropic | 401 | AuthenticationError | 401 |
| Anthropic | Could not resolve authentication method. Expected either api_key or auth_token to be set. | AuthenticationError | 401 |
| Anthropic | 400 | InvalidRequestError | 400 |
| Anthropic | 429 | RateLimitError | 429 |
| OpenAI | This model's maximum context length is | ContextWindowExceededError | 400 |
| Replicate | Incorrect authentication token | AuthenticationError | 401 |
| Replicate | ModelError | InvalidRequestError | 400 |
| Replicate | Request was throttled | RateLimitError | 429 |
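The table above can be read as a simple lookup on provider plus status code or message substring. A minimal illustrative sketch (hypothetical names; see the linked litellm source for the real logic):

```python
# Illustrative exception classes mirroring the table; these are
# stand-ins, not litellm's actual exception hierarchy.
class AuthenticationError(Exception): pass
class InvalidRequestError(Exception): pass
class RateLimitError(Exception): pass
class ContextWindowExceededError(Exception): pass

def map_exception(provider, status=None, message=""):
    """Return the mapped exception for a provider error, or None for
    the base case (caller re-raises the original exception)."""
    if provider == "anthropic":
        if status == 401 or "Could not resolve authentication method" in message:
            return AuthenticationError(message)
        if status == 400:
            return InvalidRequestError(message)
        if status == 429:
            return RateLimitError(message)
    elif provider == "openai":
        if "maximum context length" in message:
            return ContextWindowExceededError(message)
    elif provider == "replicate":
        if "Incorrect authentication token" in message:
            return AuthenticationError(message)
        if "ModelError" in message:
            return InvalidRequestError(message)
        if "Request was throttled" in message:
            return RateLimitError(message)
    return None  # base case: the original exception is returned as-is
```

Returning `None` here models the base case from the docs: anything the table does not cover surfaces as the provider's original exception.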