From 9d02d51a1728bb9ceed0e00face2d50b51b0efea Mon Sep 17 00:00:00 2001
From: Krrish Dholakia
Date: Sat, 13 Jul 2024 09:56:06 -0700
Subject: [PATCH] docs(pass_through.md): cleanup docs

---
 docs/my-website/docs/proxy/pass_through.md | 6 ++----
 1 file changed, 2 insertions(+), 4 deletions(-)

diff --git a/docs/my-website/docs/proxy/pass_through.md b/docs/my-website/docs/proxy/pass_through.md
index 82a374503..092b2424b 100644
--- a/docs/my-website/docs/proxy/pass_through.md
+++ b/docs/my-website/docs/proxy/pass_through.md
@@ -222,15 +222,13 @@ general_settings:
 
 ## Custom Chat Endpoints (Anthropic/Bedrock/Vertex)
 
-Allow developers to call the proxy with Anthropic sdk/boto3/etc.
+Allow developers to call the proxy with Anthropic/boto3/etc. client sdk's.
 
 Use our [Anthropic Adapter](https://github.com/BerriAI/litellm/blob/fd743aaefd23ae509d8ca64b0c232d25fe3e39ee/litellm/adapters/anthropic_adapter.py#L50) for reference
 
 ### 1. Write an Adapter
 
-Translate the request/response from your custom API schema to the OpenAI schema and back.
-
-This is used internally to do Logging, Guardrails, etc. in a consistent format.
+Translate the request/response from your custom API schema to the OpenAI schema (used by litellm.completion()) and back.
 
 For provider-specific params 👉 [**Provider-Specific Params**](../completion/provider_specific_params.md)
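For context on the "Write an Adapter" step this patch edits, here is a minimal, self-contained sketch of the kind of translation an adapter performs. The class and method names (`MyCustomAdapter`, `translate_request_to_openai`, `translate_openai_response_to_custom`) are hypothetical; a real adapter should subclass the base class and implement the hooks shown in the linked AnthropicAdapter reference file.

```python
# Rough sketch only: maps an Anthropic-style /v1/messages request body into the
# OpenAI chat-completions schema (what litellm.completion() accepts) and maps the
# response back. Class and method names here are hypothetical; follow the base
# class and hook names in the linked AnthropicAdapter for a real adapter.
from typing import Any, Dict


class MyCustomAdapter:
    def translate_request_to_openai(self, custom_request: Dict[str, Any]) -> Dict[str, Any]:
        """Map a custom (Anthropic-style) request body to the OpenAI schema."""
        openai_request: Dict[str, Any] = {
            "model": custom_request["model"],
            "messages": list(custom_request.get("messages", [])),
            "max_tokens": custom_request.get("max_tokens", 1024),
        }
        # Anthropic carries the system prompt as a top-level field; the OpenAI
        # schema expects it as the first message.
        if "system" in custom_request:
            openai_request["messages"].insert(
                0, {"role": "system", "content": custom_request["system"]}
            )
        return openai_request

    def translate_openai_response_to_custom(self, openai_response: Dict[str, Any]) -> Dict[str, Any]:
        """Map an OpenAI-schema response back to the custom (Anthropic-style) schema."""
        choice = openai_response["choices"][0]
        return {
            "type": "message",
            "role": "assistant",
            "content": [{"type": "text", "text": choice["message"]["content"]}],
            "stop_reason": choice.get("finish_reason"),
        }
```

Keeping the translation in one place like this is what lets the proxy handle logging, guardrails, etc. in a consistent format, since the rest of the pipeline only ever sees the OpenAI schema.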