diff --git a/docs/my-website/docs/providers/anthropic.md b/docs/my-website/docs/providers/anthropic.md
index 290e094d0..c28f97ea0 100644
--- a/docs/my-website/docs/providers/anthropic.md
+++ b/docs/my-website/docs/providers/anthropic.md
@@ -957,3 +957,74 @@ curl http://0.0.0.0:4000/v1/chat/completions \
 ```
+
+## Usage - passing 'user_id' to Anthropic
+
+LiteLLM translates the OpenAI `user` param to Anthropic's `metadata[user_id]` param.
+
+<Tabs>
+<TabItem value="sdk" label="SDK">
+
+```python
+from litellm import completion
+
+messages = [{"role": "user", "content": "What is Anthropic?"}]
+
+response = completion(
+    model="claude-3-5-sonnet-20240620",
+    messages=messages,
+    user="user_123",
+)
+```
+
+</TabItem>
+<TabItem value="proxy" label="PROXY">
+
+1. Setup config.yaml
+
+```yaml
+model_list:
+  - model_name: claude-3-5-sonnet-20240620
+    litellm_params:
+      model: anthropic/claude-3-5-sonnet-20240620
+      api_key: os.environ/ANTHROPIC_API_KEY
+```
+
+2. Start Proxy
+
+```
+litellm --config /path/to/config.yaml
+```
+
+3. Test it!
+
+```bash
+curl http://0.0.0.0:4000/v1/chat/completions \
+  -H "Content-Type: application/json" \
+  -H "Authorization: Bearer " \
+  -d '{
+    "model": "claude-3-5-sonnet-20240620",
+    "messages": [{"role": "user", "content": "What is Anthropic?"}],
+    "user": "user_123"
+  }'
+```
+
+</TabItem>
+</Tabs>
+
+## All Supported OpenAI Params
+
+```
+"stream",
+"stop",
+"temperature",
+"top_p",
+"max_tokens",
+"max_completion_tokens",
+"tools",
+"tool_choice",
+"extra_headers",
+"parallel_tool_calls",
+"response_format",
+"user"
+```
\ No newline at end of file
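
The `user` → `metadata[user_id]` translation the docs describe can be sketched as a small standalone function. This is a hypothetical illustration, not LiteLLM's actual internal transformation code; the helper name `to_anthropic_payload` is made up for this sketch.

```python
# Hypothetical sketch of the param mapping the docs describe:
# the OpenAI-style `user` field becomes Anthropic's `metadata.user_id`.
# Not LiteLLM's actual internals.
def to_anthropic_payload(openai_request: dict) -> dict:
    payload = {
        "model": openai_request["model"],
        "messages": openai_request["messages"],
        # Anthropic's Messages API requires max_tokens; default is illustrative.
        "max_tokens": openai_request.get("max_tokens", 1024),
    }
    user = openai_request.get("user")
    if user is not None:
        # OpenAI `user` -> Anthropic `metadata.user_id`
        payload["metadata"] = {"user_id": user}
    return payload

payload = to_anthropic_payload({
    "model": "claude-3-5-sonnet-20240620",
    "messages": [{"role": "user", "content": "What is Anthropic?"}],
    "user": "user_123",
})
print(payload["metadata"])  # {'user_id': 'user_123'}
```

When `user` is omitted, no `metadata` key is attached, mirroring the fact that the param is optional in both APIs.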