diff --git a/docs/my-website/docs/proxy/bucket.md b/docs/my-website/docs/proxy/bucket.md
new file mode 100644
index 000000000..963b8a4f6
--- /dev/null
+++ b/docs/my-website/docs/proxy/bucket.md
@@ -0,0 +1,129 @@
+
+# 🪣 Logging GCS, S3 Buckets
+
+LiteLLM supports logging to the following cloud buckets:
+- (Enterprise) ✨ [Google Cloud Storage Buckets](#logging-proxy-inputoutput-to-google-cloud-storage-buckets)
+- (Free OSS) [Amazon S3 Buckets](#logging-proxy-inputoutput---s3-buckets)
+
+## Logging Proxy Input/Output to Google Cloud Storage Buckets
+
+Log LLM input/output to [Google Cloud Storage Buckets](https://cloud.google.com/storage?hl=en)
+
+:::info
+
+✨ This is an Enterprise-only feature. [Get started with Enterprise here](https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat)
+
+:::
+
+
+### Usage
+
+1. Add `gcs_bucket` to your LiteLLM `config.yaml`
+```yaml
+model_list:
+- litellm_params:
+    api_base: https://openai-function-calling-workers.tasslexyz.workers.dev/
+    api_key: my-fake-key
+    model: openai/my-fake-model
+  model_name: fake-openai-endpoint
+
+litellm_settings:
+  callbacks: ["gcs_bucket"] # 👈 KEY CHANGE
+```
+
+2. Set the required environment variables
+
+```shell
+GCS_BUCKET_NAME=""
+GCS_PATH_SERVICE_ACCOUNT="/Users/ishaanjaffer/Downloads/adroit-crow-413218-a956eef1a2a8.json" # Add path to service account.json
+```
+
+3. Start the proxy
+
+```shell
+litellm --config /path/to/config.yaml
+```
+
+4. Test it!
+
+```bash
+curl --location 'http://0.0.0.0:4000/chat/completions' \
+--header 'Content-Type: application/json' \
+--data '{
+    "model": "fake-openai-endpoint",
+    "messages": [
+        {
+            "role": "user",
+            "content": "what llm are you"
+        }
+    ]
+}'
+```
+
+
+### Expected Logs on GCS Buckets
+
+
+## Logging Proxy Input/Output - S3 Buckets
+
+We will use the `--config` flag to set
+
+- `litellm.success_callback = ["s3"]`
+
+This will log all successful LLM calls to an S3 bucket
+
+**Step 1**: Set AWS credentials in `.env`
+
+```shell
+AWS_ACCESS_KEY_ID = ""
+AWS_SECRET_ACCESS_KEY = ""
+AWS_REGION_NAME = ""
+```
+
+**Step 2**: Create a `config.yaml` file and set `litellm_settings`: `success_callback`
+
+```yaml
+model_list:
+ - model_name: gpt-3.5-turbo
+   litellm_params:
+     model: gpt-3.5-turbo
+litellm_settings:
+  success_callback: ["s3"]
+  s3_callback_params:
+    s3_bucket_name: logs-bucket-litellm # AWS Bucket Name for S3
+    s3_region_name: us-west-2 # AWS Region Name for S3
+    s3_aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID # use os.environ/ to pass environment variables. This is the AWS Access Key ID for S3
+    s3_aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY # AWS Secret Access Key for S3
+    s3_path: my-test-path # [OPTIONAL] set path in bucket you want to write logs to
+    s3_endpoint_url: https://s3.amazonaws.com # [OPTIONAL] S3 endpoint URL, if you want to use Backblaze/Cloudflare S3 buckets
+```
+
+**Step 3**: Start the proxy, make a test request
+
+Start proxy
+
+```shell
+litellm --config config.yaml --debug
+```
+
+Test Request
+
+```shell
+curl --location 'http://0.0.0.0:4000/chat/completions' \
+  --header 'Content-Type: application/json' \
+  --data '{
+    "model": "Azure OpenAI GPT-4 East",
+    "messages": [
+        {
+            "role": "user",
+            "content": "what llm are you"
+        }
+    ]
+  }'
+```
+
+Your logs should be available on the specified S3 bucket
diff --git a/docs/my-website/docs/proxy/logging.md b/docs/my-website/docs/proxy/logging.md
index 81d6c8767..c4098bf75 100644
--- a/docs/my-website/docs/proxy/logging.md
+++ b/docs/my-website/docs/proxy/logging.md
@@ -8,7 +8,6 @@ Log Proxy input, output, and exceptions using:
 - Langsmith
 - DataDog
 - DynamoDB
-- s3 Bucket
 - etc.
 
 import Image from '@theme/IdealImage';
@@ -1379,66 +1378,6 @@ Expected output on Datadog
 
-## Logging Proxy Input/Output - s3 Buckets
-
-We will use the `--config` to set
-
-- `litellm.success_callback = ["s3"]`
-
-This will log all successfull LLM calls to s3 Bucket
-
-**Step 1** Set AWS Credentials in .env
-
-```shell
-AWS_ACCESS_KEY_ID = ""
-AWS_SECRET_ACCESS_KEY = ""
-AWS_REGION_NAME = ""
-```
-
-**Step 2**: Create a `config.yaml` file and set `litellm_settings`: `success_callback`
-
-```yaml
-model_list:
- - model_name: gpt-3.5-turbo
-   litellm_params:
-     model: gpt-3.5-turbo
-litellm_settings:
-  success_callback: ["s3"]
-  s3_callback_params:
-    s3_bucket_name: logs-bucket-litellm # AWS Bucket Name for S3
-    s3_region_name: us-west-2 # AWS Region Name for S3
-    s3_aws_access_key_id: os.environ/AWS_ACCESS_KEY_ID # us os.environ/ to pass environment variables. This is AWS Access Key ID for S3
-    s3_aws_secret_access_key: os.environ/AWS_SECRET_ACCESS_KEY # AWS Secret Access Key for S3
-    s3_path: my-test-path # [OPTIONAL] set path in bucket you want to write logs to
-    s3_endpoint_url: https://s3.amazonaws.com # [OPTIONAL] S3 endpoint URL, if you want to use Backblaze/cloudflare s3 buckets
-```
-
-**Step 3**: Start the proxy, make a test request
-
-Start proxy
-
-```shell
-litellm --config config.yaml --debug
-```
-
-Test Request
-
-```shell
-curl --location 'http://0.0.0.0:4000/chat/completions' \
-  --header 'Content-Type: application/json' \
-  --data ' {
-  "model": "Azure OpenAI GPT-4 East",
-  "messages": [
-    {
-      "role": "user",
-      "content": "what llm are you"
-    }
-  ]
-  }'
-```
-
-Your logs should be available on the specified s3 Bucket
-
 ## Logging Proxy Input/Output - DynamoDB
 
 We will use the `--config` to set
diff --git a/docs/my-website/sidebars.js b/docs/my-website/sidebars.js
index e5a40eaa7..61aca1169 100644
--- a/docs/my-website/sidebars.js
+++ b/docs/my-website/sidebars.js
@@ -47,7 +47,7 @@ const sidebars = {
     {
       type: "category",
       label: "🪢 Logging",
-      items: ["proxy/logging", "proxy/streaming_logging"],
+      items: ["proxy/logging", "proxy/bucket", "proxy/streaming_logging"],
     },
     "proxy/team_logging",
     "proxy/guardrails",
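Review note: the `os.environ/VAR_NAME` convention used in `s3_callback_params` trips people up, so it may be worth a sanity check. The sketch below is an illustrative reimplementation of that lookup, not LiteLLM's actual code; the `s3_callback_params` dict and the `AKIA-example` value are stand-ins for the values in the docs above.

```python
import os

# Stand-in for the s3_callback_params block from the example config
s3_callback_params = {
    "s3_bucket_name": "logs-bucket-litellm",
    "s3_region_name": "us-west-2",
    "s3_aws_access_key_id": "os.environ/AWS_ACCESS_KEY_ID",
    "s3_aws_secret_access_key": "os.environ/AWS_SECRET_ACCESS_KEY",
}

def resolve(value: str) -> str:
    """Resolve an 'os.environ/VAR' reference to the value of $VAR; pass plain strings through."""
    prefix = "os.environ/"
    if value.startswith(prefix):
        return os.environ.get(value[len(prefix):], "")
    return value

# Pretend credentials, set only for this demo
os.environ["AWS_ACCESS_KEY_ID"] = "AKIA-example"
os.environ["AWS_SECRET_ACCESS_KEY"] = "secret-example"

resolved = {k: resolve(v) for k, v in s3_callback_params.items()}
print(resolved["s3_aws_access_key_id"])  # -> AKIA-example
```

Literal values (like `s3_bucket_name` here) pass through untouched, while anything prefixed with `os.environ/` is read from the environment at load time, so secrets stay out of the committed config.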