diff --git a/docs/my-website/docs/observability/langsmith_integration.md b/docs/my-website/docs/observability/langsmith_integration.md
index c038abd82..79d047e33 100644
--- a/docs/my-website/docs/observability/langsmith_integration.md
+++ b/docs/my-website/docs/observability/langsmith_integration.md
@@ -14,7 +14,7 @@ https://github.com/BerriAI/litellm
 An all-in-one developer platform for every step of the application lifecycle
 https://smith.langchain.com/
 
- 
+
 :::info
 We want to learn how we can make the callbacks better! Meet the LiteLLM [founders](https://calendly.com/d/4mp-gd3-k5k/berriai-1-1-onboarding-litellm-hosted-version) or
diff --git a/docs/my-website/docs/proxy/logging.md b/docs/my-website/docs/proxy/logging.md
index 27f1789e0..0d5016645 100644
--- a/docs/my-website/docs/proxy/logging.md
+++ b/docs/my-website/docs/proxy/logging.md
@@ -5,6 +5,7 @@ Log Proxy input, output, and exceptions using:
 - Langfuse
 - OpenTelemetry
 - Custom Callbacks
+- Langsmith
 - DataDog
 - DynamoDB
 - s3 Bucket
@@ -1086,6 +1087,50 @@ litellm_settings:
 
 Start the LiteLLM Proxy and make a test request to verify the logs reached your callback API
 
+## Logging LLM IO to Langsmith
+
+1. Set `success_callback: ["langsmith"]` in your litellm config.yaml
+
+If you're using a custom Langsmith instance, set the `LANGSMITH_BASE_URL`
+environment variable to point to your instance.
+
+```yaml
+litellm_settings:
+  success_callback: ["langsmith"]
+
+environment_variables:
+  LANGSMITH_API_KEY: "lsv2_pt_xxxxxxxx"
+  LANGSMITH_PROJECT: "litellm-proxy"
+
+  LANGSMITH_BASE_URL: "https://api.smith.langchain.com" # optional - only needed for a custom Langsmith instance
+```
+
+2. Start the proxy
+
+```shell
+litellm --config /path/to/config.yaml
+```
+
+3. Test it!
+
+```bash
+curl --location 'http://0.0.0.0:4000/chat/completions' \
+--header 'Content-Type: application/json' \
+--data '{
+    "model": "fake-openai-endpoint",
+    "messages": [
+      {
+        "role": "user",
+        "content": "Hello, Claude gm!"
+      }
+    ]
+  }'
+```
+
+Expect to see your log in Langsmith
+
 ## Logging LLM IO to Galileo [BETA]
diff --git a/docs/my-website/img/langsmith_new.png b/docs/my-website/img/langsmith_new.png
new file mode 100644
index 000000000..d5586bdbe
Binary files /dev/null and b/docs/my-website/img/langsmith_new.png differ
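The request body in the curl example added above can also be built programmatically, which avoids JSON slips such as trailing commas. A minimal Python sketch using only the standard library (the model name is the placeholder from the docs):

```python
import json

# Build the same chat-completions payload the curl example
# sends to the LiteLLM proxy.
payload = {
    "model": "fake-openai-endpoint",  # placeholder model from the docs
    "messages": [
        {"role": "user", "content": "Hello, Claude gm!"},
    ],
}

# Serialize to the JSON string passed as the request body.
body = json.dumps(payload)
print(body)
```

POSTing `body` to `http://0.0.0.0:4000/chat/completions` with a `Content-Type: application/json` header is equivalent to the curl call, assuming the proxy is running locally.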