add missing references

Vince Lwt 2023-08-21 21:38:44 +02:00
parent 61afceece1
commit 342b83544f
6 changed files with 52 additions and 43 deletions


@@ -6,8 +6,8 @@ liteLLM provides `success_callbacks` and `failure_callbacks`, making it easy for
 liteLLM supports:
-- [Helicone](https://docs.helicone.ai/introduction)
 - [LLMonitor](https://llmonitor.com/docs)
+- [Helicone](https://docs.helicone.ai/introduction)
 - [Sentry](https://docs.sentry.io/platforms/python/)
 - [PostHog](https://posthog.com/docs/libraries/python)
 - [Slack](https://slack.dev/bolt-python/concepts)


@@ -19,34 +19,44 @@ const sidebars = {
   tutorialSidebar: [
     { type: "doc", id: "index" }, // NEW
     {
-      type: 'category',
-      label: 'Completion()',
-      items: ['completion/input','completion/output'],
+      type: "category",
+      label: "Completion()",
+      items: ["completion/input", "completion/output"],
     },
     {
-      type: 'category',
-      label: 'Embedding()',
-      items: ['embedding/supported_embedding'],
+      type: "category",
+      label: "Embedding()",
+      items: ["embedding/supported_embedding"],
     },
-    'debugging/local_debugging',
-    'completion/supported',
+    "debugging/local_debugging",
+    "completion/supported",
     {
-      type: 'category',
-      label: 'Tutorials',
-      items: ['tutorials/huggingface_tutorial', 'tutorials/TogetherAI_liteLLM', 'tutorials/debugging_tutorial'],
+      type: "category",
+      label: "Tutorials",
+      items: [
+        "tutorials/huggingface_tutorial",
+        "tutorials/TogetherAI_liteLLM",
+        "tutorials/debugging_tutorial",
+      ],
     },
-    'token_usage',
-    'stream',
-    'secret',
-    'caching',
+    "token_usage",
+    "stream",
+    "secret",
+    "caching",
     {
-      type: 'category',
-      label: 'Logging & Observability',
-      items: ['observability/callbacks', 'observability/integrations', 'observability/helicone_integration', 'observability/supabase_integration'],
+      type: "category",
+      label: "Logging & Observability",
+      items: [
+        "observability/callbacks",
+        "observability/integrations",
+        "observability/llmonitor_integration",
+        "observability/helicone_integration",
+        "observability/supabase_integration",
+      ],
     },
-    'troubleshoot',
-    'contributing',
-    'contact'
+    "troubleshoot",
+    "contributing",
+    "contact",
   ],
 };


@@ -1,22 +1,25 @@
 # Callbacks
 ## Use Callbacks to send Output Data to Posthog, Sentry etc
 liteLLM provides `success_callbacks` and `failure_callbacks`, making it easy for you to send data to a particular provider depending on the status of your responses.
 liteLLM supports:
+- [LLMonitor](https://llmonitor.com/docs)
 - [Helicone](https://docs.helicone.ai/introduction)
 - [Sentry](https://docs.sentry.io/platforms/python/)
 - [PostHog](https://posthog.com/docs/libraries/python)
 - [Slack](https://slack.dev/bolt-python/concepts)
 ### Quick Start
 ```python
 from litellm import completion
 # set callbacks
-litellm.success_callback=["posthog", "helicone"]
-litellm.failure_callback=["sentry"]
+litellm.success_callback=["posthog", "helicone", "llmonitor"]
+litellm.failure_callback=["sentry", "llmonitor"]
 ## set env variables
 os.environ['SENTRY_API_URL'], os.environ['SENTRY_API_TRACE_RATE']= ""
@@ -25,5 +28,3 @@ os.environ["HELICONE_API_KEY"] = ""
 response = completion(model="gpt-3.5-turbo", messages=messages)
 ```
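The pattern this Quick Start documents — sending each response to one set of sinks on success and a different set on failure — can be sketched in plain Python. All names below are illustrative, not litellm's internals; the real library dispatches to the providers named in its `success_callback` and `failure_callback` lists.

```python
# Illustrative sketch of success/failure callback routing, in the spirit of
# litellm's `success_callback` / `failure_callback` lists. Hypothetical code,
# not litellm's actual implementation.

success_callback = ["posthog", "helicone", "llmonitor"]
failure_callback = ["sentry", "llmonitor"]

sent = []  # records (provider, status) pairs, for demonstration only

def log_event(provider, status, payload):
    # A real integration would POST `payload` to the provider's API here.
    sent.append((provider, status))

def run_with_callbacks(fn, *args, **kwargs):
    """Run `fn`; notify failure sinks on exception, success sinks otherwise."""
    try:
        result = fn(*args, **kwargs)
    except Exception as err:
        for provider in failure_callback:
            log_event(provider, "failure", {"error": str(err)})
        raise
    for provider in success_callback:
        log_event(provider, "success", {"response": result})
    return result

run_with_callbacks(lambda: "ok")  # all three success sinks fire
```

On failure the function re-raises after notifying the failure sinks, so callers still see the original exception — matching the idea that logging callbacks should observe, not swallow, errors.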


@@ -1,12 +1,9 @@
 # Logging Integrations
 | Integration | Required OS Variables | How to Use with callbacks |
-|-----------------|--------------------------------------------|-------------------------------------------|
+| ----------- | -------------------------------------------------------- | ---------------------------------------- |
+| LLMonitor | `LLMONITOR_APP_ID` | `litellm.success_callback=["llmonitor"]` |
 | Sentry | `SENTRY_API_URL` | `litellm.success_callback=["sentry"]` |
 | Posthog | `POSTHOG_API_KEY`,`POSTHOG_API_URL` | `litellm.success_callback=["posthog"]` |
 | Slack | `SLACK_API_TOKEN`,`SLACK_API_SECRET`,`SLACK_API_CHANNEL` | `litellm.success_callback=["slack"]` |
 | Helicone | `HELICONE_API_TOKEN` | `litellm.success_callback=["helicone"]` |
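Since each integration in the table reads its credentials from environment variables, a small preflight check can catch a missing variable before the callback silently fails. This is a hypothetical helper, not part of litellm; the variable names mirror the table above.

```python
# Hypothetical preflight check: verify an integration's required environment
# variables (per the table above) before enabling its callback.
import os

REQUIRED_VARS = {
    "llmonitor": ["LLMONITOR_APP_ID"],
    "sentry": ["SENTRY_API_URL"],
    "posthog": ["POSTHOG_API_KEY", "POSTHOG_API_URL"],
    "slack": ["SLACK_API_TOKEN", "SLACK_API_SECRET", "SLACK_API_CHANNEL"],
    "helicone": ["HELICONE_API_TOKEN"],
}

def missing_vars(integration):
    """Return the required variables that are unset or empty for `integration`."""
    return [v for v in REQUIRED_VARS[integration] if not os.environ.get(v)]

os.environ["LLMONITOR_APP_ID"] = "app_123"  # placeholder value
print(missing_vars("llmonitor"))  # []
```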


@@ -16,6 +16,7 @@ nav:
   - 💾 Callbacks - Logging Output:
     - Quick Start: advanced.md
     - Output Integrations: client_integrations.md
+    - LLMonitor Tutorial: llmonitor_integration.md
     - Helicone Tutorial: helicone_integration.md
     - Supabase Tutorial: supabase_integration.md
     - BerriSpend Tutorial: berrispend_integration.md


@@ -33,7 +33,7 @@
 - Call all models using the OpenAI format - `completion(model, messages)`
 - Text responses will always be available at `['choices'][0]['message']['content']`
 - **Error Handling** Using Model Fallbacks (if `GPT-4` fails, try `llama2`)
-- **Logging** - Log Requests, Responses and Errors to `Supabase`, `Posthog`, `Mixpanel`, `Sentry`, `Helicone`, `LLMonitor` (Any of the supported providers here: https://litellm.readthedocs.io/en/latest/advanced/
+- **Logging** - Log Requests, Responses and Errors to `Supabase`, `Posthog`, `Mixpanel`, `Sentry`, `LLMonitor`, `Helicone` (Any of the supported providers here: https://litellm.readthedocs.io/en/latest/advanced/
 **Example: Logs sent to Supabase**
 <img width="1015" alt="Screenshot 2023-08-11 at 4 02 46 PM" src="https://github.com/ishaan-jaff/proxy-server/assets/29436595/237557b8-ba09-4917-982c-8f3e1b2c8d08">