forked from phoenix/litellm-mirror
update logging docs
This commit is contained in:
parent
18f1a42c5a
commit
b4acd483c4
12 changed files with 87 additions and 6 deletions
@@ -2,6 +2,15 @@ import Image from '@theme/IdealImage';
 
 # Athina
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 [Athina](https://athina.ai/) is an evaluation framework and production monitoring platform for your LLM-powered app. Athina is designed to enhance the performance and reliability of AI applications through real-time monitoring, granular analytics, and plug-and-play evaluations.
 
 <Image img={require('../../img/athina_dashboard.png')} />
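The logging integrations touched in this commit (Athina, Langfuse, Lunary, etc.) all hang off the same mechanism: you register a logger under a string name and every successful completion is fanned out to each configured name. The sketch below illustrates that pattern generically; it is not LiteLLM's actual internals, and all names in it are hypothetical.

```python
# Illustrative sketch of the named-callback fan-out pattern used by
# SDK-side logging integrations. Not LiteLLM's real implementation.
from typing import Callable, Dict, List

LOGGERS: Dict[str, Callable[[dict], None]] = {}
success_callback: List[str] = []  # e.g. ["athina", "langfuse"]


def register(name: str) -> Callable:
    """Decorator that registers a logger under a string name."""
    def wrap(fn: Callable[[dict], None]) -> Callable[[dict], None]:
        LOGGERS[name] = fn
        return fn
    return wrap


@register("athina")
def athina_logger(event: dict) -> None:
    # A real integration would POST `event` to the platform's ingest API.
    event.setdefault("sinks", []).append("athina")


def log_success(event: dict) -> None:
    """Dispatch one completion event to every configured logger."""
    for name in success_callback:
        LOGGERS[name](event)


success_callback.append("athina")
event = {"model": "gpt-3.5-turbo", "total_tokens": 42}
log_success(event)
print(event["sinks"])  # ['athina']
```

Because the callbacks are addressed by string name, enabling a second platform is just appending another name to the list, with no changes at the call sites.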
@@ -1,5 +1,14 @@
 # Greenscale - Track LLM Spend and Responsible Usage
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 [Greenscale](https://greenscale.ai/) is a production monitoring platform for your LLM-powered app that provides you granular key insights into your GenAI spending and responsible usage. Greenscale only captures metadata to minimize the exposure risk of personally identifiable information (PII).
 
 ## Getting Started
@@ -1,4 +1,13 @@
 # Helicone Tutorial
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 [Helicone](https://helicone.ai/) is an open source observability platform that proxies your OpenAI traffic and provides you key insights into your spend, latency and usage.
 
 ## Use Helicone to log requests across all LLM Providers (OpenAI, Azure, Anthropic, Cohere, Replicate, PaLM)
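Unlike the SDK-callback integrations, a proxying observability platform like Helicone sits on the network path: you repoint the OpenAI base URL at the proxy and it records traffic as it forwards it. A minimal sketch of that URL rewrite, with a placeholder proxy host (the real Helicone endpoint is not assumed here):

```python
# Hedged sketch of proxy-based logging: rewrite the OpenAI base URL so
# requests flow through an observability proxy. Proxy host is a placeholder.
OPENAI_BASE = "https://api.openai.com/v1"


def route_via_proxy(url: str, proxy_base: str) -> str:
    """Rewrite an OpenAI API URL so traffic flows through the proxy."""
    if url.startswith(OPENAI_BASE):
        return proxy_base + url[len(OPENAI_BASE):]
    return url  # non-OpenAI traffic passes through unchanged


routed = route_via_proxy(
    "https://api.openai.com/v1/chat/completions",
    "https://example-proxy.invalid/v1",  # placeholder proxy endpoint
)
print(routed)  # https://example-proxy.invalid/v1/chat/completions
```

The appeal of this approach is that it needs no per-call instrumentation: anything that honors the base-URL setting is logged automatically.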
@@ -1,6 +1,6 @@
 import Image from '@theme/IdealImage';
 
-# Langfuse - Logging LLM Input/Output
+# 🔥 Langfuse - Logging LLM Input/Output
 
 LangFuse is open Source Observability & Analytics for LLM Apps
 Detailed production traces and a granular view on quality, cost and latency
@@ -1,6 +1,16 @@
 import Image from '@theme/IdealImage';
 
 # Langsmith - Logging LLM Input/Output
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 An all-in-one developer platform for every step of the application lifecycle
 https://smith.langchain.com/
@@ -1,6 +1,6 @@
 import Image from '@theme/IdealImage';
 
-# Logfire - Logging LLM Input/Output
+# 🔥 Logfire - Logging LLM Input/Output
 
 Logfire is open Source Observability & Analytics for LLM Apps
 Detailed production traces and a granular view on quality, cost and latency
@@ -1,5 +1,13 @@
 # Lunary - Logging and tracing LLM input/output
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 [Lunary](https://lunary.ai/) is an open-source AI developer platform providing observability, prompt management, and evaluation tools for AI developers.
 
 <video controls width='900' >
@@ -2,6 +2,15 @@ import Image from '@theme/IdealImage';
 
 # Promptlayer Tutorial
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 Promptlayer is a platform for prompt engineers. Log OpenAI requests. Search usage history. Track performance. Visually manage prompt templates.
 
 <Image img={require('../../img/promptlayer.png')} />
@@ -1,5 +1,14 @@
 import Image from '@theme/IdealImage';
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 # Sentry - Log LLM Exceptions
 [Sentry](https://sentry.io/) provides error monitoring for production. LiteLLM can add breadcrumbs and send exceptions to Sentry with this integration
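The Sentry page above describes a failure-path hook rather than a success logger: exceptions raised by an LLM call are captured and forwarded to the monitor before being re-raised to the caller. A self-contained sketch of that shape (assumed for illustration, not LiteLLM's actual code):

```python
# Illustrative sketch of failure-path error monitoring. A real
# integration would call the Sentry SDK inside capture_exception().
from typing import List

captured: List[str] = []


def capture_exception(exc: Exception) -> None:
    # Stand-in for forwarding the exception to an error monitor.
    captured.append(type(exc).__name__)


def completion_with_monitoring(raise_error: bool) -> str:
    try:
        if raise_error:
            raise TimeoutError("upstream LLM timed out")
        return "ok"
    except Exception as exc:
        capture_exception(exc)  # exception/breadcrumb goes to the monitor
        raise                   # caller still sees the original error


try:
    completion_with_monitoring(raise_error=True)
except TimeoutError:
    pass
print(captured)  # ['TimeoutError']
```

The key design point is the bare `raise`: monitoring must observe failures without swallowing them, so the caller's error handling is unchanged.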
@@ -1,4 +1,12 @@
 # Supabase Tutorial
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 [Supabase](https://supabase.com/) is an open source Firebase alternative.
 Start your project with a Postgres database, Authentication, instant APIs, Edge Functions, Realtime subscriptions, Storage, and Vector embeddings.
@@ -1,6 +1,16 @@
 import Image from '@theme/IdealImage';
 
 # Weights & Biases - Logging LLM Input/Output
 
+:::tip
+
+This is community maintained, Please make an issue if you run into a bug
+
+https://github.com/BerriAI/litellm
+
+:::
+
 Weights & Biases helps AI developers build better models faster https://wandb.ai
 
 <Image img={require('../../img/wandb.png')} />
@@ -172,10 +172,8 @@ const sidebars = {
         "proxy/custom_pricing",
         "routing",
         "scheduler",
-        "rules",
         "set_keys",
         "budget_manager",
-        "contributing",
         "secret",
         "completion/token_usage",
         "load_test",
@@ -183,11 +181,11 @@ const sidebars = {
       type: "category",
       label: "Logging & Observability",
      items: [
+        "observability/langfuse_integration",
+        "observability/logfire_integration",
         "debugging/local_debugging",
         "observability/raw_request_response",
-        "observability/callbacks",
         "observability/custom_callback",
-        "observability/langfuse_integration",
         "observability/sentry",
         "observability/lago",
         "observability/openmeter",
@@ -233,6 +231,8 @@ const sidebars = {
       label: "Extras",
       items: [
         "extras/contributing",
+        "contributing",
+        "rules",
         "proxy_server",
         {
           type: "category",