forked from phoenix/litellm-mirror
adding support for supabase integration
This commit is contained in:
parent
31ca0839e5
commit
75057c72d2
15 changed files with 259 additions and 23 deletions
75
docs/supabase_integration.md
Normal file
@@ -0,0 +1,75 @@
# Supabase Tutorial

[Supabase](https://supabase.com/) is an open source Firebase alternative.
Start your project with a Postgres database, Authentication, instant APIs, Edge Functions, Realtime subscriptions, Storage, and Vector embeddings.

## Use Supabase to see total spend across all LLM Providers (OpenAI, Azure, Anthropic, Cohere, Replicate, PaLM)
liteLLM provides `success_callback` and `failure_callback`, making it easy for you to send data to a particular provider depending on the status of your responses.

In this case, we want to log requests to Supabase in both scenarios: when they succeed and when they fail.
### Create a Supabase table

In your Supabase project, open the [Supabase SQL Editor](https://supabase.com/dashboard/projects) and create a new table with this configuration.

Note: You can change the table name. Just don't change the column names.
```sql
create table
  public.request_logs (
    id bigint generated by default as identity,
    created_at timestamp with time zone null default now(),
    model text null default ''::text,
    messages json null default '{}'::json,
    response json null default '{}'::json,
    end_user text null default ''::text,
    error json null default '{}'::json,
    response_time real null default '0'::real,
    total_cost real null,
    additional_details json null default '{}'::json,
    constraint request_logs_pkey primary key (id)
  ) tablespace pg_default;
```
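Once requests start flowing, a query along these lines in the SQL Editor surfaces total spend per model (a sketch against the schema above; swap in your table name if you changed it):

```sql
-- total spend and request count per model
select
  model,
  count(*) as requests,
  sum(total_cost) as total_spend
from public.request_logs
group by model
order by total_spend desc;
```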
### Use Callbacks

Use just 2 lines of code to instantly see costs and log your responses **across all providers** with Supabase:
```python
litellm.success_callback = ["supabase"]
litellm.failure_callback = ["supabase"]
```
Complete code:

```python
import os

import litellm
from litellm import completion

## set env variables
os.environ["SUPABASE_URL"] = "your-supabase-url"
os.environ["SUPABASE_KEY"] = "your-supabase-key"
os.environ["OPENAI_API_KEY"] = "your-openai-key"

# set callbacks
litellm.success_callback = ["supabase"]
litellm.failure_callback = ["supabase"]

# openai call
response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}])

# bad call - invalid model name, used to test error logging
response = completion(model="chatgpt-test", messages=[{"role": "user", "content": "Hi 👋 - i'm a bad call to test error logging"}])
```
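To confirm that rows are landing, you can query the table directly. A minimal sketch, assuming the `supabase-py` client is installed (`pip install supabase`) and the table is still named `request_logs`:

```python
import os

from supabase import create_client

# connect with the same credentials used above
supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_KEY"])

# fetch the five most recent logged requests
recent = (
    supabase.table("request_logs")
    .select("created_at, model, total_cost, error")
    .order("created_at", desc=True)
    .limit(5)
    .execute()
)
for row in recent.data:
    print(row["created_at"], row["model"], row["total_cost"])
```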
### Additional Controls

**Different Table name**

If you modified your table name, here's how to pass the new name:
```python
litellm.modify_integration("supabase", {"table_name": "litellm_logs"})
```

**Identify end-user**

Here's how to map your LLM call to an end-user:
```python
litellm.identify({"end_user": "krrish@berri.ai"})
```
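For example, calling `identify` before your requests fills the `end_user` column of every subsequently logged row (a sketch reusing the setup from the complete code above; the email address is a placeholder):

```python
import os

import litellm
from litellm import completion

os.environ["SUPABASE_URL"] = "your-supabase-url"
os.environ["SUPABASE_KEY"] = "your-supabase-key"
os.environ["OPENAI_API_KEY"] = "your-openai-key"

litellm.success_callback = ["supabase"]
litellm.failure_callback = ["supabase"]

# tag subsequent calls with a placeholder end user
litellm.identify({"end_user": "jane@example.com"})

# this request is logged with end_user = "jane@example.com"
response = completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hi 👋"}])
```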