Marc Abramowitz
be4b7629b5
Proxy: Add x-litellm-call-id response header
...
This header carries the value of `logging_obj.litellm_call_id`. One particular use is correlating the HTTP response for a request with its trace in an LLM
logging tool like Langfuse, Langsmith, etc.
For example, if a user in my environment (w/ Langfuse) gets back this in the
response headers:
```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```
then they know that they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e
They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
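A minimal sketch of the correlation described in the commit above, assuming a LiteLLM proxy at `http://localhost:4000`, a placeholder API key, and a self-hosted Langfuse at `langfuse.domain.com`; the URLs and key are illustrative only.
```python
import requests

# Call the LiteLLM proxy (hypothetical local deployment and API key).
resp = requests.post(
    "http://localhost:4000/v1/chat/completions",
    headers={"Authorization": "Bearer sk-1234"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "hello"}],
    },
)

# The proxy returns the LiteLLM call id as a response header.
call_id = resp.headers["x-litellm-call-id"]

# Build the Langfuse trace URL for this request (domain is illustrative).
print(f"https://langfuse.domain.com/trace/{call_id}")
```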
Ishaan Jaff
9e71006d07
add /v1/files LIST
2024-07-10 16:04:07 -07:00
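A usage sketch for the files LIST route added above, assuming the standard OpenAI Python client pointed at the proxy; the base URL and key are placeholders.
```python
from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM proxy (placeholder URL/key).
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-1234")

# GET /v1/files -- list files known to the backing provider.
for f in client.files.list():
    print(f.id, f.filename)
```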
Marc Abramowitz
416bca4a3f
Remove unnecessary imports
...
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
0f39e09913
add "/v1/files/{file_id}" as openai route
2024-07-10 14:56:53 -07:00
Ishaan Jaff
f118123ae1
add /files endpoints
2024-07-10 14:55:10 -07:00
Krrish Dholakia
3f4f5ae994
fix(proxy_server.py): fix proxy_server.py premium user check for encrypted license key
2024-07-10 12:25:31 -07:00
Ishaan Jaff
3ae7dcccb5
add "/v1/assistants/{assistant_id}", as openai route
2024-07-10 11:42:02 -07:00
Ishaan Jaff
a9e15dad62
feat - add DELETE assistants endpoint
2024-07-10 11:37:37 -07:00
Krrish Dholakia
01a335b4c3
feat(anthropic_adapter.py): support for translating anthropic params to openai format
2024-07-10 00:32:28 -07:00
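A rough sketch of the kind of translation such an adapter performs; the field mapping below is illustrative only, not the actual `anthropic_adapter.py` implementation.
```python
def anthropic_to_openai(params: dict) -> dict:
    """Illustrative mapping of a few Anthropic /v1/messages params to OpenAI chat format."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as a leading "system" message.
    if "system" in params:
        messages.append({"role": "system", "content": params["system"]})
    messages.extend(params.get("messages", []))

    return {
        "model": params["model"],
        "messages": messages,
        "max_tokens": params.get("max_tokens"),
        "temperature": params.get("temperature"),
        "stream": params.get("stream", False),
    }

# Example:
# anthropic_to_openai({"model": "claude-3-opus-20240229", "max_tokens": 256,
#                      "system": "be brief", "messages": [{"role": "user", "content": "hi"}]})
```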
Krrish Dholakia
d66a48b3d1
style(litellm_license.py): add debug statement for litellm license
2024-07-09 22:43:33 -07:00
Ishaan Jaff
49f8894dcc
fix show exact prisma exception when starting proxy
2024-07-09 18:20:09 -07:00
Ishaan Jaff
eb43343643
ui new build
2024-07-09 16:33:00 -07:00
Ishaan Jaff
a784b54245
feat - add mgmt endpoint routes
2024-07-09 15:29:41 -07:00
Ishaan Jaff
41e7f96c80
ui - add Create, get, delete endpoints for IP Addresses
2024-07-09 15:12:08 -07:00
Krrish Dholakia
789d2dab15
fix(vertex_httpx.py): add sync vertex image gen support
...
Fixes https://github.com/BerriAI/litellm/issues/4623
2024-07-09 13:33:54 -07:00
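A usage sketch of synchronous Vertex AI image generation through litellm, relating to the fix above; it assumes Vertex credentials and project are already configured in the environment, and the model id is a placeholder that may differ in your project.
```python
import litellm

# Synchronous Vertex AI image generation; assumes Vertex credentials/project
# are already configured (e.g. via environment variables) and that the
# model id below matches one enabled in your GCP project.
response = litellm.image_generation(
    prompt="a watercolor painting of a lighthouse",
    model="vertex_ai/imagegeneration@006",
)
print(len(response.data))
```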
Ishaan Jaff
6000687601
Merge pull request #4627 from BerriAI/litellm_fix_thread_auth
...
[Fix] Authentication on /thread endpoints on Proxy
2024-07-09 12:19:19 -07:00
Ishaan Jaff
a7a6567da4
test /threads endpoint
2024-07-09 12:17:42 -07:00
Ishaan Jaff
6891b29444
fix - use helper to check if a route is openai route
2024-07-09 12:00:07 -07:00
Ishaan Jaff
c8a15ab83e
add helper to check is_openai_route
2024-07-09 11:50:12 -07:00
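A simplified sketch of what a helper like `is_openai_route` might check; the prefix list here is illustrative, not the proxy's actual route table.
```python
# Illustrative prefixes only -- the proxy's real list is larger.
OPENAI_ROUTE_PREFIXES = (
    "/v1/chat/completions",
    "/v1/completions",
    "/v1/embeddings",
    "/v1/files",
    "/v1/assistants",
    "/v1/threads",
)

def is_openai_route(route: str) -> bool:
    """Return True if the incoming path looks like an OpenAI-compatible route."""
    return route.startswith(OPENAI_ROUTE_PREFIXES)

assert is_openai_route("/v1/files/file-abc123")
assert not is_openai_route("/health")
```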
Ishaan Jaff
b6e9fe384c
fix add assistant settings on config
2024-07-09 10:05:32 -07:00
Ishaan Jaff
bce7b5f8c8
feat - support /create assistants endpoint
2024-07-09 10:03:47 -07:00
Ishaan Jaff
0f43869706
feat - support acreate_assistants endpoint
2024-07-09 09:49:38 -07:00
Krrish Dholakia
932814aa1c
docs(configs.md): add ip address filtering to docs
2024-07-08 21:59:26 -07:00
Krish Dholakia
4bd65aac40
Merge pull request #4615 from BerriAI/litellm_user_api_key_auth
...
Enable `allowed_ips` for proxy
2024-07-08 17:35:11 -07:00
Krrish Dholakia
aecbc98b9b
fix(proxy_cli.py): bump default azure api version
2024-07-08 16:28:22 -07:00
Ishaan Jaff
519a083e09
ui new build
2024-07-08 16:19:25 -07:00
Krrish Dholakia
0ecf94d32e
fix(proxy_server.py): add license protection for 'allowed_ip' address feature
2024-07-08 16:04:44 -07:00
Krrish Dholakia
f982e93d24
feat(user_api_key_auth.py): allow restricting calls by IP address
...
Allows admin to restrict which IP addresses can make calls to the proxy
2024-07-08 15:58:15 -07:00
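A minimal sketch of the kind of allow-list check this feature implies; the function name and the source of the allowed IPs are hypothetical, standing in for whatever the proxy config actually provides.
```python
from fastapi import HTTPException, Request

# Hypothetical allow-list, e.g. loaded from the proxy's general settings.
ALLOWED_IPS = {"127.0.0.1", "10.0.0.12"}

def check_client_ip(request: Request) -> None:
    """Reject requests whose client IP is not on the allow-list."""
    client_ip = request.client.host if request.client else None
    if client_ip not in ALLOWED_IPS:
        raise HTTPException(
            status_code=403,
            detail=f"Access forbidden: IP {client_ip} not allowed",
        )
```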
Ishaan Jaff
449dd4c78c
Merge pull request #4611 from BerriAI/litellm_fix_assistants_routes
...
[Proxy-Fix]: Add /assistants, /threads as OpenAI routes
2024-07-08 15:16:11 -07:00
Ishaan Jaff
f837e6e8d1
fix routes on assistants endpoints
2024-07-08 15:02:12 -07:00
Andres Guzman
e0ac480584
fix(utils.py): change update to upsert
2024-07-08 15:49:29 -06:00
Ishaan Jaff
170150e02e
use ProxyErrorTypes
2024-07-08 12:47:53 -07:00
Ishaan Jaff
4202be8e1f
raise budget_exceeded in user_api_key_auth
2024-07-08 12:45:39 -07:00
Ishaan Jaff
d3fc5d4a17
use types for ProxyErrorTypes
2024-07-08 12:42:27 -07:00
Ishaan Jaff
113f625469
Merge pull request #4603 from BerriAI/litellm_track_user_ip
...
[Enterprise-Feature: Proxy] Track user-ip address in requests & in LiteLLM_SpendLogs
2024-07-08 12:18:52 -07:00
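A sketch of the idea behind tracking the caller's IP alongside spend records; the field names below are illustrative, not the exact LiteLLM_SpendLogs schema.
```python
from datetime import datetime, timezone
from fastapi import Request

def build_spend_log_entry(request: Request, call_id: str, api_key_hash: str, spend: float) -> dict:
    """Illustrative spend-log payload recording the caller's IP per request."""
    return {
        "request_id": call_id,
        "api_key": api_key_hash,
        "spend": spend,
        # The point of the change above: capture the client IP with each record.
        "end_user_ip": request.client.host if request.client else None,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
```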
Ishaan Jaff
78839eb46e
SpendLogsPayload - track user ip
2024-07-08 10:16:58 -07:00
Krrish Dholakia
d68ab2a8bc
fix(whisper): handle openai/azure vtt response format. Fixes https://github.com/BerriAI/litellm/issues/4595
2024-07-08 09:10:40 -07:00
Ishaan Jaff
7bbf4047e7
track user_ip address per request
2024-07-08 09:00:08 -07:00
Ishaan Jaff
af4bdae066
fix - setting rpm/tpm
2024-07-08 07:43:00 -07:00
Krrish Dholakia
bcd7358daf
fix(presidio_pii_masking.py): fix presidio unset url check + add same check for langfuse
2024-07-06 17:50:55 -07:00
Krrish Dholakia
e424fea721
fix(presidio_pii_masking.py): add support for setting 'http://' if unset by render env for presidio base url
2024-07-06 17:42:10 -07:00
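A small sketch of the scheme-defaulting behavior described above; the helper name is hypothetical.
```python
def ensure_http_scheme(base_url: str) -> str:
    """Prepend 'http://' when an env-provided base URL (e.g. from Render) has no scheme."""
    if not base_url.startswith(("http://", "https://")):
        return f"http://{base_url}"
    return base_url

assert ensure_http_scheme("presidio-analyzer:3000") == "http://presidio-analyzer:3000"
assert ensure_http_scheme("https://presidio.internal") == "https://presidio.internal"
```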
Krrish Dholakia
1dae0a5b6a
fix(utils.py): cleanup 'additionalProperties=False' for tool calling with zod
...
Fixes issue with zod passing in additionalProperties=False, causing vertex ai / gemini calls to fail
2024-07-06 17:27:37 -07:00
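A sketch of the cleanup idea: recursively dropping `additionalProperties` from zod-generated tool schemas before they reach Vertex AI / Gemini. This is illustrative, not the actual utils.py code.
```python
def strip_additional_properties(schema):
    """Recursively remove 'additionalProperties' keys that Gemini/Vertex reject."""
    if isinstance(schema, dict):
        schema.pop("additionalProperties", None)
        for value in schema.values():
            strip_additional_properties(value)
    elif isinstance(schema, list):
        for item in schema:
            strip_additional_properties(item)
    return schema

# zod's JSON-schema output sets additionalProperties=False on every object:
tool_params = {
    "type": "object",
    "additionalProperties": False,
    "properties": {"city": {"type": "string"}},
}
print(strip_additional_properties(tool_params))  # no additionalProperties key left
```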
Ishaan Jaff
202f34fff5
fix trace hierarchy on otel
2024-07-06 15:37:23 -07:00
Ishaan Jaff
8f5ce1a3ad
ui new build
2024-07-06 15:26:26 -07:00
Ishaan Jaff
76bb824f59
Merge pull request #4578 from BerriAI/litellm_allow_querying_spend_report_by_key
...
[Feat-Enterprise] /spend/report view spend for a specific key
2024-07-06 15:11:46 -07:00
Ishaan Jaff
f96c0efd90
Merge pull request #4576 from BerriAI/litellm_encrypt_decrypt_using_salt
...
[Refactor] Use helper function to encrypt/decrypt model credentials
2024-07-06 15:11:09 -07:00
Krish Dholakia
ece24015cc
Merge branch 'main' into litellm_tts_pricing
2024-07-06 14:57:34 -07:00
Krish Dholakia
5640ed4c8c
Merge branch 'main' into litellm_proxy_tts_pricing
2024-07-06 14:56:16 -07:00
Krrish Dholakia
f89632f5ac
fix(main.py): fix stream_chunk_builder usage calc
...
Closes https://github.com/BerriAI/litellm/issues/4496
2024-07-06 14:52:59 -07:00
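A usage sketch of rebuilding a full response, including usage, from streamed chunks with `litellm.stream_chunk_builder`; the model name and key setup are placeholders.
```python
import litellm

messages = [{"role": "user", "content": "Say hi in one word."}]

# Stream a completion and collect the chunks (model/credentials are placeholders).
chunks = []
for chunk in litellm.completion(model="gpt-3.5-turbo", messages=messages, stream=True):
    chunks.append(chunk)

# Rebuild a single response object from the chunks; the fix above concerns
# how usage (token counts) is computed during this reconstruction.
rebuilt = litellm.stream_chunk_builder(chunks, messages=messages)
print(rebuilt.usage)
```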
Ishaan Jaff
37108765eb
get spend per internal user / api_key
2024-07-06 14:45:58 -07:00