Commit graph

14594 commits

Author SHA1 Message Date
Ishaan Jaff
265ec00d0f fix test routes on litellm proxy 2024-07-10 16:51:47 -07:00
Ishaan Jaff
0cb4dabaf0
Merge pull request #4642 from BerriAI/litellm_safe_access_slack
[fix] slack alerting reports - add validation for safe access into attributes
2024-07-10 16:46:08 -07:00
Ishaan Jaff
a313174ecb
Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
Ishaan Jaff
f34e45db93 test - /file endpoints 2024-07-10 16:36:27 -07:00
Marc Abramowitz
982603714e Add docs 2024-07-10 16:32:05 -07:00
Ishaan Jaff
a5f12aba81 add file content 2024-07-10 16:15:43 -07:00
Ishaan Jaff
cb300d30a9 add file delete path 2024-07-10 16:08:58 -07:00
Marc Abramowitz
3a2cb151aa Proxy: Add x-litellm-call-id response header
This returns the value of `logging_obj.litellm_call_id`. One particular use is
to correlate the HTTP response for a request with a trace in an LLM
logging tool like Langfuse, Langsmith, etc.

For example, if a user in my environment (w/ Langfuse) gets back this in the
response headers:

```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```

then they know that they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e

They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
Ishaan Jaff
0b39ec6a8e add /v1/files LIST 2024-07-10 16:04:07 -07:00
Ishaan Jaff
a741586519 test openai files endpoints 2024-07-10 15:54:55 -07:00
Ishaan Jaff
ef3bd2df22 support list files on litellm SDK 2024-07-10 15:51:28 -07:00
Ishaan Jaff
f18754b6ed test - delete file 2024-07-10 15:42:15 -07:00
Ishaan Jaff
fc2b2fbe49 test - deleting a file 2024-07-10 15:41:32 -07:00
Ishaan Jaff
a542e7be61 add all openai file endpoints 2024-07-10 15:35:21 -07:00
Marc Abramowitz
2db9c23bce Remove unnecessary imports
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
5187569e11 test retrieve file 2024-07-10 15:00:27 -07:00
Ishaan Jaff
efca9daf5d add "/v1/files/{file_id}" as openai route 2024-07-10 14:56:53 -07:00
Ishaan Jaff
393ce7df14 add /files endpoints 2024-07-10 14:55:10 -07:00
Ishaan Jaff
99fd388943 add retrieve file to litellm SDK 2024-07-10 14:51:48 -07:00
Krrish Dholakia
aace0b22a3 fix(proxy_server.py): fix proxy_server.py premium user check for encrypted license key 2024-07-10 12:25:31 -07:00
Krrish Dholakia
d5d782f844 build(requirements.txt): bump openai version
Fixes https://github.com/BerriAI/litellm/issues/4639
2024-07-10 11:52:29 -07:00
Ishaan Jaff
e4dbd5abd4
Merge pull request #4645 from BerriAI/litellm_add_assistants_delete_endpoint
[Feat-Proxy] Add DELETE /assistants
2024-07-10 11:45:37 -07:00
Ishaan Jaff
09fe40791e add "/v1/assistants/{assistant_id}", as openai route 2024-07-10 11:42:02 -07:00
Ishaan Jaff
62f475919b feat - add DELETE assistants endpoint 2024-07-10 11:37:37 -07:00
Ishaan Jaff
7e82d98299 test assistants endpoint 2024-07-10 11:15:28 -07:00
Ishaan Jaff
5587dbbd32 add async assistants delete support 2024-07-10 11:14:40 -07:00
Ishaan Jaff
3480382495 test - delete assistants 2024-07-10 10:35:30 -07:00
Ishaan Jaff
5bf430f201 add delete assistant SDK 2024-07-10 10:33:00 -07:00
Ishaan Jaff
e20b540dac add validation on slack 2024-07-10 10:10:32 -07:00
Fabian Reinold
3de1743cd8
Change prisma configuration to encapsulate all binaries inside the application directory, with no connection to the active user 2024-07-10 17:27:01 +02:00
Krrish Dholakia
e201b06d9c docs(prod.md): update redis url doc for best prod practices 2024-07-10 08:03:50 -07:00
Fabian Reinold
bac795218d
Add prisma binary_cache_dir specification to pyproject.toml 2024-07-10 16:26:42 +02:00
Krrish Dholakia
5d6e172d5c feat(anthropic_adapter.py): support for translating anthropic params to openai format 2024-07-10 00:32:28 -07:00
Krrish Dholakia
d077148135 style(litellm_license.py): add debug statement for litellm license 2024-07-09 22:43:33 -07:00
Ishaan Jaff
78e67f36e7 fix: better debugging for bedrock credentials 2024-07-09 22:02:17 -07:00
Ishaan Jaff
de13d06ce6 docs - show how to use with azure openai 2024-07-09 18:33:22 -07:00
Ishaan Jaff
3a06e2e425 fix show exact prisma exception when starting proxy 2024-07-09 18:20:09 -07:00
Ishaan Jaff
5e12364fad update docker compose to show how to pass a config.yaml 2024-07-09 17:59:02 -07:00
Ishaan Jaff
f10f589ddd bump: version 1.41.14 → 1.41.15 2024-07-09 16:33:17 -07:00
Ishaan Jaff
a250e99934 ui new build 2024-07-09 16:33:00 -07:00
Ishaan Jaff
59666f3d92
Merge pull request #4628 from BerriAI/dependabot/pip/zipp-3.19.1
build(deps): bump zipp from 3.18.2 to 3.19.1
2024-07-09 16:28:47 -07:00
Ishaan Jaff
8d0eddf87b
Merge pull request #4632 from BerriAI/litellm_set_ip_address_on_ui
ui - allow setting allowed ip addresses
2024-07-09 16:28:14 -07:00
Ishaan Jaff
8195a4eacc check if premium user for sso / allowed ip 2024-07-09 16:25:23 -07:00
Ishaan Jaff
f7002ecd08 fixes when no ip addresses enabled 2024-07-09 16:15:05 -07:00
Ishaan Jaff
e966d9fd0f fix text hierarchy 2024-07-09 16:04:03 -07:00
Ishaan Jaff
b04c4da12e fix allowed ip screen 2024-07-09 15:57:53 -07:00
Ishaan Jaff
f3dddd234d ui - get, set, delete allowed ip addresses 2024-07-09 15:43:44 -07:00
Ishaan Jaff
22df67edb7 feat - add mgmt endpoint routes 2024-07-09 15:29:41 -07:00
Ishaan Jaff
362c01c21f ui - add Create, get, delete endpoints for IP Addresses 2024-07-09 15:12:08 -07:00
Ishaan Jaff
2c338296c1 ui - allow setting allowed ip 2024-07-09 14:46:46 -07:00