Commit graph

1856 commits

Krrish Dholakia
9deb9b4e3f feat(guardrails): Flag for PII Masking on Logging
Fixes https://github.com/BerriAI/litellm/issues/4580
2024-07-11 16:09:34 -07:00
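The commit above doesn't show the configuration, but the idea behind a "PII masking on logging" flag can be sketched independently of litellm's actual implementation: redact PII only in the copy of the payload handed to the logging integration, leaving the live LLM request untouched. The function name and regex patterns below are illustrative assumptions, not litellm code.

```python
import copy
import re

# Illustrative only: mask PII in the payload sent to the logging tool,
# while the original request forwarded to the LLM stays untouched.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def mask_pii_for_logging(payload: dict) -> dict:
    """Return a deep copy of the request payload with obvious PII redacted."""
    logged = copy.deepcopy(payload)
    for message in logged.get("messages", []):
        content = message.get("content", "")
        if isinstance(content, str):
            content = EMAIL_RE.sub("<EMAIL>", content)
            content = PHONE_RE.sub("<PHONE>", content)
            message["content"] = content
    return logged

if __name__ == "__main__":
    request = {"messages": [{"role": "user", "content": "Email me at jane@example.com"}]}
    print(mask_pii_for_logging(request))  # logged copy is masked; `request` is unchanged
```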
Ishaan Jaff
28cfca87c1
Merge pull request #4647 from msabramo/msabramo/remove-unnecessary-imports
Remove unnecessary imports
2024-07-11 15:07:30 -07:00
Krrish Dholakia
070ab9f469 docs(model_management.md): update docs to clarify calling /model/info 2024-07-11 09:47:50 -07:00
Krish Dholakia
dacce3d78b
Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
Krrish Dholakia
31829855c0 feat(proxy_server.py): working /v1/messages with config.yaml
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
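For context, a minimal sketch of calling the proxy's `/v1/messages` endpoint once it is configured via config.yaml. The proxy URL, virtual key, and model name below are assumptions; the request body follows Anthropic's Messages API shape.

```python
import requests  # assumes the `requests` package is installed

# Assumed values: adjust the proxy URL, key, and model to your deployment.
PROXY_BASE = "http://localhost:4000"
API_KEY = "sk-1234"

resp = requests.post(
    f"{PROXY_BASE}/v1/messages",
    headers={"Authorization": f"Bearer {API_KEY}", "content-type": "application/json"},
    json={
        "model": "claude-3-opus-20240229",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello from the proxy"}],
    },
)
resp.raise_for_status()
print(resp.json())
```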
Krrish Dholakia
2f8dbbeb97 feat(proxy_server.py): working /v1/messages endpoint
Works with claude engineer
2024-07-10 18:15:38 -07:00
Marc Abramowitz
dd0c07d2a1 Move JSX stuff so first line of file is heading
This prevents VS Code from displaying a warning about the file not starting with
a heading.
2024-07-10 17:02:56 -07:00
Ishaan Jaff
265ec00d0f fix test routes on litellm proxy 2024-07-10 16:51:47 -07:00
Ishaan Jaff
a313174ecb
Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
Marc Abramowitz
3a2cb151aa Proxy: Add x-litellm-call-id response header
This header carries the value of `logging_obj.litellm_call_id`; one particular use is correlating the HTTP response for a request with its trace in an LLM logging tool such as Langfuse or Langsmith.

For example, if a user in my environment (w/ Langfuse) gets back this in the
response headers:

```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```

then they know that they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e

They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
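A short sketch of the correlation workflow described above: send a request through the proxy, read the `x-litellm-call-id` response header, and build the trace URL from it. The proxy URL, API key, model, and Langfuse hostname are placeholders.

```python
import requests  # assumes the `requests` package is installed

# Placeholder values for the proxy and Langfuse deployment.
PROXY_BASE = "http://localhost:4000"
LANGFUSE_HOST = "https://langfuse.domain.com"

resp = requests.post(
    f"{PROXY_BASE}/v1/chat/completions",
    headers={"Authorization": "Bearer sk-1234"},
    json={"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hi"}]},
)

# The header added by this commit; its value is logging_obj.litellm_call_id.
call_id = resp.headers.get("x-litellm-call-id")
print(f"Trace for this request: {LANGFUSE_HOST}/trace/{call_id}")
```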
Marc Abramowitz
2db9c23bce Remove unnecessary imports
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
393ce7df14 add /files endpoints 2024-07-10 14:55:10 -07:00
Krrish Dholakia
aace0b22a3 fix(proxy_server.py): fix proxy_server.py premium user check for encrypted license key 2024-07-10 12:25:31 -07:00
Ishaan Jaff
62f475919b feat - add DELETE assistants endpoint 2024-07-10 11:37:37 -07:00
Ishaan Jaff
22df67edb7 feat - add mgmt endpoint routes 2024-07-09 15:29:41 -07:00
Ishaan Jaff
6bce7e73a3
Merge pull request #4627 from BerriAI/litellm_fix_thread_auth
[Fix] Authentication on /thread endpoints on Proxy
2024-07-09 12:19:19 -07:00
Ishaan Jaff
12f207b499 feat - support /create assistants endpoint 2024-07-09 10:03:47 -07:00
Ishaan Jaff
6479442d5b feat - support acreate_assistants endpoint 2024-07-09 09:49:38 -07:00
Krrish Dholakia
3045a2d9b3 fix(proxy_server.py): add license protection for 'allowed_ip' address feature 2024-07-08 16:04:44 -07:00
Ishaan Jaff
25ddbbb41d raise budget_exceeded in user_api_key_auth 2024-07-08 12:45:39 -07:00
Krrish Dholakia
20e39d6acc fix(utils.py): cleanup 'additionalProperties=False' for tool calling with zod
Fixes an issue with zod passing in additionalProperties=False, causing Vertex AI / Gemini calls to fail
2024-07-06 17:27:37 -07:00
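The diff itself isn't shown here, but the underlying fix can be sketched as a schema cleanup step: recursively drop `additionalProperties` from the tool-call JSON schema that zod generates before the tool definition reaches Vertex AI / Gemini. `strip_additional_properties` is a hypothetical name, not necessarily litellm's.

```python
def strip_additional_properties(schema):
    """Recursively remove 'additionalProperties' from a JSON schema.

    Zod emits additionalProperties=False on object schemas, which some
    providers (e.g. Gemini on Vertex AI) reject, so drop it everywhere.
    """
    if isinstance(schema, dict):
        schema.pop("additionalProperties", None)
        for value in schema.values():
            strip_additional_properties(value)
    elif isinstance(schema, list):
        for item in schema:
            strip_additional_properties(item)
    return schema

tool_params = {
    "type": "object",
    "additionalProperties": False,
    "properties": {"city": {"type": "string"}},
}
print(strip_additional_properties(tool_params))  # no additionalProperties keys remain
```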
Ishaan Jaff
d61cc598b0
Merge pull request #4576 from BerriAI/litellm_encrypt_decrypt_using_salt
[Refactor] Use helper function to encrypt/decrypt model credentials
2024-07-06 15:11:09 -07:00
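The refactor above centralizes credential encryption/decryption behind a helper keyed off a salt. As an illustration only (not litellm's actual scheme), a symmetric helper built on the `cryptography` package might look like this:

```python
import base64
import os

# Illustration only; not litellm's implementation. Requires the `cryptography` package.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def _fernet_from_salt(master_secret: str, salt: bytes) -> Fernet:
    """Derive a symmetric key from a master secret plus a salt."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000)
    key = base64.urlsafe_b64encode(kdf.derive(master_secret.encode()))
    return Fernet(key)

def encrypt_value(value: str, master_secret: str, salt: bytes) -> str:
    return _fernet_from_salt(master_secret, salt).encrypt(value.encode()).decode()

def decrypt_value(token: str, master_secret: str, salt: bytes) -> str:
    return _fernet_from_salt(master_secret, salt).decrypt(token.encode()).decode()

salt = os.urandom(16)
token = encrypt_value("sk-my-azure-key", "MASTER_SALT_KEY", salt)
assert decrypt_value(token, "MASTER_SALT_KEY", salt) == "sk-my-azure-key"
```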
Krish Dholakia
127f08ee67
Merge branch 'main' into litellm_tts_pricing 2024-07-06 14:57:34 -07:00
Ishaan Jaff
7f6f85ec3c improve sign up flow - show missing env vars 2024-07-06 13:57:19 -07:00
Ishaan Jaff
a5d9a6cebd init litellm router 2024-07-06 13:14:04 -07:00
Ishaan Jaff
2e956f7c01 fix checking store_model_in_db 2024-07-06 11:35:28 -07:00
Krrish Dholakia
71edb28d6a fix(proxy_server.py): fix embedding model exception mapping 2024-07-06 11:14:41 -07:00
Ishaan Jaff
8f0b55879f move encrypt / decrypt to helper 2024-07-06 11:09:47 -07:00
Krrish Dholakia
d528b66db0 fix(proxy_server.py): fix callback check order 2024-07-05 14:06:33 -07:00
Krrish Dholakia
7618ec43b3 fix(proxy_server.py): support langfuse logging for rejected requests on /v1/chat/completions 2024-07-05 13:07:09 -07:00
Ishaan Jaff
c1c2f9faa4 fix backend invite link sign up 2024-07-04 15:55:04 -07:00
Ishaan Jaff
a457aec761 add doc string for v1/model/info 2024-07-04 13:21:02 -07:00
Ishaan Jaff
12343650e4 feat - allow looking up model_id in model info 2024-07-04 13:13:42 -07:00
Ishaan Jaff
129c2e0c4f init guardrails on proxy 2024-07-03 14:18:12 -07:00
Ishaan Jaff
174b2b69df
Merge pull request #4518 from BerriAI/litellm_fix_background_health_checks
[Fix-Proxy] Background health checks use deep copy of model list for _run_background_health_check
2024-07-02 16:42:34 -07:00
Krrish Dholakia
66c6992f8a fix(slack_alerting.py): use in-memory cache for checking request status 2024-07-02 13:01:59 -07:00
Ishaan Jaff
cd6b121642 use deep copy of router for _run_background_health_check 2024-07-02 11:29:24 -07:00
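The fix above boils down to one pattern: run background health checks against a deep copy of the router's model list, so the periodic task and live router updates can't mutate each other's state. A minimal sketch with hypothetical names:

```python
import asyncio
import copy

# Hypothetical sketch of the pattern: health-check a deep copy, not the live list.
async def _check_model(model: dict) -> bool:
    await asyncio.sleep(0)  # stand-in for a real health probe
    return bool(model.get("litellm_params"))

async def run_background_health_check(router, interval_seconds: int = 300):
    while True:
        # Deep copy so concurrent router updates don't race with the checks.
        model_list = copy.deepcopy(router.model_list)
        results = await asyncio.gather(*(_check_model(m) for m in model_list))
        healthy = [m["model_name"] for m, ok in zip(model_list, results) if ok]
        print(f"healthy deployments: {healthy}")
        await asyncio.sleep(interval_seconds)
```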
Ishaan Jaff
4910029eeb
Merge pull request #4479 from BerriAI/litellm_fix_admin_ui_login_issue
[Fix] Admin UI - fix error users were seeing when logging in (use correct `user_id` when creating key for admin ui)
2024-06-29 20:00:45 -07:00
Ishaan Jaff
37352ba692 fix - use correct user_id when creating key for admin ui 2024-06-29 14:36:58 -07:00
Ishaan Jaff
c7e89a571e fix error logs: store exception in DB 2024-06-29 12:15:14 -07:00
Ishaan Jaff
954c6ec9ed fix support pass through endpoints 2024-06-28 15:06:51 -07:00
Krish Dholakia
869275585a
Merge branch 'main' into litellm_response_cost_headers 2024-06-27 21:33:09 -07:00
Ishaan Jaff
0cd25c250d
Merge pull request #4447 from BerriAI/litellm_fix_error_on_v2_model_info
[fix] error message on /v2/model/info when no models exist
2024-06-27 16:33:19 -07:00
Ishaan Jaff
0c6cd2c894 fix error message on /v2/model/info 2024-06-27 16:29:11 -07:00
Ishaan Jaff
80960facfa fix secret redaction logic 2024-06-27 13:19:54 -07:00
Krrish Dholakia
f533e1da09 fix(utils.py): return 'response_cost' in completion call
Closes https://github.com/BerriAI/litellm/issues/4335
2024-06-26 17:55:57 -07:00
Ishaan Jaff
e22c9ba0cc init secret detection callback 2024-06-25 16:25:14 -07:00
Ishaan Jaff
b9f83f48ba
Merge pull request #4364 from BerriAI/litellm_use_tracemalloc_mem_usage
[Feat] add endpoint to debug memory util
2024-06-25 10:32:52 -07:00
Ishaan Jaff
dd048cf342
Merge pull request #4382 from BerriAI/litellm_ui_fix_login_bug
[Fix] - Admin UI login bug
2024-06-24 08:19:37 -07:00
Ishaan Jaff
acbcfef5af fix ui login bug 2024-06-24 07:54:08 -07:00