Commit graph

1904 commits

Author SHA1 Message Date
Krrish Dholakia
7b38278e69 docs(model_management.md): update docs to clarify calling /model/info 2024-07-11 09:47:50 -07:00
Krish Dholakia
f4d140efec Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
Krrish Dholakia
48be4ce805 feat(proxy_server.py): working /v1/messages with config.yaml
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
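The adapter idea behind the `/v1/messages` work above can be sketched as a translation from an Anthropic-style payload into the OpenAI chat-completions shape the router already understands. This is a hypothetical illustration, not litellm's actual adapter code; the field names mirror the public Anthropic Messages API.

```python
# Hypothetical sketch: map an Anthropic /v1/messages payload onto the
# OpenAI chat-completions request shape. Not litellm's real implementation.

def anthropic_to_openai(payload: dict) -> dict:
    messages = []
    # Anthropic carries the system prompt as a top-level field, not a message.
    if "system" in payload:
        messages.append({"role": "system", "content": payload["system"]})
    messages.extend(payload["messages"])
    return {
        "model": payload["model"],
        "messages": messages,
        "max_tokens": payload.get("max_tokens"),
    }

req = {
    "model": "claude-3-opus",
    "system": "You are terse.",
    "max_tokens": 100,
    "messages": [{"role": "user", "content": "hi"}],
}
print(anthropic_to_openai(req))
```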
Krrish Dholakia
4ba30abb63 feat(proxy_server.py): working /v1/messages endpoint
Works with claude engineer
2024-07-10 18:15:38 -07:00
Marc Abramowitz
2ce0edcca9 Move JSX stuff so first line of file is heading
This prevents VS Code from displaying a warning about the file not starting with
a heading.
2024-07-10 17:02:56 -07:00
Ishaan Jaff
cc4434192f fix test routes on litellm proxy 2024-07-10 16:51:47 -07:00
Ishaan Jaff
4675983f42 Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
Marc Abramowitz
be4b7629b5 Proxy: Add x-litellm-call-id response header
This exposes the value of `logging_obj.litellm_call_id`; one particular use is to correlate the HTTP response for a request with a trace in an LLM
logging tool like Langfuse, Langsmith, etc.

For example, if a user in my environment (with Langfuse) gets back this in the
response headers:

```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```

then they know that they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e

They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
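The correlation described above can be sketched as a small client-side helper: read the header off a response and build the trace URL. The function name and the Langfuse base URL are placeholders for illustration; only the `x-litellm-call-id` header name comes from the commit.

```python
# Hypothetical helper: turn a proxy response's headers into a Langfuse
# trace URL. `base_url` is your own Langfuse deployment.

def langfuse_trace_url(response_headers: dict, base_url: str):
    """Return the Langfuse trace URL for a response, or None if the header is absent."""
    call_id = response_headers.get("x-litellm-call-id")
    if call_id is None:
        return None
    return f"{base_url}/trace/{call_id}"

headers = {"x-litellm-call-id": "ffcb49e7-bd6e-4e56-9c08-a7243802b26e"}
print(langfuse_trace_url(headers, "https://langfuse.domain.com"))
```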
Marc Abramowitz
416bca4a3f Remove unnecessary imports
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
f118123ae1 add /files endpoints 2024-07-10 14:55:10 -07:00
Krrish Dholakia
3f4f5ae994 fix(proxy_server.py): fix proxy_server.py premium user check for encrypted license key 2024-07-10 12:25:31 -07:00
Ishaan Jaff
a9e15dad62 feat - add DELETE assistants endpoint 2024-07-10 11:37:37 -07:00
Ishaan Jaff
a784b54245 feat - add mgmt endpoint routes 2024-07-09 15:29:41 -07:00
Ishaan Jaff
6000687601 Merge pull request #4627 from BerriAI/litellm_fix_thread_auth
[Fix] Authentication on /thread endpoints on Proxy
2024-07-09 12:19:19 -07:00
Ishaan Jaff
bce7b5f8c8 feat - support /create assistants endpoint 2024-07-09 10:03:47 -07:00
Ishaan Jaff
0f43869706 feat - support acreate_assistants endpoint 2024-07-09 09:49:38 -07:00
Krrish Dholakia
0ecf94d32e fix(proxy_server.py): add license protection for 'allowed_ip' address feature 2024-07-08 16:04:44 -07:00
Ishaan Jaff
4202be8e1f raise budget_exceeded in user_api_key_auth 2024-07-08 12:45:39 -07:00
Krrish Dholakia
1dae0a5b6a fix(utils.py): cleanup 'additionalProperties=False' for tool calling with zod
Fixes an issue where zod passes in additionalProperties=False, causing Vertex AI / Gemini calls to fail
2024-07-06 17:27:37 -07:00
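The cleanup above can be sketched as a recursive strip of the `additionalProperties` key from a tool's JSON schema before it reaches Vertex AI / Gemini, which reject it. The helper name is hypothetical; only the problem (zod emitting `additionalProperties=False`) comes from the commit.

```python
# Hedged sketch: recursively drop `additionalProperties` from a JSON schema,
# since zod emits it by default and Gemini / Vertex AI reject it.

def strip_additional_properties(schema):
    if isinstance(schema, dict):
        return {
            k: strip_additional_properties(v)
            for k, v in schema.items()
            if k != "additionalProperties"
        }
    if isinstance(schema, list):
        return [strip_additional_properties(item) for item in schema]
    return schema

tool_schema = {
    "type": "object",
    "additionalProperties": False,
    "properties": {"city": {"type": "string"}},
}
print(strip_additional_properties(tool_schema))
```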
Ishaan Jaff
f96c0efd90 Merge pull request #4576 from BerriAI/litellm_encrypt_decrypt_using_salt
[Refactor] Use helper function to encrypt/decrypt model credentials
2024-07-06 15:11:09 -07:00
Krish Dholakia
ece24015cc Merge branch 'main' into litellm_tts_pricing 2024-07-06 14:57:34 -07:00
Ishaan Jaff
752fe3ac7c improve sign up flow - show missing env vars 2024-07-06 13:57:19 -07:00
Ishaan Jaff
3f3594f106 init litellm router 2024-07-06 13:14:04 -07:00
Ishaan Jaff
9d7b03d31c fix checking store_model_in_db 2024-07-06 11:35:28 -07:00
Krrish Dholakia
47ce6ccac0 fix(proxy_server.py): fix embedding model exception mapping 2024-07-06 11:14:41 -07:00
Ishaan Jaff
561a30dd59 move encrypt / decrypt to helper 2024-07-06 11:09:47 -07:00
Krrish Dholakia
9f039a9776 fix(proxy_server.py): fix callback check order 2024-07-05 14:06:33 -07:00
Krrish Dholakia
56410cfcd0 fix(proxy_server.py): support langfuse logging for rejected requests on /v1/chat/completions 2024-07-05 13:07:09 -07:00
Ishaan Jaff
bf674b7a9a fix backend invite-link sign up 2024-07-04 15:55:04 -07:00
Ishaan Jaff
53412d4dde add doc string for v1/model/info 2024-07-04 13:21:02 -07:00
Ishaan Jaff
4af5531834 feat - allow looking up model_id in model info 2024-07-04 13:13:42 -07:00
Ishaan Jaff
976ce2b286 init guardrails on proxy 2024-07-03 14:18:12 -07:00
Ishaan Jaff
424825be48 Merge pull request #4518 from BerriAI/litellm_fix_background_health_checks
[Fix-Proxy] Background health checks use deep copy of model list for _run_background_health_check
2024-07-02 16:42:34 -07:00
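The reason the background health check takes a deep copy, per the fix above, is that it runs concurrently with the live router: mutating or iterating the shared model list during a check would race with real traffic. A minimal sketch, with hypothetical names (not litellm's actual internals):

```python
# Illustrative sketch: the health check should work on a deep copy so its
# mutations never touch the live router's model list.
import copy

def snapshot_model_list(model_list):
    """Return a deep copy for the health check to use safely."""
    return copy.deepcopy(model_list)

live = [{"model_name": "gpt-4", "litellm_params": {"rpm": 10}}]
snap = snapshot_model_list(live)
snap[0]["litellm_params"]["rpm"] = 0  # mutation stays local to the check
print(live[0]["litellm_params"]["rpm"])  # live router state is unchanged: 10
```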
Krrish Dholakia
d09a78d7fd fix(slack_alerting.py): use in-memory cache for checking request status 2024-07-02 13:01:59 -07:00
Ishaan Jaff
46eb95da23 use deep copy of router for _run_background_health_check 2024-07-02 11:29:24 -07:00
Ishaan Jaff
51e02464a2 Merge pull request #4479 from BerriAI/litellm_fix_admin_ui_login_issue
[Fix] Admin UI - fix errors users were seeing when logging in (use correct `user_id` when creating key for admin ui)
2024-06-29 20:00:45 -07:00
Ishaan Jaff
f5177e276a fix - use correct user_id when creating key for admin ui 2024-06-29 14:36:58 -07:00
Ishaan Jaff
6396c30a7b fix error logs: store exception in DB 2024-06-29 12:15:14 -07:00
Ishaan Jaff
cf9636cc59 fix: support pass-through endpoints 2024-06-28 15:06:51 -07:00
Krish Dholakia
fa1cb7d903 Merge branch 'main' into litellm_response_cost_headers 2024-06-27 21:33:09 -07:00
Ishaan Jaff
6fdbe82bfb Merge pull request #4447 from BerriAI/litellm_fix_error_on_v2_model_info
[fix] error message on /v2/model/info when no models exist
2024-06-27 16:33:19 -07:00
Ishaan Jaff
096ec0aade fix error message on /v2/model/info 2024-06-27 16:29:11 -07:00
Ishaan Jaff
fde80eb117 fix secret redaction logic 2024-06-27 13:19:54 -07:00
Krrish Dholakia
be5d918211 fix(utils.py): return 'response_cost' in completion call
Closes https://github.com/BerriAI/litellm/issues/4335
2024-06-26 17:55:57 -07:00
Ishaan Jaff
14220712be init secret detection callback 2024-06-25 16:25:14 -07:00
Ishaan Jaff
3bce8a2395 Merge pull request #4364 from BerriAI/litellm_use_tracemalloc_mem_usage
[Feat]  add endpoint to debug memory util
2024-06-25 10:32:52 -07:00
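A memory-debug endpoint like the one described above can be built on the standard library's `tracemalloc`. This is a hedged sketch; the function name and output shape are illustrative, not litellm's actual endpoint.

```python
# Hypothetical sketch of a tracemalloc-based memory debug helper.
import tracemalloc

def top_memory_allocations(limit: int = 5):
    """Return the top allocation sites, formatted as strings."""
    if not tracemalloc.is_tracing():
        tracemalloc.start()
    snapshot = tracemalloc.take_snapshot()
    # Group allocations by source line and keep the heaviest `limit` entries.
    return [str(stat) for stat in snapshot.statistics("lineno")[:limit]]

_data = [b"x" * 1_000 for _ in range(100)]  # allocate something visible
for line in top_memory_allocations():
    print(line)
```

An HTTP debug route would just return this list as JSON.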
Ishaan Jaff
26790352db Merge pull request #4382 from BerriAI/litellm_ui_fix_login_bug
[Fix] - Admin UI login bug
2024-06-24 08:19:37 -07:00
Ishaan Jaff
f59804d04b fix ui login bug 2024-06-24 07:54:08 -07:00
Krrish Dholakia
4055381cdb fix: add exception mapping + langfuse exception logging for streaming exceptions
Fixes https://github.com/BerriAI/litellm/issues/4338
2024-06-22 21:26:15 -07:00
Krish Dholakia
63d0defa6d Merge branch 'main' into litellm_dynamic_tpm_limits 2024-06-22 19:14:59 -07:00