Commit graph

1928 commits

Author SHA1 Message Date
Marc Abramowitz
bc00e4c9ff Allow not displaying feedback box
by setting an env var called `LITELLM_DONT_SHOW_FEEDBACK_BOX` to `"true"`.

I liked the feedback box when I first started using LiteLLM, because it showed
me that the authors care about customers. But now that I've seen it a bunch of
times, I don't need to see it every time I start the server and I'd rather have
less output on startup.
2024-07-24 16:50:10 -07:00
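The commit above only names the environment variable; as a minimal sketch (assuming the standard `litellm --config` launch command and an illustrative config path), the flag could be set like this before starting the proxy:

```
import os
import subprocess

# Copy the current environment and add the opt-out flag from the commit above.
# Note the value is the string "true", not a boolean.
env = dict(os.environ, LITELLM_DONT_SHOW_FEEDBACK_BOX="true")

# Start the LiteLLM proxy without the startup feedback box.
# "config.yaml" is a placeholder path for illustration.
subprocess.run(["litellm", "--config", "config.yaml"], env=env, check=True)
```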
Ishaan Jaff
d3d4124357 fix: add better debugging for `_PROXY_track_cost_callback` 2024-07-23 15:25:46 -07:00
Krrish Dholakia
271407400a fix(utils.py): support raw response headers for streaming requests 2024-07-23 11:58:58 -07:00
Ishaan Jaff
fed7ad44ae feat - set alert_to_webhook_url 2024-07-23 10:08:21 -07:00
Ishaan Jaff
f1e0919446 feat - add endpoint to set team callbacks 2024-07-22 18:18:09 -07:00
Ishaan Jaff
75e48c84f4 fix: update spend logs 2024-07-19 12:49:23 -07:00
Ishaan Jaff
dbd4e7ac56 fix ui - make default session 24 hours 2024-07-19 10:17:45 -07:00
Ishaan Jaff
372c83a148 fix ui: make UI session last 24 hours 2024-07-18 18:22:40 -07:00
Ishaan Jaff
dfb5e816a6 Merge branch 'main' into litellm_run_moderation_check_on_embedding 2024-07-18 12:44:30 -07:00
Florian Greinacher
d83a22b392 feat(proxy): support hiding health check details 2024-07-18 17:21:12 +02:00
Ishaan Jaff
ebcd6ec232 fix run moderation check on embedding 2024-07-17 17:59:20 -07:00
Ishaan Jaff
7944450074 Merge pull request #4724 from BerriAI/litellm_Set_max_file_size_transc
[Feat] - set max file size on /audio/transcriptions
2024-07-15 20:42:24 -07:00
Ishaan Jaff
c755522724 Merge pull request #4719 from BerriAI/litellm_fix_audio_transcript
[Fix] /audio/transcription - don't write to the local file system
2024-07-15 20:05:42 -07:00
Ishaan Jaff
57f0633b99 use `check_file_size_under_limit` helper to check file size 2024-07-15 19:40:05 -07:00
Krrish Dholakia
cdfe1e6847 fix(utils.py): update get_model_info docstring
Fixes https://github.com/BerriAI/litellm/issues/4711
2024-07-15 18:18:50 -07:00
Ishaan Jaff
0b43d37f76 fix - don't write file.filename 2024-07-15 14:56:01 -07:00
Krrish Dholakia
e0c2940f22 fix(proxy_server.py): fix returning response headers on exception 2024-07-13 19:11:30 -07:00
Krrish Dholakia
17635450cd feat(proxy_server.py): return 'retry-after' param for rate limited requests
Closes https://github.com/BerriAI/litellm/issues/4695
2024-07-13 17:15:20 -07:00
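Since the commit above only says the proxy returns a `retry-after` value for rate limited requests, here is a hedged client-side sketch of honoring it; the proxy URL, key, model, and payload are placeholders, not taken from the commit:

```
import time
import requests

# Placeholder proxy endpoint and key, for illustration only.
url = "http://localhost:4000/chat/completions"
headers = {"Authorization": "Bearer sk-1234"}
payload = {"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hi"}]}

resp = requests.post(url, json=payload, headers=headers)
if resp.status_code == 429:
    # Wait for the advertised retry window before retrying once.
    wait_seconds = int(resp.headers.get("retry-after", "1"))
    time.sleep(wait_seconds)
    resp = requests.post(url, json=payload, headers=headers)
```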
Krrish Dholakia
d4ba87840e fix(proxy_server.py): fix linting errors 2024-07-11 22:12:33 -07:00
Krish Dholakia
35a17b7d99 Merge pull request #4669 from BerriAI/litellm_logging_only_masking
Flag for PII masking on Logging only
2024-07-11 22:03:37 -07:00
Krish Dholakia
8a4c428a7c Merge branch 'main' into litellm_call_id_in_response 2024-07-11 21:54:49 -07:00
Krish Dholakia
12e0f50812 Merge pull request #4651 from msabramo/docs-logging-cleanup
Docs: Miscellaneous cleanup of `docs/my-website/docs/proxy/logging.md`
2024-07-11 21:52:20 -07:00
Krrish Dholakia
abd682323c feat(guardrails): Flag for PII Masking on Logging
Fixes https://github.com/BerriAI/litellm/issues/4580
2024-07-11 16:09:34 -07:00
Ishaan Jaff
92228d9104 Merge pull request #4647 from msabramo/msabramo/remove-unnecessary-imports
Remove unnecessary imports
2024-07-11 15:07:30 -07:00
Krrish Dholakia
7b38278e69 docs(model_management.md): update docs to clarify calling /model/info 2024-07-11 09:47:50 -07:00
Krish Dholakia
f4d140efec Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
Krrish Dholakia
48be4ce805 feat(proxy_server.py): working /v1/messages with config.yaml
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
Krrish Dholakia
4ba30abb63 feat(proxy_server.py): working /v1/messages endpoint
Works with Claude Engineer
2024-07-10 18:15:38 -07:00
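The two commits above say the proxy exposes a working `/v1/messages` endpoint; a rough sketch of calling it might look like the following, where the base URL, key, and model name are assumptions and the body follows the public Anthropic Messages request shape:

```
import requests

# Placeholder proxy address and key; only the /v1/messages path comes from the commits above.
resp = requests.post(
    "http://localhost:4000/v1/messages",
    headers={"Authorization": "Bearer sk-1234"},
    json={
        "model": "claude-3-haiku-20240307",  # illustrative model name
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello from the Anthropic-style endpoint"}],
    },
)
print(resp.json())
```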
Marc Abramowitz
2ce0edcca9 Move JSX stuff so first line of file is heading
This prevents VS Code from displaying a warning about the file not starting with
a heading.
2024-07-10 17:02:56 -07:00
Ishaan Jaff
cc4434192f fix test routes on litellm proxy 2024-07-10 16:51:47 -07:00
Ishaan Jaff
4675983f42 Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
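Assuming the new `/files` routes mirror the OpenAI Files API (an assumption, not something the PR title states), the LIST, GET, and DELETE operations could be exercised with the OpenAI SDK pointed at the proxy; the base URL and key below are placeholders:

```
from openai import OpenAI

# Point the OpenAI client at the proxy; URL and key are placeholders.
client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

files = client.files.list()               # LIST  /files
for f in files.data:
    info = client.files.retrieve(f.id)    # GET   /files/{file_id}
    print(info.id, info.filename)
    # client.files.delete(f.id)           # DELETE /files/{file_id}, left commented out
```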
Marc Abramowitz
be4b7629b5 Proxy: Add x-litellm-call-id response header
This exposes the value of `logging_obj.litellm_call_id`; one particular use is to
correlate the HTTP response for a request with a trace in an LLM logging tool
such as Langfuse or Langsmith.

For example, if a user in my environment (with Langfuse) gets back this in the
response headers:

```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```

then they know they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e

They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
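As a short sketch of the correlation workflow described above (the proxy URL and key are placeholders; `langfuse.domain.com` is the example host from the commit message), a client could capture the header and build the trace URL directly:

```
import requests

# Placeholder proxy endpoint and key.
resp = requests.post(
    "http://localhost:4000/chat/completions",
    headers={"Authorization": "Bearer sk-1234"},
    json={"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hi"}]},
)

# Read the call ID the proxy adds to the response headers and build the
# matching Langfuse trace URL for this request.
call_id = resp.headers.get("x-litellm-call-id")
print(f"https://langfuse.domain.com/trace/{call_id}")
```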
Marc Abramowitz
416bca4a3f Remove unnecessary imports
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
f118123ae1 add /files endpoints 2024-07-10 14:55:10 -07:00
Krrish Dholakia
3f4f5ae994 fix(proxy_server.py): fix premium user check for encrypted license key 2024-07-10 12:25:31 -07:00
Ishaan Jaff
a9e15dad62 feat - add DELETE assistants endpoint 2024-07-10 11:37:37 -07:00
Ishaan Jaff
a784b54245 feat - add mgtm endpoint routes 2024-07-09 15:29:41 -07:00
Ishaan Jaff
6000687601 Merge pull request #4627 from BerriAI/litellm_fix_thread_auth
[Fix] Authentication on /thread endpoints on Proxy
2024-07-09 12:19:19 -07:00
Ishaan Jaff
bce7b5f8c8 feat - support /create assistants endpoint 2024-07-09 10:03:47 -07:00
Ishaan Jaff
0f43869706 feat - support acreate_assistants endpoint 2024-07-09 09:49:38 -07:00
Krrish Dholakia
0ecf94d32e fix(proxy_server.py): add license protection for 'allowed_ip' address feature 2024-07-08 16:04:44 -07:00
Ishaan Jaff
4202be8e1f raise budget_exceeded in user_api_key_auth 2024-07-08 12:45:39 -07:00
Krrish Dholakia
1dae0a5b6a fix(utils.py): cleanup 'additionalProperties=False' for tool calling with zod
Fixes issue with zod passing in additionalProperties=False, causing Vertex AI / Gemini calls to fail
2024-07-06 17:27:37 -07:00
Ishaan Jaff
f96c0efd90 Merge pull request #4576 from BerriAI/litellm_encrypt_decrypt_using_salt
[Refactor] Use helper function to encrypt/decrypt model credentials
2024-07-06 15:11:09 -07:00
Krish Dholakia
ece24015cc Merge branch 'main' into litellm_tts_pricing 2024-07-06 14:57:34 -07:00
Ishaan Jaff
752fe3ac7c improve sign up flow - show missing env vars 2024-07-06 13:57:19 -07:00
Ishaan Jaff
3f3594f106 init litellm router 2024-07-06 13:14:04 -07:00
Ishaan Jaff
9d7b03d31c fix checking store_model_in_db 2024-07-06 11:35:28 -07:00
Krrish Dholakia
47ce6ccac0 fix(proxy_server.py): fix embedding model exception mapping 2024-07-06 11:14:41 -07:00
Ishaan Jaff
561a30dd59 move encrypt / decrypt to helper 2024-07-06 11:09:47 -07:00