Ishaan Jaff
805d04f7f3
feat check check_response_size_is_safe
2024-07-27 16:53:39 -07:00
Ishaan Jaff
e3a66f2c62
feat - clearly show version litellm enterprise
2024-07-27 09:50:03 -07:00
Krish Dholakia
fb80839e8c
Merge pull request #4907 from BerriAI/litellm_proxy_get_secret
...
fix(proxy_server.py): fix get secret for environment_variables
2024-07-26 22:17:11 -07:00
Ishaan Jaff
56cf8e2798
feat link to model cost map on swagger
2024-07-26 21:34:42 -07:00
Ishaan Jaff
864f803ccf
fix for GET /v1/batches{batch_id:path}
2024-07-26 18:23:15 -07:00
Ishaan Jaff
2b889b83b3
fix /v1/batches POST
2024-07-26 18:06:00 -07:00
Krrish Dholakia
1a172b7636
fix(proxy_server.py): fix get secret for environment_variables
2024-07-26 13:33:02 -07:00
Krrish Dholakia
9d87767639
feat(proxy_server.py): handle pydantic mockselvar error
...
Fixes https://github.com/BerriAI/litellm/issues/4898#issuecomment-2252105485
2024-07-26 08:38:51 -07:00
Ishaan Jaff
1103c614a0
Merge branch 'main' into litellm_proxy_support_all_providers
2024-07-25 20:15:37 -07:00
Ishaan Jaff
d589d8e4ac
fix using pass_through_all_models
2024-07-25 19:32:49 -07:00
Ishaan Jaff
a46c463dee
router support setting pass_through_all_models
2024-07-25 18:34:12 -07:00
Krrish Dholakia
84ef8c11ff
feat(proxy_server.py): support custom llm handler on proxy
2024-07-25 17:56:34 -07:00
Krrish Dholakia
85dd423de6
fix(proxy_server.py): check if input list > 0 before indexing into it
...
resolves 'list index out of range' error
2024-07-25 14:23:07 -07:00
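The guard this commit describes (check the list length before indexing into it) can be sketched generically; the function below is illustrative only, not the actual proxy code:

```python
def first_or_none(items):
    """Return the first element of a list, or None when it is empty.

    Indexing items[0] on an empty list raises IndexError
    ("list index out of range"); checking the length first avoids it.
    """
    if len(items) > 0:
        return items[0]
    return None
```

Callers then handle the `None` case explicitly instead of catching `IndexError`.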
Ishaan Jaff
d3d4124357
fix add better debugging _PROXY_track_cost_callback
2024-07-23 15:25:46 -07:00
Krrish Dholakia
271407400a
fix(utils.py): support raw response headers for streaming requests
2024-07-23 11:58:58 -07:00
Ishaan Jaff
fed7ad44ae
feat - set alert_to_webhook_url
2024-07-23 10:08:21 -07:00
Ishaan Jaff
f1e0919446
feat - add endpoint to set team callbacks
2024-07-22 18:18:09 -07:00
Ishaan Jaff
75e48c84f4
fix add fix to update spend logs
2024-07-19 12:49:23 -07:00
Ishaan Jaff
dbd4e7ac56
fix ui - make default session 24 hours
2024-07-19 10:17:45 -07:00
Ishaan Jaff
372c83a148
fix ui make ui session last 24 hours
2024-07-18 18:22:40 -07:00
Ishaan Jaff
dfb5e816a6
Merge branch 'main' into litellm_run_moderation_check_on_embedding
2024-07-18 12:44:30 -07:00
Florian Greinacher
d83a22b392
feat(proxy): support hiding health check details
2024-07-18 17:21:12 +02:00
Ishaan Jaff
ebcd6ec232
fix run moderation check on embedding
2024-07-17 17:59:20 -07:00
Ishaan Jaff
7944450074
Merge pull request #4724 from BerriAI/litellm_Set_max_file_size_transc
...
[Feat] - set max file size on /audio/transcriptions
2024-07-15 20:42:24 -07:00
Ishaan Jaff
c755522724
Merge pull request #4719 from BerriAI/litellm_fix_audio_transcript
...
[Fix] /audio/transcription - don't write to the local file system
2024-07-15 20:05:42 -07:00
Ishaan Jaff
57f0633b99
use helper to check check_file_size_under_limit
2024-07-15 19:40:05 -07:00
Krrish Dholakia
cdfe1e6847
fix(utils.py): update get_model_info docstring
...
Fixes https://github.com/BerriAI/litellm/issues/4711
2024-07-15 18:18:50 -07:00
Ishaan Jaff
0b43d37f76
fix - don't write file.filename
2024-07-15 14:56:01 -07:00
Krrish Dholakia
e0c2940f22
fix(proxy_server.py): fix returning response headers on exception
2024-07-13 19:11:30 -07:00
Krrish Dholakia
17635450cd
feat(proxy_server.py): return 'retry-after' param for rate limited requests
...
Closes https://github.com/BerriAI/litellm/issues/4695
2024-07-13 17:15:20 -07:00
Krrish Dholakia
d4ba87840e
fix(proxy_server.py): fix linting errors
2024-07-11 22:12:33 -07:00
Krish Dholakia
35a17b7d99
Merge pull request #4669 from BerriAI/litellm_logging_only_masking
...
Flag for PII masking on Logging only
2024-07-11 22:03:37 -07:00
Krish Dholakia
8a4c428a7c
Merge branch 'main' into litellm_call_id_in_response
2024-07-11 21:54:49 -07:00
Krish Dholakia
12e0f50812
Merge pull request #4651 from msabramo/docs-logging-cleanup
...
Docs: Miscellaneous cleanup of `docs/my-website/docs/proxy/logging.md`
2024-07-11 21:52:20 -07:00
Krrish Dholakia
abd682323c
feat(guardrails): Flag for PII Masking on Logging
...
Fixes https://github.com/BerriAI/litellm/issues/4580
2024-07-11 16:09:34 -07:00
Ishaan Jaff
92228d9104
Merge pull request #4647 from msabramo/msabramo/remove-unnecessary-imports
...
Remove unnecessary imports
2024-07-11 15:07:30 -07:00
Krrish Dholakia
7b38278e69
docs(model_management.md): update docs to clarify calling /model/info
2024-07-11 09:47:50 -07:00
Krish Dholakia
f4d140efec
Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
...
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
Krrish Dholakia
48be4ce805
feat(proxy_server.py): working /v1/messages with config.yaml
...
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
Krrish Dholakia
4ba30abb63
feat(proxy_server.py): working /v1/messages endpoint
...
Works with claude engineer
2024-07-10 18:15:38 -07:00
Marc Abramowitz
2ce0edcca9
Move JSX stuff so first line of file is heading
...
This prevents VS Code from displaying a warning about the file not starting with
a heading.
2024-07-10 17:02:56 -07:00
Ishaan Jaff
cc4434192f
fix test routes on litellm proxy
2024-07-10 16:51:47 -07:00
Ishaan Jaff
4675983f42
Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
...
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
Marc Abramowitz
be4b7629b5
Proxy: Add x-litellm-call-id response header
...
This gives the value of `logging_obj.litellm_call_id` and one particular use of
this is to correlate the HTTP response from a request with a trace in an LLM
logging tool like Langfuse, Langsmith, etc.
For example, if a user in my environment (w/ Langfuse) gets back this in the
response headers:
```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```
then they know that they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e
They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
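The correlation flow described in the commit above can be sketched as follows. Only the `x-litellm-call-id` header and the trace-URL shape come from the commit message; the helper name and the Langfuse base URL are hypothetical:

```python
def trace_url_from_headers(headers, langfuse_base="https://langfuse.domain.com"):
    """Build a Langfuse trace URL from a proxy response's headers.

    `headers` is any mapping of HTTP response headers; per the commit,
    the proxy sets x-litellm-call-id to logging_obj.litellm_call_id.
    Returns None when the header is absent.
    """
    call_id = headers.get("x-litellm-call-id")
    if call_id is None:
        return None
    return f"{langfuse_base}/trace/{call_id}"

# Example, using the header value shown in the commit message:
headers = {"x-litellm-call-id": "ffcb49e7-bd6e-4e56-9c08-a7243802b26e"}
print(trace_url_from_headers(headers))
```

Note that a plain `dict` lookup is case-sensitive; real HTTP client libraries usually expose headers through a case-insensitive mapping.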
Marc Abramowitz
416bca4a3f
Remove unnecessary imports
...
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
f118123ae1
add /files endpoints
2024-07-10 14:55:10 -07:00
Krrish Dholakia
3f4f5ae994
fix(proxy_server.py): fix proxy_server.py premium user check for encrypted license key
2024-07-10 12:25:31 -07:00
Ishaan Jaff
a9e15dad62
feat - add DELETE assistants endpoint
2024-07-10 11:37:37 -07:00
Ishaan Jaff
a784b54245
feat - add mgtm endpoint routes
2024-07-09 15:29:41 -07:00
Ishaan Jaff
6000687601
Merge pull request #4627 from BerriAI/litellm_fix_thread_auth
...
[Fix] Authentication on /thread endpoints on Proxy
2024-07-09 12:19:19 -07:00