Ishaan Jaff
693bcfac39
fix using pass_through_all_models
2024-07-25 19:32:49 -07:00
Ishaan Jaff
8f4c5437b8
router support setting pass_through_all_models
2024-07-25 18:34:12 -07:00
Krrish Dholakia
bd7af04a72
feat(proxy_server.py): support custom llm handler on proxy
2024-07-25 17:56:34 -07:00
Krrish Dholakia
bfdda089c8
fix(proxy_server.py): check if input list > 0 before indexing into it
...
resolves 'list index out of range' error
2024-07-25 14:23:07 -07:00
Marc Abramowitz
6faaa8aa50
Allow not displaying feedback box
...
by setting an env var called `LITELLM_DONT_SHOW_FEEDBACK_BOX` to `"true"`.
I liked the feedback box when I first started using LiteLLM, because it showed
me that the authors care about customers. But now that I've seen it a bunch of
times, I don't need to see it every time I start the server and I'd rather have
less output on startup.
2024-07-24 16:50:10 -07:00
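A minimal sketch of how this commit's option would be used, assuming the env var is read at server startup exactly as the message describes (the `litellm` invocation below is illustrative):

```shell
# Suppress the startup feedback box by setting the env var to the
# string "true" before launching the proxy.
export LITELLM_DONT_SHOW_FEEDBACK_BOX="true"
# litellm --port 4000   # hypothetical launch; feedback box is now hidden
echo "$LITELLM_DONT_SHOW_FEEDBACK_BOX"
```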
Ishaan Jaff
4c1ee1e282
fix: add better debugging for _PROXY_track_cost_callback
2024-07-23 15:25:46 -07:00
Krrish Dholakia
f64a3309d1
fix(utils.py): support raw response headers for streaming requests
2024-07-23 11:58:58 -07:00
Ishaan Jaff
d116ff280e
feat - set alert_to_webhook_url
2024-07-23 10:08:21 -07:00
Ishaan Jaff
c34c123fe3
feat - add endpoint to set team callbacks
2024-07-22 18:18:09 -07:00
Ishaan Jaff
df1ac92222
fix: update spend logs
2024-07-19 12:49:23 -07:00
Ishaan Jaff
ae316d2d9a
fix ui - make default session 24 hours
2024-07-19 10:17:45 -07:00
Ishaan Jaff
51525254e8
fix ui - make ui session last 24 hours
2024-07-18 18:22:40 -07:00
Ishaan Jaff
eedacf5193
Merge branch 'main' into litellm_run_moderation_check_on_embedding
2024-07-18 12:44:30 -07:00
Florian Greinacher
f8bec3a86c
feat(proxy): support hiding health check details
2024-07-18 17:21:12 +02:00
Ishaan Jaff
9753c3676a
fix: run moderation check on embedding
2024-07-17 17:59:20 -07:00
Ishaan Jaff
254ac37f65
Merge pull request #4724 from BerriAI/litellm_Set_max_file_size_transc
...
[Feat] - set max file size on /audio/transcriptions
2024-07-15 20:42:24 -07:00
Ishaan Jaff
979b5d8eea
Merge pull request #4719 from BerriAI/litellm_fix_audio_transcript
...
[Fix] /audio/transcription - don't write to the local file system
2024-07-15 20:05:42 -07:00
Ishaan Jaff
b5a2090720
use helper check_file_size_under_limit
2024-07-15 19:40:05 -07:00
Krrish Dholakia
9cc2daeec9
fix(utils.py): update get_model_info docstring
...
Fixes https://github.com/BerriAI/litellm/issues/4711
2024-07-15 18:18:50 -07:00
Ishaan Jaff
a900f352b5
fix - don't write file.filename
2024-07-15 14:56:01 -07:00
Krrish Dholakia
de8230ed41
fix(proxy_server.py): fix returning response headers on exception
2024-07-13 19:11:30 -07:00
Krrish Dholakia
fde434be66
feat(proxy_server.py): return 'retry-after' param for rate limited requests
...
Closes https://github.com/BerriAI/litellm/issues/4695
2024-07-13 17:15:20 -07:00
Krrish Dholakia
cff66d6151
fix(proxy_server.py): fix linting errors
2024-07-11 22:12:33 -07:00
Krish Dholakia
d72bcdbce3
Merge pull request #4669 from BerriAI/litellm_logging_only_masking
...
Flag for PII masking on Logging only
2024-07-11 22:03:37 -07:00
Krish Dholakia
72f1c9181d
Merge branch 'main' into litellm_call_id_in_response
2024-07-11 21:54:49 -07:00
Krish Dholakia
79d6b69d1c
Merge pull request #4651 from msabramo/docs-logging-cleanup
...
Docs: Miscellaneous cleanup of `docs/my-website/docs/proxy/logging.md`
2024-07-11 21:52:20 -07:00
Krrish Dholakia
9deb9b4e3f
feat(guardrails): Flag for PII Masking on Logging
...
Fixes https://github.com/BerriAI/litellm/issues/4580
2024-07-11 16:09:34 -07:00
Ishaan Jaff
28cfca87c1
Merge pull request #4647 from msabramo/msabramo/remove-unnecessary-imports
...
Remove unnecessary imports
2024-07-11 15:07:30 -07:00
Krrish Dholakia
070ab9f469
docs(model_management.md): update docs to clarify calling /model/info
2024-07-11 09:47:50 -07:00
Krish Dholakia
dacce3d78b
Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
...
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
Krrish Dholakia
31829855c0
feat(proxy_server.py): working /v1/messages with config.yaml
...
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
Krrish Dholakia
2f8dbbeb97
feat(proxy_server.py): working /v1/messages endpoint
...
Works with claude engineer
2024-07-10 18:15:38 -07:00
Marc Abramowitz
dd0c07d2a1
Move JSX stuff so first line of file is heading
...
This prevents VS Code from displaying a warning about the file not starting with
a heading.
2024-07-10 17:02:56 -07:00
Ishaan Jaff
265ec00d0f
fix test routes on litellm proxy
2024-07-10 16:51:47 -07:00
Ishaan Jaff
a313174ecb
Merge pull request #4648 from BerriAI/litellm_add_remaining_file_endpoints
...
[Feat] Add LIST, DELETE, GET `/files`
2024-07-10 16:42:05 -07:00
Marc Abramowitz
3a2cb151aa
Proxy: Add x-litellm-call-id response header
...
This gives the value of `logging_obj.litellm_call_id` and one particular use of
this is to correlate the HTTP response from a request with a trace in an LLM
logging tool like Langfuse, Langsmith, etc.
For example, if a user in my environment (w/ Langfuse) gets back this in the
response headers:
```
x-litellm-call-id: ffcb49e7-bd6e-4e56-9c08-a7243802b26e
```
then they know that they can see the trace for this request in Langfuse by
visiting https://langfuse.domain.com/trace/ffcb49e7-bd6e-4e56-9c08-a7243802b26e
They can also use this ID to submit scores for this request to the Langfuse
scoring API.
2024-07-10 16:05:37 -07:00
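The correlation described above can be sketched as follows; the base URL and the `langfuse_trace_url` helper are hypothetical, and only the `x-litellm-call-id` header name comes from the commit:

```python
# Sketch: build a Langfuse trace URL from the proxy's response headers.
# The header name matches the commit; the base URL is an assumption.
def langfuse_trace_url(headers: dict, base: str = "https://langfuse.domain.com") -> str:
    # The proxy sets x-litellm-call-id to logging_obj.litellm_call_id,
    # which is the same ID the logging tool records for the trace.
    call_id = headers["x-litellm-call-id"]
    return f"{base}/trace/{call_id}"

headers = {"x-litellm-call-id": "ffcb49e7-bd6e-4e56-9c08-a7243802b26e"}
print(langfuse_trace_url(headers))
```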
Marc Abramowitz
2db9c23bce
Remove unnecessary imports
...
from `litellm/proxy/proxy_server.py`
2024-07-10 15:06:47 -07:00
Ishaan Jaff
393ce7df14
add /files endpoints
2024-07-10 14:55:10 -07:00
Krrish Dholakia
aace0b22a3
fix(proxy_server.py): fix proxy_server.py premium user check for encrypted license key
2024-07-10 12:25:31 -07:00
Ishaan Jaff
62f475919b
feat - add DELETE assistants endpoint
2024-07-10 11:37:37 -07:00
Ishaan Jaff
22df67edb7
feat - add mgtm endpoint routes
2024-07-09 15:29:41 -07:00
Ishaan Jaff
6bce7e73a3
Merge pull request #4627 from BerriAI/litellm_fix_thread_auth
...
[Fix] Authentication on /thread endpoints on Proxy
2024-07-09 12:19:19 -07:00
Ishaan Jaff
12f207b499
feat - support /create assistants endpoint
2024-07-09 10:03:47 -07:00
Ishaan Jaff
6479442d5b
feat - support acreate_assistants endpoint
2024-07-09 09:49:38 -07:00
Krrish Dholakia
3045a2d9b3
fix(proxy_server.py): add license protection for 'allowed_ip' address feature
2024-07-08 16:04:44 -07:00
Ishaan Jaff
25ddbbb41d
raise budget_exceeded in user_api_key_auth
2024-07-08 12:45:39 -07:00
Krrish Dholakia
20e39d6acc
fix(utils.py): cleanup 'additionalProperties=False' for tool calling with zod
...
Fixes issue with zod passing in additionalProperties=False, causing vertex ai / gemini calls to fail
2024-07-06 17:27:37 -07:00
Ishaan Jaff
d61cc598b0
Merge pull request #4576 from BerriAI/litellm_encrypt_decrypt_using_salt
...
[Refactor] Use helper function to encrypt/decrypt model credentials
2024-07-06 15:11:09 -07:00
Krish Dholakia
127f08ee67
Merge branch 'main' into litellm_tts_pricing
2024-07-06 14:57:34 -07:00
Ishaan Jaff
7f6f85ec3c
improve sign up flow - show missing env vars
2024-07-06 13:57:19 -07:00