Krish Dholakia
88a2428ee6
Merge branch 'main' into litellm_anthropic_api_streaming
2024-08-03 21:16:50 -07:00
Krrish Dholakia
5810708c71
feat(anthropic_adapter.py): support streaming requests for /v1/messages endpoint
...
Fixes https://github.com/BerriAI/litellm/issues/5011
2024-08-03 20:16:19 -07:00
Ishaan Jaff
c206a9c2d7
set native vertex endpoints
2024-08-03 16:52:03 -07:00
Krrish Dholakia
e6bc7e938a
fix(utils.py): handle scenario where model="azure/*" and custom_llm_provider="azure"
...
Fixes https://github.com/BerriAI/litellm/issues/4912
2024-08-02 17:48:53 -07:00
Krrish Dholakia
e60bec92ac
feat(ui): add ability to enable traceloop + langsmith via ui
2024-07-31 21:40:29 -07:00
Krrish Dholakia
c5e38b3bec
feat(ui): add braintrust logging to ui
2024-07-31 21:15:19 -07:00
Ishaan Jaff
4f21ce2873
Merge pull request #4987 from BerriAI/litellm_add_ft_endpoints
...
[Feat-Proxy] Add List fine-tuning jobs
2024-07-31 16:49:59 -07:00
Ishaan Jaff
f90f3f28fb
fix fine tuning endpoint position on swagger
2024-07-31 16:46:58 -07:00
Ishaan Jaff
8a266a0493
fix route ordering
2024-07-31 16:36:50 -07:00
Ishaan Jaff
d7c0e0c471
allow setting files config
2024-07-31 15:28:41 -07:00
Krrish Dholakia
368af6975a
fix(http_handler.py): correctly re-raise timeout exception
2024-07-31 14:51:28 -07:00
Ishaan Jaff
967d18d03d
read ft config
2024-07-31 13:31:00 -07:00
Ishaan Jaff
2a87aa0463
fix endpoint to create fine tuning jobs
2024-07-31 12:44:01 -07:00
Ishaan Jaff
0a3dbe0659
fix linting errors
2024-07-30 12:51:39 -07:00
Ishaan Jaff
b64e0fad81
test batches endpoint on proxy
2024-07-30 09:46:30 -07:00
Ishaan Jaff
93f3143dc5
Merge pull request #4916 from BerriAI/litellm_fix_ui_login
...
Feat UI - allow using custom header for litellm api key
2024-07-29 17:08:53 -07:00
Ishaan Jaff
34ad3dec68
check litellm header in login on ui
2024-07-29 17:03:04 -07:00
Ishaan Jaff
096844c258
Merge pull request #4927 from BerriAI/litellm_set_max_request_response_size_ui
...
Feat Enterprise - set max request / response size UI
2024-07-27 20:06:09 -07:00
Ishaan Jaff
64bc224d63
Merge pull request #4928 from BerriAI/litellm_check_response_size
...
[Feat Enterprise] - check max response size
2024-07-27 17:03:56 -07:00
Ishaan Jaff
b5451eaf21
allow setting max request / response size on admin UI
2024-07-27 17:00:39 -07:00
Ishaan Jaff
805d04f7f3
feat - add check_response_size_is_safe
2024-07-27 16:53:39 -07:00
Ishaan Jaff
aade38760d
use common helpers for writing to otel
2024-07-27 11:40:39 -07:00
Ishaan Jaff
cde46a4a09
feat - use log_to_opentelemetry for _PROXY_track_cost_callback
2024-07-27 11:08:22 -07:00
Ishaan Jaff
e3a66f2c62
feat - clearly show LiteLLM Enterprise version
2024-07-27 09:50:03 -07:00
Krish Dholakia
fb80839e8c
Merge pull request #4907 from BerriAI/litellm_proxy_get_secret
...
fix(proxy_server.py): fix get secret for environment_variables
2024-07-26 22:17:11 -07:00
Ishaan Jaff
56cf8e2798
feat - link to model cost map on swagger
2024-07-26 21:34:42 -07:00
Ishaan Jaff
864f803ccf
fix for GET /v1/batches{batch_id:path}
2024-07-26 18:23:15 -07:00
Ishaan Jaff
2b889b83b3
fix /v1/batches POST
2024-07-26 18:06:00 -07:00
Krrish Dholakia
1a172b7636
fix(proxy_server.py): fix get secret for environment_variables
2024-07-26 13:33:02 -07:00
Krrish Dholakia
9d87767639
feat(proxy_server.py): handle pydantic mockselvar error
...
Fixes https://github.com/BerriAI/litellm/issues/4898#issuecomment-2252105485
2024-07-26 08:38:51 -07:00
Ishaan Jaff
1103c614a0
Merge branch 'main' into litellm_proxy_support_all_providers
2024-07-25 20:15:37 -07:00
Ishaan Jaff
d589d8e4ac
fix using pass_through_all_models
2024-07-25 19:32:49 -07:00
Ishaan Jaff
a46c463dee
router support setting pass_through_all_models
2024-07-25 18:34:12 -07:00
Krrish Dholakia
84ef8c11ff
feat(proxy_server.py): support custom llm handler on proxy
2024-07-25 17:56:34 -07:00
Krrish Dholakia
85dd423de6
fix(proxy_server.py): check if input list > 0 before indexing into it
...
resolves 'list index out of range' error
2024-07-25 14:23:07 -07:00
Marc Abramowitz
bc00e4c9ff
Allow not displaying feedback box
...
by setting an env var called `LITELLM_DONT_SHOW_FEEDBACK_BOX` to `"true"`.
I liked the feedback box when I first started using LiteLLM, because it showed
me that the authors care about customers. But now that I've seen it a bunch of
times, I don't need to see it every time I start the server and I'd rather have
less output on startup.
2024-07-24 16:50:10 -07:00
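The commit above adds an opt-out env var for the startup feedback box. A minimal sketch of using it, assuming the variable is read at proxy startup and compared against the string `"true"` as the commit message states (the startup command shown is illustrative):

```shell
# Hide the LiteLLM startup feedback box, per the commit above.
# The value must be the string "true".
export LITELLM_DONT_SHOW_FEEDBACK_BOX="true"

# Then start the proxy as usual, e.g.:
# litellm --config config.yaml
echo "LITELLM_DONT_SHOW_FEEDBACK_BOX=$LITELLM_DONT_SHOW_FEEDBACK_BOX"
```

Unsetting the variable (or setting any other value) restores the default behavior of showing the box on startup.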
Ishaan Jaff
d3d4124357
fix - add better debugging for _PROXY_track_cost_callback
2024-07-23 15:25:46 -07:00
Krrish Dholakia
271407400a
fix(utils.py): support raw response headers for streaming requests
2024-07-23 11:58:58 -07:00
Ishaan Jaff
fed7ad44ae
feat - set alert_to_webhook_url
2024-07-23 10:08:21 -07:00
Ishaan Jaff
f1e0919446
feat - add endpoint to set team callbacks
2024-07-22 18:18:09 -07:00
Ishaan Jaff
75e48c84f4
fix - update spend logs
2024-07-19 12:49:23 -07:00
Ishaan Jaff
dbd4e7ac56
fix ui - make default session 24 hours
2024-07-19 10:17:45 -07:00
Ishaan Jaff
372c83a148
fix ui - make ui session last 24 hours
2024-07-18 18:22:40 -07:00
Ishaan Jaff
dfb5e816a6
Merge branch 'main' into litellm_run_moderation_check_on_embedding
2024-07-18 12:44:30 -07:00
Florian Greinacher
d83a22b392
feat(proxy): support hiding health check details
2024-07-18 17:21:12 +02:00
Ishaan Jaff
ebcd6ec232
fix - run moderation check on embedding
2024-07-17 17:59:20 -07:00
Ishaan Jaff
7944450074
Merge pull request #4724 from BerriAI/litellm_Set_max_file_size_transc
...
[Feat] - set max file size on /audio/transcriptions
2024-07-15 20:42:24 -07:00
Ishaan Jaff
c755522724
Merge pull request #4719 from BerriAI/litellm_fix_audio_transcript
...
[Fix] /audio/transcription - don't write to the local file system
2024-07-15 20:05:42 -07:00
Ishaan Jaff
57f0633b99
use helper check_file_size_under_limit
2024-07-15 19:40:05 -07:00
Krrish Dholakia
cdfe1e6847
fix(utils.py): update get_model_info docstring
...
Fixes https://github.com/BerriAI/litellm/issues/4711
2024-07-15 18:18:50 -07:00