ac6c39c283 | 2024-08-03 20:16:19 -07:00 | Krrish Dholakia | feat(anthropic_adapter.py): support streaming requests for /v1/messages endpoint (Fixes https://github.com/BerriAI/litellm/issues/5011)
5d96ff6694 | 2024-08-02 17:48:53 -07:00 | Krrish Dholakia | fix(utils.py): handle scenario where model="azure/*" and custom_llm_provider="azure" (Fixes https://github.com/BerriAI/litellm/issues/4912)
d8a8cd2961 | 2024-07-31 21:40:29 -07:00 | Krrish Dholakia | feat(ui): add ability to enable traceloop + langsmith via ui
d914aa558d | 2024-07-31 21:15:19 -07:00 | Krrish Dholakia | feat(ui): add braintrust logging to ui
d833c69acb | 2024-07-31 16:49:59 -07:00 | Ishaan Jaff | Merge pull request #4987 from BerriAI/litellm_add_ft_endpoints ([Feat-Proxy] Add List fine-tuning jobs)
cd3a01c03f | 2024-07-31 16:46:58 -07:00 | Ishaan Jaff | fix fine tuning endpoint position on swagger
424a70c331 | 2024-07-31 16:36:50 -07:00 | Ishaan Jaff | fix routes order
2371df9ded | 2024-07-31 15:28:41 -07:00 | Ishaan Jaff | allow setting files config
6202f9bbb0 | 2024-07-31 14:51:28 -07:00 | Krrish Dholakia | fix(http_handler.py): correctly re-raise timeout exception
bd7b485d09 | 2024-07-31 13:31:00 -07:00 | Ishaan Jaff | read ft config
9d90f174a7 | 2024-07-31 12:44:01 -07:00 | Ishaan Jaff | fix endpoint to create fine tuning jobs
66211b42db | 2024-07-30 12:51:39 -07:00 | Ishaan Jaff | fix linting errors
563d59a305 | 2024-07-30 09:46:30 -07:00 | Ishaan Jaff | test batches endpoint on proxy
1a34756159 | 2024-07-29 17:08:53 -07:00 | Ishaan Jaff | Merge pull request #4916 from BerriAI/litellm_fix_ui_login (Feat UI - allow using custom header for litellm api key)
0c25aaf9df | 2024-07-29 17:03:04 -07:00 | Ishaan Jaff | check litellm header in login on ui
9b69e500e5 | 2024-07-27 20:06:09 -07:00 | Ishaan Jaff | Merge pull request #4927 from BerriAI/litellm_set_max_request_response_size_ui (Feat Enterprise - set max request / response size UI)
10e70f842d | 2024-07-27 17:03:56 -07:00 | Ishaan Jaff | Merge pull request #4928 from BerriAI/litellm_check_response_size ([Feat Enterprise] - check max response size)
3511aadf99 | 2024-07-27 17:00:39 -07:00 | Ishaan Jaff | allow setting max request / response size on admin UI
b2f72338f6 | 2024-07-27 16:53:39 -07:00 | Ishaan Jaff | feat check check_response_size_is_safe
19fb5cc11c | 2024-07-27 11:40:39 -07:00 | Ishaan Jaff | use common helpers for writing to otel
61c10e60a4 | 2024-07-27 11:08:22 -07:00 | Ishaan Jaff | feat - use log_to_opentelemetry for _PROXY_track_cost_callback
1adf71b9b7 | 2024-07-27 09:50:03 -07:00 | Ishaan Jaff | feat - clearly show version litellm enterprise
9bdcef238b | 2024-07-26 22:17:11 -07:00 | Krish Dholakia | Merge pull request #4907 from BerriAI/litellm_proxy_get_secret (fix(proxy_server.py): fix get secret for environment_variables)
2501b4eccd | 2024-07-26 21:34:42 -07:00 | Ishaan Jaff | feat link to model cost map on swagger
f627fa9b40 | 2024-07-26 18:23:15 -07:00 | Ishaan Jaff | fix for GET /v1/batches{batch_id:path}
159a880dcc | 2024-07-26 18:06:00 -07:00 | Ishaan Jaff | fix /v1/batches POST
9943c6d607 | 2024-07-26 13:33:02 -07:00 | Krrish Dholakia | fix(proxy_server.py): fix get secret for environment_variables
1d6c39a607 | 2024-07-26 08:38:51 -07:00 | Krrish Dholakia | feat(proxy_server.py): handle pydantic mockselvar error (Fixes https://github.com/BerriAI/litellm/issues/4898#issuecomment-2252105485)
079a41fbe1 | 2024-07-25 20:15:37 -07:00 | Ishaan Jaff | Merge branch 'main' into litellm_proxy_support_all_providers
693bcfac39 | 2024-07-25 19:32:49 -07:00 | Ishaan Jaff | fix using pass_through_all_models
8f4c5437b8 | 2024-07-25 18:34:12 -07:00 | Ishaan Jaff | router support setting pass_through_all_models
bd7af04a72 | 2024-07-25 17:56:34 -07:00 | Krrish Dholakia | feat(proxy_server.py): support custom llm handler on proxy
bfdda089c8 | 2024-07-25 14:23:07 -07:00 | Krrish Dholakia | fix(proxy_server.py): check if input list > 0 before indexing into it (resolves 'list index out of range' error)
4c1ee1e282 | 2024-07-23 15:25:46 -07:00 | Ishaan Jaff | fix add better debugging _PROXY_track_cost_callback
f64a3309d1 | 2024-07-23 11:58:58 -07:00 | Krrish Dholakia | fix(utils.py): support raw response headers for streaming requests
d116ff280e | 2024-07-23 10:08:21 -07:00 | Ishaan Jaff | feat - set alert_to_webhook_url
c34c123fe3 | 2024-07-22 18:18:09 -07:00 | Ishaan Jaff | feat - add endpoint to set team callbacks
df1ac92222 | 2024-07-19 12:49:23 -07:00 | Ishaan Jaff | fix add fix to update spend logs
ae316d2d9a | 2024-07-19 10:17:45 -07:00 | Ishaan Jaff | fix ui - make default session 24 hours
51525254e8 | 2024-07-18 18:22:40 -07:00 | Ishaan Jaff | fix ui make ui session last 24 hours
eedacf5193 | 2024-07-18 12:44:30 -07:00 | Ishaan Jaff | Merge branch 'main' into litellm_run_moderation_check_on_embedding
f8bec3a86c | 2024-07-18 17:21:12 +02:00 | Florian Greinacher | feat(proxy): support hiding health check details
9753c3676a | 2024-07-17 17:59:20 -07:00 | Ishaan Jaff | fix run moderation check on embedding
254ac37f65 | 2024-07-15 20:42:24 -07:00 | Ishaan Jaff | Merge pull request #4724 from BerriAI/litellm_Set_max_file_size_transc ([Feat] - set max file size on /audio/transcriptions)
979b5d8eea | 2024-07-15 20:05:42 -07:00 | Ishaan Jaff | Merge pull request #4719 from BerriAI/litellm_fix_audio_transcript ([Fix] /audio/transcription - don't write to the local file system)
b5a2090720 | 2024-07-15 19:40:05 -07:00 | Ishaan Jaff | use helper to check check_file_size_under_limit
9cc2daeec9 | 2024-07-15 18:18:50 -07:00 | Krrish Dholakia | fix(utils.py): update get_model_info docstring (Fixes https://github.com/BerriAI/litellm/issues/4711)
a900f352b5 | 2024-07-15 14:56:01 -07:00 | Ishaan Jaff | fix - don't write file.filename
de8230ed41 | 2024-07-13 19:11:30 -07:00 | Krrish Dholakia | fix(proxy_server.py): fix returning response headers on exception
fde434be66 | 2024-07-13 17:15:20 -07:00 | Krrish Dholakia | feat(proxy_server.py): return 'retry-after' param for rate limited requests (Closes https://github.com/BerriAI/litellm/issues/4695)