Commit graph

3055 commits

Author · SHA-1 · Message · Date
Ishaan Jaff
30b339739d add cancel endpoint 2024-07-31 16:12:36 -07:00
Ishaan Jaff
956a082215 add GET fine_tuning/jobs 2024-07-31 15:58:35 -07:00
Ishaan Jaff
7cded198c2 add test for ft endpoints on azure 2024-07-31 15:30:26 -07:00
Ishaan Jaff
91c787a77b add examples on config 2024-07-31 15:29:06 -07:00
Ishaan Jaff
d7c0e0c471 allow setting files config 2024-07-31 15:28:41 -07:00
Ishaan Jaff
647f436c51 add /fine_tuning/jobs routes 2024-07-31 13:50:51 -07:00
Ishaan Jaff
47cbd62347 validation for passing config file 2024-07-31 13:32:18 -07:00
Ishaan Jaff
967d18d03d read ft config 2024-07-31 13:31:00 -07:00
Ishaan Jaff
2a87aa0463 fix endpoint to create fine tuning jobs 2024-07-31 12:44:01 -07:00
Ishaan Jaff
b630ff6286 fix pydantic obj for FT endpoints 2024-07-31 12:41:39 -07:00
Ishaan Jaff
8e6b30ceea feat add POST /v1/fine_tuning/jobs 2024-07-31 11:57:52 -07:00
Ishaan Jaff
67bce867f6 Merge pull request #4975 from BerriAI/ui_switch_off_console_logs_in_prod
Proxy Admin UI - switch off console logs in production mode
2024-07-30 13:34:59 -07:00
Ishaan Jaff
bb8b2951aa ui new build 2024-07-30 13:33:34 -07:00
Ishaan Jaff
af89f1e283 Merge pull request #4973 from BerriAI/litellm_return_code_as_str
[Fix-Proxy] ProxyException code as str - Make OpenAI Compatible
2024-07-30 13:27:33 -07:00
Ishaan Jaff
0a3dbe0659 fix linting errors 2024-07-30 12:51:39 -07:00
Ishaan Jaff
d26ffbdf8c add docs on status code from exceptions 2024-07-30 12:38:33 -07:00
Ishaan Jaff
daa0b10f51 return ProxyException code as str 2024-07-30 12:35:46 -07:00
Ishaan Jaff
b64e0fad81 test batches endpoint on proxy 2024-07-30 09:46:30 -07:00
Ishaan Jaff
5979c17d07 ui new build 2024-07-29 21:20:27 -07:00
Ishaan Jaff
93f3143dc5 Merge pull request #4916 from BerriAI/litellm_fix_ui_login
Feat UI - allow using custom header for litellm api key
2024-07-29 17:08:53 -07:00
Ishaan Jaff
34ad3dec68 check litellm header in login on ui 2024-07-29 17:03:04 -07:00
Ishaan Jaff
59b91c35d7 better debugging for custom headers 2024-07-29 16:59:15 -07:00
Krrish Dholakia
7de70a19e4 fix(exceptions.py): use correct status code for content policy exceptions
Fixes https://github.com/BerriAI/litellm/issues/4941#issuecomment-2256578732
2024-07-29 12:01:54 -07:00
Ishaan Jaff
1985d6ce0e Merge pull request #4939 from BerriAI/litellm_log_transcription_resp_langfuse
[Feat-Proxy] - Langfuse log /audio/transcription on langfuse
2024-07-29 08:58:40 -07:00
Ishaan Jaff
4c427a3793 fix default input/output values for /audio/transcription logging 2024-07-29 08:03:08 -07:00
Ishaan Jaff
cc0e790863 log file_size_in_mb in metadata 2024-07-29 08:00:28 -07:00
Krrish Dholakia
80c3759719 fix(auth_checks.py): handle writing team object to redis caching correctly 2024-07-29 07:51:44 -07:00
Ishaan Jaff
096844c258 Merge pull request #4927 from BerriAI/litellm_set_max_request_response_size_ui
Feat Enterprise - set max request / response size UI
2024-07-27 20:06:09 -07:00
Ishaan Jaff
64bc224d63 Merge pull request #4928 from BerriAI/litellm_check_response_size
[Feat Enterprise] - check max response size
2024-07-27 17:03:56 -07:00
Ishaan Jaff
003108a074 Merge pull request #4926 from BerriAI/litellm_check_max_request_size
Proxy Enterprise - security - check max request size
2024-07-27 17:02:12 -07:00
Ishaan Jaff
b5451eaf21 allow setting max request / response size on admin UI 2024-07-27 17:00:39 -07:00
Ishaan Jaff
5cc97f3c5d set max_response_size_mb 2024-07-27 16:54:31 -07:00
Ishaan Jaff
805d04f7f3 feat check check_response_size_is_safe 2024-07-27 16:53:39 -07:00
Ishaan Jaff
5f07afa268 feat - check max response size 2024-07-27 16:53:00 -07:00
Ishaan Jaff
a18f5bd5c8 security - check max request size 2024-07-27 16:08:41 -07:00
Ishaan Jaff
ee11aff6e2 Merge pull request #4924 from BerriAI/litellm_log_writing_spend_to_db_otel
[Feat] - log writing BatchSpendUpdate events on OTEL
2024-07-27 16:07:56 -07:00
Ishaan Jaff
aade38760d use common helpers for writing to otel 2024-07-27 11:40:39 -07:00
Ishaan Jaff
bb7fc3e426 use _get_parent_otel_span_from_kwargs 2024-07-27 11:14:06 -07:00
Ishaan Jaff
cde46a4a09 feat - use log_to_opentelemetry for _PROXY_track_cost_callback 2024-07-27 11:08:22 -07:00
Krrish Dholakia
2c76524a19 build(model_prices_and_context_window.json): add mistral-large on vertex ai pricing 2024-07-27 10:37:18 -07:00
Ishaan Jaff
e3a66f2c62 feat - clearly show version litellm enterprise 2024-07-27 09:50:03 -07:00
Ishaan Jaff
a9561a1451 fix update public key 2024-07-27 09:45:58 -07:00
Krish Dholakia
fb80839e8c Merge pull request #4907 from BerriAI/litellm_proxy_get_secret
fix(proxy_server.py): fix get secret for environment_variables
2024-07-26 22:17:11 -07:00
Krish Dholakia
f011f48195 Merge pull request #4918 from BerriAI/litellm_ollama_tool_calling
feat(ollama_chat.py): support ollama tool calling
2024-07-26 22:16:58 -07:00
Krrish Dholakia
ce2cd73801 docs(ollama.md): add ollama tool calling to docs 2024-07-26 22:12:52 -07:00
Krrish Dholakia
3a1eedfbf3 feat(ollama_chat.py): support ollama tool calling
Closes https://github.com/BerriAI/litellm/issues/4812
2024-07-26 21:51:54 -07:00
Ishaan Jaff
56cf8e2798 feat link to model cost map on swagger 2024-07-26 21:34:42 -07:00
Ishaan Jaff
d98dd53755 add litellm_header_name endpoint 2024-07-26 21:04:31 -07:00
Ishaan Jaff
5ca8aa89e8 Merge pull request #4913 from BerriAI/litellm_fix_error_limit
[Proxy-Fix] - raise more descriptive errors when crossing tpm / rpm limits on keys, user, global limits
2024-07-26 20:25:28 -07:00
Ishaan Jaff
9a1b454ccc Merge pull request #4914 from BerriAI/litellm_fix_batches
[Proxy-Fix + Test] - /batches endpoint
2024-07-26 20:12:03 -07:00