Commit graph

318 commits

Author SHA1 Message Date
Paul Gauthier
90eb0ea022 Added support for the deepseek api 2024-05-07 11:44:03 -07:00
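The commit above adds a new provider; below is a minimal sketch of how a DeepSeek model would typically be called through litellm's `completion()` interface, assuming the conventional `deepseek/` model prefix and `DEEPSEEK_API_KEY` environment variable (both conventions, not details taken from the commit itself).

```python
# Hedged sketch: calling the DeepSeek API through litellm. The model name and the
# DEEPSEEK_API_KEY variable follow litellm's usual provider conventions and are
# assumptions here, not details from the commit.
import os
from litellm import completion

os.environ["DEEPSEEK_API_KEY"] = "sk-..."  # placeholder key

response = completion(
    model="deepseek/deepseek-chat",  # illustrative model id
    messages=[{"role": "user", "content": "Hello from litellm"}],
)
print(response.choices[0].message.content)
```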
Krish Dholakia
6be20f5fc6
Merge pull request #3455 from BerriAI/litellm_assistants_support
feat(openai.py): add support for openai assistants
2024-05-04 22:21:44 -07:00
Krrish Dholakia
681a95e37b fix(assistants/main.py): support litellm.create_thread() call 2024-05-04 19:35:37 -07:00
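A hedged sketch of the `litellm.create_thread()` call this commit supports, assuming the assistants interface is routed with `custom_llm_provider="openai"`; parameter names beyond that are assumptions.

```python
# Hedged sketch of litellm.create_thread(); the messages parameter shape is assumed
# to mirror chat messages and is not taken from the commit itself.
import litellm

thread = litellm.create_thread(
    custom_llm_provider="openai",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
)
print(thread)
```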
Ishaan Jaff
855c7caf0b fix add get_first_chars_messages in utils 2024-05-04 12:43:09 -07:00
Krish Dholakia
2200900ca2
Merge pull request #3393 from Priva28/main
Add Llama3 tokenizer and allow custom tokenizers.
2024-05-02 16:32:41 -07:00
Krrish Dholakia
16522a5351 fix(utils.py): add missing providers + models to validate_environment
Closes https://github.com/BerriAI/litellm/issues/3190
2024-05-02 08:14:45 -07:00
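A short sketch of `validate_environment()`, which the commit above extends with the missing providers and models; it reports whether the required provider keys are present in the environment (the model name and example output are illustrative).

```python
# Sketch: check which API keys are missing for a given model. The exact shape of the
# returned dict is illustrative.
import litellm

env_check = litellm.validate_environment(model="command-r")
print(env_check)  # e.g. {"keys_in_environment": False, "missing_keys": ["COHERE_API_KEY"]}
```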
Christian Privitelli
2d43153efa include methods in init import, add test, fix encode/decode param ordering 2024-05-02 15:49:22 +10:00
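A hedged sketch of the `encode`/`decode` helpers touched by the two commits above, using the `(model, text)` parameter ordering that the fix settles on; the Llama 3 model name is illustrative.

```python
# Sketch of litellm.encode() / litellm.decode() with the fixed parameter ordering.
import litellm

model = "meta-llama/Meta-Llama-3-8B-Instruct"  # illustrative model id
tokens = litellm.encode(model=model, text="hello world")
text = litellm.decode(model=model, tokens=tokens)
print(len(tokens), text)
```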
Krrish Dholakia
2a9651b3ca feat(openmeter.py): add support for user billing
open-meter supports user based billing. Closes https://github.com/BerriAI/litellm/issues/1268
2024-05-01 17:23:48 -07:00
Krrish Dholakia
b46db8b891 feat(utils.py): json logs for raw request sent by litellm
make it easier to view verbose logs in datadog
2024-04-29 19:21:19 -07:00
Krish Dholakia
1841b74f49
Merge branch 'main' into litellm_common_auth_params 2024-04-28 08:38:06 -07:00
Ishaan Jaff
6762d07c7f
Merge pull request #3330 from BerriAI/litellm_rdct_msgs
[Feat] Redact Logging Messages/Response content on Logging Providers with `litellm.turn_off_message_logging=True`
2024-04-27 11:25:09 -07:00
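A minimal sketch of the flag named in the PR title above: message and response content is redacted before it reaches logging callbacks (the `langfuse` callback is just an example).

```python
# Redact message/response content on logging providers, per the flag named in the PR.
import litellm

litellm.turn_off_message_logging = True
litellm.success_callback = ["langfuse"]  # any logging callback; langfuse is illustrative

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "sensitive content"}],
)
```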
Krrish Dholakia
48f19cf839 feat(utils.py): unify common auth params across azure/vertex_ai/bedrock/watsonx 2024-04-27 11:06:18 -07:00
Ishaan Jaff
8b6d686e52 feat - turn_off_message_logging 2024-04-27 10:01:09 -07:00
Krish Dholakia
2d976cfabc
Merge pull request #3270 from simonsanvil/feature/watsonx-integration
(feat) add IBM watsonx.ai as an llm provider
2024-04-27 05:48:34 -07:00
Krrish Dholakia
180718c33f fix(router.py): support verify_ssl flag
Fixes https://github.com/BerriAI/litellm/issues/3162#issuecomment-2075273807
2024-04-26 15:38:01 -07:00
Simon Sanchez Viloria
74d2ba0a23 feat - watsonx refactoring, removed dependency, and added support for embedding calls 2024-04-23 12:01:13 +02:00
Simon Sanchez Viloria
6edb133733 Added support for IBM watsonx.ai models 2024-04-20 20:06:46 +02:00
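A hedged sketch of calling the IBM watsonx.ai provider added here, assuming the `watsonx/` model prefix; the environment variable names and model id are assumptions, not taken from the commit.

```python
# Hedged sketch: IBM watsonx.ai via litellm. The variable names and the model id below
# are assumptions for illustration only.
import os
from litellm import completion

os.environ["WATSONX_URL"] = "https://us-south.ml.cloud.ibm.com"  # assumed name
os.environ["WATSONX_APIKEY"] = "..."                             # assumed name
os.environ["WATSONX_PROJECT_ID"] = "..."                         # assumed name

response = completion(
    model="watsonx/ibm/granite-13b-chat-v2",  # illustrative model id
    messages=[{"role": "user", "content": "Hello"}],
)
```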
Ishaan Jaff
2c76448756 fix - allow users to opt into langfuse default tags 2024-04-19 16:01:27 -07:00
Krrish Dholakia
3c6b6355c7 fix(ollama_chat.py): accept api key as a param for ollama calls
allows user to call hosted ollama endpoint using bearer token for auth
2024-04-19 13:02:13 -07:00
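A sketch of the change above: a bearer token passed via `api_key` when calling a hosted Ollama endpoint (URL, model, and token are placeholders).

```python
# Sketch: hosted Ollama endpoint with bearer-token auth via the api_key parameter.
from litellm import completion

response = completion(
    model="ollama_chat/llama3",                       # illustrative model id
    api_base="https://my-hosted-ollama.example.com",  # placeholder endpoint
    api_key="my-bearer-token",                        # placeholder token
    messages=[{"role": "user", "content": "Hello"}],
)
```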
Ishaan Jaff
462da5a778 fix - support base64 image conversion for all gemini models 2024-04-15 18:18:55 -07:00
Krrish Dholakia
4e81acf2c6 feat(prometheus_services.py): monitor health of proxy adjacent services (redis / postgres / etc.) 2024-04-13 18:15:02 -07:00
Krrish Dholakia
df62f931e7 fix(proxy_server.py): allow 'upperbound_key_generate' params to be set via 'os.environ/' 2024-04-09 07:48:29 -07:00
Krrish Dholakia
b6cd200676 fix(llm_guard.py): enable request-specific llm guard flag 2024-04-08 21:15:33 -07:00
Krrish Dholakia
460546956d fix(utils.py): fix import 2024-04-06 18:37:38 -07:00
Krrish Dholakia
a410981972 fix(utils.py): fix circular import 2024-04-06 18:29:51 -07:00
Krrish Dholakia
6110d32b1c feat(proxy/utils.py): return api base for request hanging alerts 2024-04-06 15:58:53 -07:00
Ishaan Jaff
2174b240d8
Merge pull request #2861 from BerriAI/litellm_add_azure_command_r_plust
[FEAT] add azure command-r-plus
2024-04-05 15:13:35 -07:00
Ishaan Jaff
5ce80d82d3 fix support azure/mistral models 2024-04-05 09:32:39 -07:00
Krrish Dholakia
f0c4ff6e60 fix(vertex_ai_anthropic.py): support streaming, async completion, async streaming for vertex ai anthropic 2024-04-05 09:27:48 -07:00
Krish Dholakia
eb34306099
Merge pull request #2665 from BerriAI/litellm_claude_vertex_ai
[WIP] feat(vertex_ai_anthropic.py): Add support for claude 3 on vertex ai
2024-04-05 07:06:04 -07:00
Krrish Dholakia
1d341970ba feat(vertex_ai_anthropic.py): add claude 3 on vertex ai support - working .completions call
.completions() call works
2024-04-02 22:07:39 -07:00
RaGe
c16833e73c (fix) add vertex_language_models to model_list 2024-04-02 20:02:46 -04:00
RaGe
a250aedf71 (fix) restore missing comma 2024-04-02 20:02:28 -04:00
Krrish Dholakia
203e2776f8 fix(proxy_server.py): allow user to set in-memory + redis ttl
addresses - https://github.com/BerriAI/litellm/issues/2700
2024-04-01 19:13:23 -07:00
Krrish Dholakia
5280fc809f fix(proxy_server.py): enforce end user budgets with 'litellm.max_end_user_budget' param 2024-03-29 17:14:40 -07:00
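A minimal sketch of the `litellm.max_end_user_budget` setting named above; the flag is enforced by the proxy, and the value here is a placeholder (on a deployed proxy it would normally be set under `litellm_settings` in the config).

```python
# Hedged sketch: cap spend per end user. The budget value is a placeholder; enforcement
# happens on the proxy, which tracks spend against the request's 'user' field.
import litellm

litellm.max_end_user_budget = 0.0001  # max spend (USD) per end user
```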
Krrish Dholakia
6d418a2920 fix(llm_guard.py): working llm-guard 'key-specific' mode 2024-03-26 17:47:20 -07:00
Krrish Dholakia
e10eb8f6fe feat(llm_guard.py): enable key-specific llm guard check 2024-03-26 17:21:51 -07:00
Krrish Dholakia
f153889738 fix(utils.py): allow user to disable streaming logging
fixes event loop issue for litellm.disable_streaming_logging
2024-03-25 14:28:46 -07:00
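A minimal sketch of setting the flag named in the commit body above; per the commit, it lets a user turn off logging for streamed responses.

```python
# Hedged sketch: disable logging of streamed responses, per the flag named above.
import litellm

litellm.disable_streaming_logging = True
```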
Krrish Dholakia
bc66ef9d5c fix(utils.py): fix aws secret manager + support key_management_settings
fixes the aws secret manager implementation and allows the user to set which keys they want to check through it
2024-03-16 16:47:50 -07:00
Krrish Dholakia
9909f44015 feat(utils.py): add native fireworks ai support
addresses - https://github.com/BerriAI/litellm/issues/777, https://github.com/BerriAI/litellm/issues/2486
2024-03-15 09:09:59 -07:00
Krrish Dholakia
a634424fb2 fix(utils.py): move to using litellm.modify_params to enable max output token trimming fix 2024-03-14 12:17:56 -07:00
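A short sketch of the `litellm.modify_params` flag referenced above: when enabled, litellm is allowed to adjust provider-specific parameters (here, trimming an over-large `max_tokens`) rather than failing the request; the model and values are illustrative.

```python
# Hedged sketch: allow litellm to modify params (e.g. trim max output tokens) instead
# of passing through a value the provider would reject.
import litellm

litellm.modify_params = True

response = litellm.completion(
    model="claude-3-opus-20240229",  # illustrative model
    messages=[{"role": "user", "content": "Hello"}],
    max_tokens=100000,  # larger than the provider allows; litellm may trim it
)
```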
Ishaan Jaff
5172fb1de9
Merge pull request #2474 from BerriAI/litellm_support_command_r
[New-Model] Cohere/command-r
2024-03-12 11:11:56 -07:00
ishaan-jaff
777cf094e5 (feat) use model json to get cohere_models 2024-03-12 10:53:26 -07:00
ishaan-jaff
042a71cdc7 (feat) v0 support command-r 2024-03-12 10:26:58 -07:00
ishaan-jaff
b193b01f40 (feat) support azure/gpt-instruct models 2024-03-12 09:30:15 -07:00
Krrish Dholakia
daa371ade9 fix(utils.py): add support for anthropic params in get_supported_openai_params 2024-03-08 23:06:40 -08:00
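A sketch of `get_supported_openai_params()`, which the commit above extends with anthropic params; it returns the OpenAI-style parameters a provider/model pair accepts (the model name and sample output are illustrative).

```python
# Sketch: list the OpenAI-compatible parameters supported for an anthropic model.
import litellm

params = litellm.get_supported_openai_params(
    model="claude-3-opus-20240229",  # illustrative model
    custom_llm_provider="anthropic",
)
print(params)  # e.g. ["stream", "max_tokens", "temperature", ...]
```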
Krrish Dholakia
0e7b30bec9 fix(utils.py): return function name for ollama_chat function calls 2024-03-08 08:01:10 -08:00
Krish Dholakia
5b3459d759
Merge branch 'main' into litellm_claude_3_bedrock_access 2024-03-05 07:10:45 -08:00
Krrish Dholakia
f277b204a3 fix(init.py): expose 'get_model_params' function 2024-03-04 21:22:09 -08:00
Krrish Dholakia
0ac652a771 fix(bedrock.py): add claude 3 support 2024-03-04 17:15:47 -08:00