Author | Commit | Message | Date
Ishaan Jaff | 1fe035c6dd | feat - add open ai moderations check | 2024-05-23 13:08:06 -07:00
Ishaan Jaff | 2519879e67 | add ImageObject | 2024-05-20 10:45:37 -07:00
Ishaan Jaff | a4f906b464 | feat - add litellm.ImageResponse | 2024-05-20 10:09:41 -07:00
Krrish Dholakia | 3acb31fa49 | docs(lago.md): add lago usage-based billing quick-start to docs | 2024-05-16 13:24:04 -07:00
Krrish Dholakia | e273e66618 | feat(lago.py): adding support for usage-based billing with lago (Closes https://github.com/BerriAI/litellm/issues/3639) | 2024-05-16 10:54:18 -07:00
Krrish Dholakia | 9eee2f3889 | docs(prod.md): add 'disable load_dotenv' tutorial to docs | 2024-05-14 19:13:22 -07:00
Krrish Dholakia | 1ab4974773 | fix: disable 'load_dotenv' for prod environments | 2024-05-14 19:09:36 -07:00
Krrish Dholakia | 7557b3e2ff | fix(init.py): set 'default_fallbacks' as a litellm_setting | 2024-05-14 11:15:53 -07:00
Krrish Dholakia | 20456968e9 | fix(openai.py): creat MistralConfig with response_format mapping for mistral api | 2024-05-13 13:29:58 -07:00
Krrish Dholakia | c3293474dd | fix(proxy_server.py): return 'allowed-model-region' in headers | 2024-05-13 08:48:16 -07:00
Krish Dholakia | 1d651c6049 | Merge branch 'main' into litellm_bedrock_command_r_support | 2024-05-11 21:24:42 -07:00
Krrish Dholakia | 59c8c0adff | feat(bedrock_httpx.py): working cohere command r async calls | 2024-05-11 15:04:38 -07:00
Krish Dholakia | 86d0c0ae4e | Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting (feat(router.py): allow setting model_region in litellm_params) | 2024-05-11 11:36:22 -07:00
Krrish Dholakia | ebc927f1c8 | feat(router.py): allow setting model_region in litellm_params (Closes https://github.com/BerriAI/litellm/issues/3580) | 2024-05-11 10:18:08 -07:00
Krish Dholakia | 8f6ae9a059 | Merge pull request #3369 from mogith-pn/main (Clarifai-LiteLLM: Added clarifai as LLM Provider.) | 2024-05-11 09:31:46 -07:00
Ishaan Jaff | 18ed87edc4 | add triton embedding to _init | 2024-05-10 18:46:25 -07:00
Krish Dholakia | a671046b45 | Merge pull request #3552 from BerriAI/litellm_predibase_support (feat(predibase.py): add support for predibase provider) | 2024-05-09 22:21:16 -07:00
CyanideByte | 4a7be9163b | Globally filtering pydantic conflict warnings | 2024-05-09 17:42:19 -07:00
Krrish Dholakia | 186c0ec77b | feat(predibase.py): add support for predibase provider (Closes https://github.com/BerriAI/litellm/issues/1253) | 2024-05-09 16:39:43 -07:00
Paul Gauthier | 90eb0ea022 | Added support for the deepseek api | 2024-05-07 11:44:03 -07:00
Krish Dholakia | 6be20f5fc6 | Merge pull request #3455 from BerriAI/litellm_assistants_support (feat(openai.py): add support for openai assistants) | 2024-05-04 22:21:44 -07:00
Krrish Dholakia | 681a95e37b | fix(assistants/main.py): support litellm.create_thread() call | 2024-05-04 19:35:37 -07:00
Ishaan Jaff | 855c7caf0b | fix add get_first_chars_messages in utils | 2024-05-04 12:43:09 -07:00
Krish Dholakia | 2200900ca2 | Merge pull request #3393 from Priva28/main (Add Llama3 tokenizer and allow custom tokenizers.) | 2024-05-02 16:32:41 -07:00
Krrish Dholakia | 16522a5351 | fix(utils.py): add missing providers + models to validate_environment (Closes https://github.com/BerriAI/litellm/issues/3190) | 2024-05-02 08:14:45 -07:00
Christian Privitelli | 2d43153efa | include methods in init import, add test, fix encode/decode param ordering | 2024-05-02 15:49:22 +10:00
Krrish Dholakia | 2a9651b3ca | feat(openmeter.py): add support for user billing (open-meter supports user based billing; Closes https://github.com/BerriAI/litellm/issues/1268) | 2024-05-01 17:23:48 -07:00
mogith-pn | d770df2259 | Merge branch 'main' into main | 2024-04-30 22:48:52 +05:30
mogith-pn | 318b4813f2 | Clarifai-LiteLLM integration (#1): intg v1 clarifai-litellm; Added more community models and testcase; Clarifai-updated markdown docs | 2024-04-30 22:38:33 +05:30
Krrish Dholakia | b46db8b891 | feat(utils.py): json logs for raw request sent by litellm (make it easier to view verbose logs in datadog) | 2024-04-29 19:21:19 -07:00
Krish Dholakia | 1841b74f49 | Merge branch 'main' into litellm_common_auth_params | 2024-04-28 08:38:06 -07:00
Ishaan Jaff | 6762d07c7f | Merge pull request #3330 from BerriAI/litellm_rdct_msgs ([Feat] Redact Logging Messages/Response content on Logging Providers with `litellm.turn_off_message_logging=True`) | 2024-04-27 11:25:09 -07:00
Krrish Dholakia | 48f19cf839 | feat(utils.py): unify common auth params across azure/vertex_ai/bedrock/watsonx | 2024-04-27 11:06:18 -07:00
Ishaan Jaff | 8b6d686e52 | feat - turn_off_message_logging | 2024-04-27 10:01:09 -07:00
Krish Dholakia | 2d976cfabc | Merge pull request #3270 from simonsanvil/feature/watsonx-integration ((feat) add IBM watsonx.ai as an llm provider) | 2024-04-27 05:48:34 -07:00
Krrish Dholakia | 180718c33f | fix(router.py): support verify_ssl flag (Fixes https://github.com/BerriAI/litellm/issues/3162#issuecomment-2075273807) | 2024-04-26 15:38:01 -07:00
Simon Sanchez Viloria | 74d2ba0a23 | feat - watsonx refractoring, removed dependency, and added support for embedding calls | 2024-04-23 12:01:13 +02:00
Simon Sanchez Viloria | 6edb133733 | Added support for IBM watsonx.ai models | 2024-04-20 20:06:46 +02:00
Ishaan Jaff | 2c76448756 | fix - allow users to opt into langfuse default tags | 2024-04-19 16:01:27 -07:00
Krrish Dholakia | 3c6b6355c7 | fix(ollama_chat.py): accept api key as a param for ollama calls (allows user to call hosted ollama endpoint using bearer token for auth) | 2024-04-19 13:02:13 -07:00
Ishaan Jaff | 462da5a778 | fix - support base 64 image conversion for all gemini model | 2024-04-15 18:18:55 -07:00
Krrish Dholakia | 4e81acf2c6 | feat(prometheus_services.py): monitor health of proxy adjacent services (redis / postgres / etc.) | 2024-04-13 18:15:02 -07:00
Krrish Dholakia | df62f931e7 | fix(proxy_server.py): allow 'upperbound_key_generate' params to be set via 'os.environ/' | 2024-04-09 07:48:29 -07:00
Krrish Dholakia | b6cd200676 | fix(llm_guard.py): enable request-specific llm guard flag | 2024-04-08 21:15:33 -07:00
Krrish Dholakia | 460546956d | fix(utils.py): fix import | 2024-04-06 18:37:38 -07:00
Krrish Dholakia | a410981972 | fix(utils.py): fix circular import | 2024-04-06 18:29:51 -07:00
Krrish Dholakia | 6110d32b1c | feat(proxy/utils.py): return api base for request hanging alerts | 2024-04-06 15:58:53 -07:00
Ishaan Jaff | 2174b240d8 | Merge pull request #2861 from BerriAI/litellm_add_azure_command_r_plust ([FEAT] add azure command-r-plus) | 2024-04-05 15:13:35 -07:00
Ishaan Jaff | 5ce80d82d3 | fix support azure/mistral models | 2024-04-05 09:32:39 -07:00
Krrish Dholakia | f0c4ff6e60 | fix(vertex_ai_anthropic.py): support streaming, async completion, async streaming for vertex ai anthropic | 2024-04-05 09:27:48 -07:00