Ishaan Jaff | ed14b02b07 | 2024-06-05 17:50:27 -07:00
    feat - working audit logs for create, update, delete team

Krrish Dholakia | 7432c6a4d9 | 2024-06-04 10:36:25 -07:00
    fix(utils.py): fix cost calculation for openai-compatible streaming object
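
A minimal sketch of the path that fix touches: rebuilding a full response object from streamed chunks and pricing it. `stream_chunk_builder` and `completion_cost` are existing litellm helpers; the model and prompt are placeholders, and the flow below is illustrative rather than the exact code in utils.py.

```python
import litellm

# Stream a completion from an OpenAI-compatible endpoint (model/prompt are placeholders).
messages = [{"role": "user", "content": "Say hello in one word."}]
chunks = list(litellm.completion(model="gpt-3.5-turbo", messages=messages, stream=True))

# Rebuild a non-streaming-shaped response from the chunks, then price it.
rebuilt = litellm.stream_chunk_builder(chunks, messages=messages)
print("cost (USD):", litellm.completion_cost(completion_response=rebuilt))
```
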
Krish Dholakia | 127d1457de | 2024-06-03 21:05:03 -07:00
    Merge pull request #3996 from BerriAI/litellm_azure_assistants_api_support
    feat(assistants/main.py): Azure Assistants API support

Krrish Dholakia | a2ba63955a | 2024-06-03 18:47:05 -07:00
    feat(assistants/main.py): Closes https://github.com/BerriAI/litellm/issues/3993

Krrish Dholakia | a2cf59a308 | 2024-06-03 12:58:30 -07:00
    fix(langfuse.py): log litellm response cost as part of langfuse metadata

Ishaan Jaff | dd25d83087 | 2024-06-01 17:48:38 -07:00
    Merge pull request #3962 from BerriAI/litellm_return_num_rets_max_exceptions
    [Feat] return `num_retries` and `max_retries` in exceptions

Ishaan Jaff | 2341d99bdc | 2024-06-01 16:53:00 -07:00
    feat - add num retries and max retries in exception
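
The two retry commits above surface retry settings on raised exceptions. A hedged sketch of how a caller might read them; the attribute names follow the commit wording, and the access pattern should be checked against the installed litellm version.

```python
import litellm

try:
    # num_retries asks litellm to retry transient failures before raising.
    litellm.completion(
        model="gpt-3.5-turbo",  # placeholder model
        messages=[{"role": "user", "content": "hi"}],
        num_retries=2,
    )
except litellm.exceptions.APIError as e:
    # Per the commits above, exceptions should now carry the retry settings in effect;
    # attribute access here is an assumption, hence getattr with a default.
    print("retries attempted:", getattr(e, "num_retries", None))
    print("max retries:", getattr(e, "max_retries", None))
```
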
Krrish Dholakia | 69244aabf3 | 2024-06-01 14:48:53 -07:00
    fix(http_handler.py): allow setting ca bundle path
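
A hedged sketch of pointing litellm's HTTP client at a custom CA bundle. The assumption is that `litellm.ssl_verify` is forwarded to the underlying httpx client's `verify` option; the exact knob used by http_handler.py, and the bundle path, are illustrative.

```python
import litellm

# Assumption: ssl_verify is passed through to httpx's `verify`, so a CA bundle path
# (or False to disable verification) should work behind a TLS-intercepting proxy.
litellm.ssl_verify = "/etc/ssl/certs/internal-ca-bundle.pem"  # hypothetical path

litellm.completion(
    model="gpt-3.5-turbo",  # placeholder model
    messages=[{"role": "user", "content": "ping"}],
)
```
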
Krish Dholakia | 1529f665cc | 2024-05-31 23:29:09 -07:00
    Merge pull request #3954 from BerriAI/litellm_simple_request_prioritization
    feat(scheduler.py): add request prioritization scheduler

Krish Dholakia | f2ca86b0e7 | 2024-05-31 21:42:37 -07:00
    Merge pull request #3944 from BerriAI/litellm_fix_parallel_streaming
    fix: fix streaming with httpx client

Krrish Dholakia | 27c2753aaf | 2024-05-31 19:35:47 -07:00
    docs(scheduler.md): add request prioritization to docs

Krrish Dholakia | 3896e3e88f | 2024-05-31 10:55:18 -07:00
    fix: fix streaming with httpx client
    prevent overwriting streams in parallel streaming calls

Ishaan Jaff | d4d9b098b1 | 2024-05-30 21:22:32 -07:00
    fix - vertex ai cache clients

Krrish Dholakia | 1e89a1f56e | 2024-05-30 14:28:28 -07:00
    feat(main.py): support openai tts endpoint
    Closes https://github.com/BerriAI/litellm/issues/3094
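
A hedged sketch of the text-to-speech entry point added in 1e89a1f56e. `litellm.speech()` mirrors the OpenAI audio API; the model name, voice, and the `stream_to_file` helper on the response are assumptions to verify against the installed version.

```python
from pathlib import Path
import litellm

# Generate speech via the OpenAI-compatible TTS route (model and voice are placeholders).
response = litellm.speech(
    model="openai/tts-1",
    voice="alloy",
    input="Hello from litellm's text-to-speech support.",
)

# Assumed to mirror the OpenAI SDK's binary response helper.
response.stream_to_file(Path("speech.mp3"))
```
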
Krrish Dholakia | 741bfb9cef | 2024-05-29 21:41:20 -07:00
    fix(proxy_cli.py): enable json logging via litellm_settings param on config
    allows user to enable json logs without needing to figure out env variables
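
The commit above wires JSON logging through the proxy config's `litellm_settings` block. A hedged SDK-side sketch of the same effect; the `json_logs` flag name is an assumption about the equivalent module-level setting.

```python
import litellm

# Emit structured JSON log lines instead of plain-text logs
# (assumed SDK equivalent of enabling json logs under litellm_settings in the proxy config).
litellm.json_logs = True

litellm.completion(
    model="gpt-3.5-turbo",  # placeholder model
    messages=[{"role": "user", "content": "hello"}],
)
```
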
Ishaan Jaff | 4dc7bfebd4 | 2024-05-28 15:35:11 -07:00
    feat - import batches in __init__

Krrish Dholakia | 23542fc1d2 | 2024-05-27 09:16:56 -07:00
    fix(utils.py): support deepinfra optional params
    Fixes https://github.com/BerriAI/litellm/issues/3855

Krrish Dholakia | c50074a0b7 | 2024-05-23 20:28:54 -07:00
    feat(ui/model_dashboard.tsx): add databricks models via admin ui

Krish Dholakia | edb349a9ab | 2024-05-23 19:23:19 -07:00
    Merge pull request #3808 from BerriAI/litellm_databricks_api
    feat(databricks.py): adds databricks support - completion, async, streaming

Krrish Dholakia | e3c5e004c5 | 2024-05-23 18:22:03 -07:00
    feat(databricks.py): add embedding model support

Krrish Dholakia | 143a44823a | 2024-05-23 16:29:46 -07:00
    feat(databricks.py): adds databricks support - completion, async, streaming
    Closes https://github.com/BerriAI/litellm/issues/2160
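
A hedged sketch of the Databricks provider added across the commits above, covering chat and embeddings. The model identifiers and environment variable names are assumptions to check against litellm's Databricks docs.

```python
import os
import litellm

# Assumed configuration: a Databricks serving endpoint plus a PAT token (placeholders).
os.environ["DATABRICKS_API_BASE"] = "https://adb-1234567890.12.azuredatabricks.net/serving-endpoints"
os.environ["DATABRICKS_API_KEY"] = "dapi-placeholder"

# Chat completion (async and streaming are also supported per the commit message).
chat = litellm.completion(
    model="databricks/databricks-dbrx-instruct",  # assumed model id
    messages=[{"role": "user", "content": "Summarize DBRX in one sentence."}],
)

# Embeddings, added in e3c5e004c5.
emb = litellm.embedding(
    model="databricks/databricks-bge-large-en",  # assumed embedding endpoint
    input=["litellm databricks embedding test"],
)
```
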
Ishaan Jaff | 759852b1b8 | 2024-05-23 13:08:06 -07:00
    feat - add open ai moderations check

Ishaan Jaff | 76a1444621 | 2024-05-20 10:45:37 -07:00
    add ImageObject

Ishaan Jaff | 597734bfae | 2024-05-20 10:09:41 -07:00
    feat - add litellm.ImageResponse
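
The two image-type commits above back litellm's image generation responses. A hedged usage sketch; the model name is a placeholder and the fields on `ImageResponse` are assumed to mirror the OpenAI images schema.

```python
import litellm

# Image generation returns a litellm.ImageResponse containing ImageObject entries
# (assumed to follow OpenAI's images shape: response.data[i].url / .b64_json).
response = litellm.image_generation(
    model="dall-e-3",  # placeholder model
    prompt="a minimal line drawing of a llama reading a changelog",
)

assert isinstance(response, litellm.ImageResponse)
print(response.data[0].url)
```
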
Krrish Dholakia | 840027f001 | 2024-05-16 13:24:04 -07:00
    docs(lago.md): add lago usage-based billing quick-start to docs

Krrish Dholakia | d167a9ea99 | 2024-05-16 10:54:18 -07:00
    feat(lago.py): adding support for usage-based billing with lago
    Closes https://github.com/BerriAI/litellm/issues/3639
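
A hedged sketch of wiring the Lago integration as a success callback. The callback string follows lago.py; the environment variable names and the use of `user` as the billing subject are assumptions, so check lago.md for the exact quick-start.

```python
import os
import litellm

# Assumed Lago connection settings (names are illustrative).
os.environ["LAGO_API_BASE"] = "https://api.getlago.com"
os.environ["LAGO_API_KEY"] = "lago-api-key-placeholder"
os.environ["LAGO_API_EVENT_CODE"] = "litellm_usage"

# Send successful request usage to Lago for usage-based billing.
litellm.success_callback = ["lago"]

litellm.completion(
    model="gpt-3.5-turbo",  # placeholder model
    messages=[{"role": "user", "content": "bill me for this one"}],
    user="customer_123",  # assumed to map to the Lago customer / external id
)
```
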
Krrish Dholakia | 943432c758 | 2024-05-14 19:13:22 -07:00
    docs(prod.md): add 'disable load_dotenv' tutorial to docs

Krrish Dholakia | c2fa620088 | 2024-05-14 19:09:36 -07:00
    fix: disable 'load_dotenv' for prod environments

Krrish Dholakia | b054f39bab | 2024-05-14 11:15:53 -07:00
    fix(init.py): set 'default_fallbacks' as a litellm_setting

Krrish Dholakia | 96336cdd49 | 2024-05-13 13:29:58 -07:00
    fix(openai.py): create MistralConfig with response_format mapping for mistral api
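
A hedged sketch of the `response_format` mapping described in 96336cdd49: the OpenAI-style parameter is passed through MistralConfig to Mistral's JSON mode. The model name is a placeholder.

```python
import json
import litellm

# response_format is mapped through MistralConfig to the Mistral API (per the commit above).
response = litellm.completion(
    model="mistral/mistral-large-latest",  # placeholder Mistral model
    messages=[{"role": "user", "content": "Return a JSON object with keys 'city' and 'country' for Paris."}],
    response_format={"type": "json_object"},
)

print(json.loads(response.choices[0].message.content))
```
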
Krrish Dholakia | 5f5fdb439b | 2024-05-13 08:48:16 -07:00
    fix(proxy_server.py): return 'allowed-model-region' in headers

Krish Dholakia | 784ae85ba0 | 2024-05-11 21:24:42 -07:00
    Merge branch 'main' into litellm_bedrock_command_r_support

Krrish Dholakia | 5185580e3d | 2024-05-11 15:04:38 -07:00
    feat(bedrock_httpx.py): working cohere command r async calls

Krish Dholakia | 7f64c61275 | 2024-05-11 11:36:22 -07:00
    Merge pull request #3582 from BerriAI/litellm_explicit_region_name_setting
    feat(router.py): allow setting model_region in litellm_params

Krrish Dholakia | 2ed155b4d4 | 2024-05-11 10:18:08 -07:00
    feat(router.py): allow setting model_region in litellm_params
    Closes https://github.com/BerriAI/litellm/issues/3580
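
A hedged Router sketch for the explicit `model_region` setting introduced above. The deployment details and the "eu" value are placeholders; whether requests are filtered by region or only tagged with it is left to the router docs.

```python
from litellm import Router

# Tag a deployment with an explicit region via litellm_params (per the commits above).
router = Router(
    model_list=[
        {
            "model_name": "gpt-4",
            "litellm_params": {
                "model": "azure/gpt-4",                                  # placeholder deployment
                "api_base": "https://my-eu-endpoint.openai.azure.com",   # placeholder
                "api_key": "azure-key-placeholder",
                "model_region": "eu",                                    # new explicit region setting
            },
        }
    ]
)

response = router.completion(
    model="gpt-4",
    messages=[{"role": "user", "content": "Which region served this?"}],
)
```
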
Krish Dholakia | 8ab9c861c9 | 2024-05-11 09:31:46 -07:00
    Merge pull request #3369 from mogith-pn/main
    Clarifai-LiteLLM: Added clarifai as LLM Provider.

Ishaan Jaff | 7451b8f664 | 2024-05-10 18:46:25 -07:00
    add triton embedding to _init

Krish Dholakia | ddf09a3193 | 2024-05-09 22:21:16 -07:00
    Merge pull request #3552 from BerriAI/litellm_predibase_support
    feat(predibase.py): add support for predibase provider

CyanideByte | 1e794714b4 | 2024-05-09 17:42:19 -07:00
    Globally filtering pydantic conflict warnings

Krrish Dholakia | f660d21743 | 2024-05-09 16:39:43 -07:00
    feat(predibase.py): add support for predibase provider
    Closes https://github.com/BerriAI/litellm/issues/1253
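
A hedged sketch for the Predibase provider added above. The `predibase/` prefix follows litellm's provider routing convention; the model id, tenant setting, and environment variable names are assumptions to verify against the provider docs.

```python
import os
import litellm

# Assumed Predibase credentials (variable names are illustrative).
os.environ["PREDIBASE_API_KEY"] = "pb-key-placeholder"
os.environ["PREDIBASE_TENANT_ID"] = "tenant-placeholder"

response = litellm.completion(
    model="predibase/llama-3-8b-instruct",  # assumed serverless model id
    messages=[{"role": "user", "content": "What is Predibase?"}],
)
print(response.choices[0].message.content)
```
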
Paul Gauthier | c72e7e85e2 | 2024-05-07 11:44:03 -07:00
    Added support for the deepseek api
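
A hedged sketch of calling the DeepSeek provider added here; the `deepseek/` prefix, model id, and `DEEPSEEK_API_KEY` variable follow litellm's usual provider pattern and should be confirmed against the docs.

```python
import os
import litellm

os.environ["DEEPSEEK_API_KEY"] = "sk-deepseek-placeholder"

response = litellm.completion(
    model="deepseek/deepseek-chat",  # assumed model id
    messages=[{"role": "user", "content": "Explain mixture-of-experts in one sentence."}],
)
print(response.choices[0].message.content)
```
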
Krish Dholakia | fd06009199 | 2024-05-04 22:21:44 -07:00
    Merge pull request #3455 from BerriAI/litellm_assistants_support
    feat(openai.py): add support for openai assistants

Krrish Dholakia | b0845d82cd | 2024-05-04 19:35:37 -07:00
    fix(assistants/main.py): support litellm.create_thread() call
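
A hedged sketch of the Assistants surface referenced above. `create_thread` is named in the commit itself; `get_assistants` and the `custom_llm_provider` keyword are assumptions based on the same module.

```python
import litellm

# List assistants configured on the OpenAI account (assumed helper in assistants/main.py).
assistants = litellm.get_assistants(custom_llm_provider="openai")

# Create a thread seeded with a user message, per the create_thread() commit above.
thread = litellm.create_thread(
    custom_llm_provider="openai",
    messages=[{"role": "user", "content": "Help me triage this changelog."}],
)
print(thread.id)
```
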
Ishaan Jaff | adf09bdd45 | 2024-05-04 12:43:09 -07:00
    fix add get_first_chars_messages in utils

Krish Dholakia | 7e04447159 | 2024-05-02 16:32:41 -07:00
    Merge pull request #3393 from Priva28/main
    Add Llama3 tokenizer and allow custom tokenizers.
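
A hedged sketch of the tokenizer work in PR #3393, using the `encode`/`decode` helpers that the related init-import commit further down exposes. The Llama 3 model id is a placeholder, and the routing of that name to the Llama 3 tokenizer is assumed.

```python
import litellm

messages = [{"role": "user", "content": "Count my tokens, please."}]
model = "meta-llama/Meta-Llama-3-8B-Instruct"  # placeholder model id

# token_counter is assumed to pick the Llama 3 tokenizer for Llama 3 model names.
n_tokens = litellm.token_counter(model=model, messages=messages)

# encode/decode round-trip with the same tokenizer selection.
tokens = litellm.encode(model=model, text="Count my tokens, please.")
print(n_tokens, litellm.decode(model=model, tokens=tokens))
```
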
Krrish Dholakia | c0487b16af | 2024-05-02 08:14:45 -07:00
    fix(utils.py): add missing providers + models to validate_environment
    Closes https://github.com/BerriAI/litellm/issues/3190
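
A hedged sketch of the helper this commit extends: `validate_environment` reports whether the credentials a given model needs are present. The model name is a placeholder and the returned dict's keys are assumed.

```python
import litellm

# Check whether the environment has the credentials required for a given model.
report = litellm.validate_environment(model="mistral/mistral-large-latest")  # placeholder model

# Assumed return shape: {"keys_in_environment": bool, "missing_keys": [...]}.
if not report.get("keys_in_environment"):
    print("missing:", report.get("missing_keys"))
```
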
Christian Privitelli | 8b4bc4c832 | 2024-05-02 15:49:22 +10:00
    include methods in init import, add test, fix encode/decode param ordering

Krrish Dholakia | e7b3ac8e06 | 2024-05-01 17:23:48 -07:00
    feat(openmeter.py): add support for user billing
    open-meter supports user based billing. Closes https://github.com/BerriAI/litellm/issues/1268
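
A hedged sketch of the OpenMeter user-billing integration. The callback string follows openmeter.py; the registration slot, environment variable names, and use of `user` as the metering subject are assumptions.

```python
import os
import litellm

# Assumed OpenMeter connection settings (illustrative names).
os.environ["OPENMETER_API_ENDPOINT"] = "https://openmeter.cloud"
os.environ["OPENMETER_API_KEY"] = "om-key-placeholder"

# Report successful request usage to OpenMeter, keyed by the calling user.
litellm.success_callback = ["openmeter"]

litellm.completion(
    model="gpt-3.5-turbo",  # placeholder model
    messages=[{"role": "user", "content": "meter this request"}],
    user="customer_42",  # assumed to become the billing subject in OpenMeter
)
```
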
mogith-pn | d2a438a451 | 2024-04-30 22:48:52 +05:30
    Merge branch 'main' into main

mogith-pn | f36e0d13a0 | 2024-04-30 22:38:33 +05:30
    Clarifai-LiteLLM integration (#1)
    * intg v1 clarifai-litellm
    * Added more community models and testcase
    * Clarifai-updated markdown docs