Commit graph

876 commits

Author SHA1 Message Date
Krrish Dholakia
24ffb1e601 fix(main.py): fix together ai text completion call 2024-05-08 09:10:45 -07:00
Ishaan Jaff
d399947111 Merge pull request #3470 from mbektas/fix-ollama-embeddings
support sync ollama embeddings
2024-05-07 19:21:37 -07:00
Paul Gauthier
c72e7e85e2 Added support for the deepseek api 2024-05-07 11:44:03 -07:00
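A minimal sketch of what a call through the new deepseek support above might look like, assuming litellm's usual `provider/model` naming and a `DEEPSEEK_API_KEY` environment variable (both assumptions, not confirmed by the commit itself):

```python
# Hedged sketch: calling the deepseek api through litellm.
# The model id and env var name are assumptions.
import os
from litellm import completion

os.environ["DEEPSEEK_API_KEY"] = "sk-..."  # assumed env var name

response = completion(
    model="deepseek/deepseek-chat",  # assumed provider/model id
    messages=[{"role": "user", "content": "Hello from deepseek"}],
)
print(response.choices[0].message.content)
```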
Simon Sanchez Viloria
9a95fa9348 Merge branch 'main' into feature/watsonx-integration 2024-05-06 17:27:14 +02:00
Simon Sanchez Viloria
361188b436 (feat) support for async stream to watsonx provider 2024-05-06 17:08:40 +02:00
Lucca Zenóbio
146a49103f Merge branch 'main' into main 2024-05-06 09:40:23 -03:00
Mehmet Bektas
1236638266 support sync ollama embeddings 2024-05-05 19:44:25 -07:00
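A short sketch of the use case this "support sync ollama embeddings" commit describes: a synchronous `litellm.embedding()` call against a local Ollama server. The embedding model name and default endpoint are assumptions.

```python
# Hedged sketch of a synchronous Ollama embedding call via litellm.
from litellm import embedding

response = embedding(
    model="ollama/nomic-embed-text",        # assumed Ollama embedding model
    input=["litellm supports sync ollama embeddings"],
    api_base="http://localhost:11434",      # default local Ollama endpoint (assumed)
)
print(len(response.data[0]["embedding"]))   # length of the first embedding vector
```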
Krrish Dholakia
572f426839 test: skip hanging test 2024-05-05 00:27:38 -07:00
Ishaan Jaff
433333400f support dynamic retry policies 2024-05-04 18:10:15 -07:00
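A rough sketch of what "dynamic retry policies" on the Router might look like. The `RetryPolicy` class, its import path, and its field names are assumptions inferred from the commit message, not verified API:

```python
# Hedged sketch: per-error-type retry counts on the Router.
from litellm import Router
from litellm.router import RetryPolicy  # assumed import path

retry_policy = RetryPolicy(
    TimeoutErrorRetries=3,         # assumed field name
    RateLimitErrorRetries=3,       # assumed field name
    AuthenticationErrorRetries=0,  # assumed field name
)

router = Router(
    model_list=[{
        "model_name": "gpt-3.5-turbo",
        "litellm_params": {"model": "gpt-3.5-turbo"},
    }],
    retry_policy=retry_policy,     # assumed parameter name
)
```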
Ishaan Jaff
65538818a7 Revert "Add return_exceptions to litellm.batch_completion" 2024-05-04 13:01:17 -07:00
Ishaan Jaff
1a96bbea64 Merge pull request #1530 from TanaroSch/main
change max_tokens type to int
2024-05-04 12:47:15 -07:00
Ishaan Jaff
d7f376563e Merge pull request #3397 from ffreemt/main
Add return_exceptions to litellm.batch_completion
2024-05-04 12:41:21 -07:00
Krrish Dholakia
27eec16f32 refactor(main.py): trigger new build 2024-05-03 21:53:51 -07:00
Krrish Dholakia
cfb6df4987 fix(main.py): support new 'supports_system_message=False' param
Fixes https://github.com/BerriAI/litellm/issues/3325
2024-05-03 21:31:45 -07:00
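A minimal sketch of the new `supports_system_message=False` param from the fix above, assuming it is accepted directly as a `completion()` kwarg so the system message is folded into the conversation for models without a system role. The model name is illustrative.

```python
# Hedged sketch of the supports_system_message=False param.
from litellm import completion

response = completion(
    model="ollama/llama2",  # illustrative model without native system-message support
    messages=[
        {"role": "system", "content": "You are a terse assistant."},
        {"role": "user", "content": "Summarize litellm in one line."},
    ],
    supports_system_message=False,  # param added by this fix
)
```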
Krrish Dholakia
7715a9d333 fix(bedrock.py): convert httpx.timeout to boto3 valid timeout
Closes https://github.com/BerriAI/litellm/issues/3398
2024-05-03 16:24:21 -07:00
mogith-pn
c6f9cb9346 Clarifai - Added streaming and async completion support 2024-05-03 14:03:38 +00:00
mikey
0c0c0bb689 Merge branch 'BerriAI:main' into main 2024-05-03 11:30:03 +08:00
ffreemt
ddeaaae7a0 Add litellm\tests\test_batch_completion_return_exceptions.py 2024-05-03 11:28:38 +08:00
Krish Dholakia
7e04447159 Merge pull request #3393 from Priva28/main
Add Llama3 tokenizer and allow custom tokenizers.
2024-05-02 16:32:41 -07:00
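A sketch of token counting with the Llama3 tokenizer support referenced in PR #3393, assuming `litellm.token_counter` selects a Llama3-aware tokenizer from the model name; the model string is illustrative.

```python
# Hedged sketch: counting tokens with the Llama3 tokenizer support.
import litellm

n_tokens = litellm.token_counter(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # illustrative model id
    messages=[{"role": "user", "content": "How many tokens is this?"}],
)
print(n_tokens)
```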
Lucca Zenóbio
bf2a319713 Merge branch 'main' into main 2024-05-02 09:46:34 -03:00
ffreemt
cfec731fda Add return_exceptions to litellm.batch_completion for optionally returning exceptions and partial results instead of throwing exceptions 2024-05-02 19:30:01 +08:00
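A sketch of the `return_exceptions` flag described in the commit above: failed requests come back as exception objects in the result list instead of raising, so the rest of the batch still returns. Model name and prompts are illustrative.

```python
# Hedged sketch of return_exceptions in litellm.batch_completion.
from litellm import batch_completion

responses = batch_completion(
    model="gpt-3.5-turbo",
    messages=[
        [{"role": "user", "content": "first prompt"}],
        [{"role": "user", "content": "second prompt"}],
    ],
    return_exceptions=True,  # flag added by this change
)
for r in responses:
    if isinstance(r, Exception):
        print("failed:", r)
    else:
        print(r.choices[0].message.content)
```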
Christian Privitelli
8b4bc4c832 include methods in init import, add test, fix encode/decode param ordering 2024-05-02 15:49:22 +10:00
Krrish Dholakia
588751ec86 refactor(main.py): trigger new build 2024-05-01 21:59:33 -07:00
Krrish Dholakia
885eb4584a fix(main.py): fix mock completion response 2024-04-30 19:30:18 -07:00
Krrish Dholakia
a12878b0f8 fix(router.py): cooldown deployments for 401 errors 2024-04-30 17:54:00 -07:00
mogith-pn
d2a438a451 Merge branch 'main' into main 2024-04-30 22:48:52 +05:30
mogith-pn
f36e0d13a0 Clarifai-LiteLLM integration (#1)
* initial clarifai-litellm integration (v1)
* Added more community models and a test case
* Clarifai - updated markdown docs
2024-04-30 22:38:33 +05:30
Krrish Dholakia
ec0bd566ef fix(watsonx.py): use common litellm params for api key, api base, etc. 2024-04-27 10:15:27 -07:00
Krish Dholakia
b7beab2e39 Merge pull request #3270 from simonsanvil/feature/watsonx-integration
(feat) add IBM watsonx.ai as an llm provider
2024-04-27 05:48:34 -07:00
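A rough sketch of a call through the watsonx.ai provider added in PR #3270. The `watsonx/` prefix, model id, and environment variable names are assumptions.

```python
# Hedged sketch: completion via the IBM watsonx.ai provider.
import os
from litellm import completion

os.environ["WATSONX_URL"] = "https://us-south.ml.cloud.ibm.com"  # assumed env var
os.environ["WATSONX_APIKEY"] = "..."                             # assumed env var
os.environ["WATSONX_PROJECT_ID"] = "..."                         # assumed env var

response = completion(
    model="watsonx/ibm/granite-13b-chat-v2",  # assumed model id format
    messages=[{"role": "user", "content": "Hello from watsonx"}],
)
```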
Krrish Dholakia
6d141f6dac refactor(main.py): trigger new build 2024-04-25 19:51:38 -07:00
Lucca Zenobio
e73978b0d9 merge 2024-04-25 15:00:07 -03:00
Krrish Dholakia
abd35d6b60 refactor(main.py): trigger new build 2024-04-24 22:04:24 -07:00
Krrish Dholakia
b10f03706d fix(utils.py): fix streaming to not return usage dict
Fixes https://github.com/BerriAI/litellm/issues/3237
2024-04-24 08:06:07 -07:00
Simon S. Viloria
79855b372d Merge branch 'BerriAI:main' into feature/watsonx-integration 2024-04-23 12:18:34 +02:00
Simon Sanchez Viloria
572cbef43b feat - watsonx refactoring, removed dependency, and added support for embedding calls 2024-04-23 12:01:13 +02:00
Krrish Dholakia
3b6d204314 fix(vertex_ai.py): fix streaming logic 2024-04-22 19:15:20 -07:00
Krrish Dholakia
0f8cf067ea refactor(main.py): trigger new build 2024-04-22 10:54:35 -07:00
Simon S. Viloria
0c4cf91c79 Merge branch 'BerriAI:main' into feature/watsonx-integration 2024-04-21 10:35:51 +02:00
Krrish Dholakia
79056690f3 fix(main.py): ignore max_parallel_requests as a litellm param 2024-04-20 12:15:04 -07:00
Simon Sanchez Viloria
9b3a1b3f35 Added support for IBM watsonx.ai models 2024-04-20 20:06:46 +02:00
Krrish Dholakia
9dc0871023 fix(ollama_chat.py): accept api key as a param for ollama calls
allows users to call a hosted ollama endpoint using a bearer token for auth
2024-04-19 13:02:13 -07:00
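A sketch of the hosted-Ollama use case in the commit above: pass an `api_key` so litellm authenticates against a remote Ollama endpoint. The endpoint URL and model name are illustrative, and the bearer-token behavior is as described by the commit body.

```python
# Hedged sketch: authenticated call to a hosted Ollama endpoint.
from litellm import completion

response = completion(
    model="ollama_chat/llama3",                       # illustrative model
    api_base="https://my-hosted-ollama.example.com",  # illustrative endpoint
    api_key="my-bearer-token",                        # sent as a bearer token per the commit
    messages=[{"role": "user", "content": "hi"}],
)
```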
Krrish Dholakia
41c02028a7 refactor(main.py): trigger new build 2024-04-18 22:17:19 -07:00
Ishaan Jaff
1ba216627a fix - pass kwargs to exception_type 2024-04-18 12:58:30 -07:00
Krrish Dholakia
388ecadd5d refactor(main.py): trigger new build 2024-04-18 07:34:09 -07:00
Krrish Dholakia
6d508468ef fix(vertex_ai.py): accept credentials as a json string 2024-04-16 17:34:25 -07:00
Krrish Dholakia
88c8ef6aa0 refactor(main.py): trigger new build 2024-04-15 18:36:51 -07:00
Krrish Dholakia
8c3c45fbb5 fix(vertex_ai_anthropic.py): set vertex_credentials for vertex ai anthropic calls
allows setting vertex credentials as a json string for vertex ai anthropic calls
2024-04-15 14:16:28 -07:00
Krrish Dholakia
3d645f95a5 fix(main.py): accept vertex service account credentials as json string
allows us to dynamically set vertex ai credentials
2024-04-15 13:28:59 -07:00
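A sketch of passing a service-account JSON string dynamically, per the two vertex commits above. It assumes `vertex_credentials` is accepted as a `completion()` kwarg; the model, project, and location values are illustrative.

```python
# Hedged sketch: vertex ai credentials supplied as a JSON string.
import json
from litellm import completion

with open("service_account.json") as f:
    vertex_credentials = json.dumps(json.load(f))  # credentials as a JSON string

response = completion(
    model="vertex_ai/gemini-1.0-pro",              # illustrative model
    messages=[{"role": "user", "content": "hello"}],
    vertex_credentials=vertex_credentials,         # assumed kwarg per the commit
    vertex_project="my-gcp-project",               # illustrative
    vertex_location="us-central1",                 # illustrative
)
```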
Krrish Dholakia
1cd0551a1e fix(anthropic_text.py): add support for async text completion calls 2024-04-15 08:15:00 -07:00
Krrish Dholakia
9004e9d0ab refactor(main.py): trigger new build 2024-04-13 19:35:39 -07:00