Author | Commit | Message | Date
Krrish Dholakia | 4a3b084961 | feat(bedrock_httpx.py): moves to using httpx client for bedrock cohere calls | 2024-05-11 13:43:08 -07:00
Krrish Dholakia | cd1a470b9a | refactor(main.py): trigger new build | 2024-05-10 20:17:39 -07:00
Ishaan Jaff | d3550379b0 | feat - triton embeddings | 2024-05-10 18:57:06 -07:00
Krrish Dholakia | cdec7a414f | test(test_router_fallbacks.py): fix test | 2024-05-10 09:58:40 -07:00
Krrish Dholakia | 9a31f3d3d9 | fix(main.py): support env var 'VERTEX_PROJECT' and 'VERTEX_LOCATION' | 2024-05-10 07:57:56 -07:00
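The 'VERTEX_PROJECT' / 'VERTEX_LOCATION' entry above suggests the Vertex AI project and location can now come from environment variables. A minimal sketch of that usage, assuming the standard `vertex_ai/` model prefix; the project, location, and model id are placeholders:

```python
import os
import litellm

# Env vars named in the commit above; values are placeholders.
os.environ["VERTEX_PROJECT"] = "my-gcp-project"
os.environ["VERTEX_LOCATION"] = "us-central1"

# Illustrative Vertex AI model routed through litellm.
response = litellm.completion(
    model="vertex_ai/gemini-pro",
    messages=[{"role": "user", "content": "Hello from Vertex AI"}],
)
print(response.choices[0].message.content)
```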
Krish Dholakia | a671046b45 | Merge pull request #3552 from BerriAI/litellm_predibase_support; feat(predibase.py): add support for predibase provider | 2024-05-09 22:21:16 -07:00
Krrish Dholakia | 425efc60f4 | fix(main.py): fix linting error | 2024-05-09 18:12:28 -07:00
Ishaan Jaff | 5eb12e30cc | Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion; [Feat] support `stream_options` on `litellm.text_completion` | 2024-05-09 18:05:58 -07:00
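The `stream_options` PR above (and the related stream_options entries further down) presumably mirrors OpenAI's parameter of the same name, where `include_usage` attaches token usage to the final streamed chunk. A hedged sketch with an illustrative model:

```python
import litellm

# stream_options mirrors the OpenAI parameter; include_usage asks for token
# usage to be attached to the final chunk of the stream.
response = litellm.text_completion(
    model="gpt-3.5-turbo-instruct",
    prompt="Say hello",
    stream=True,
    stream_options={"include_usage": True},
)

for chunk in response:
    print(chunk)  # the last chunk should carry the usage block
```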
Krrish Dholakia | d7189c21fd | feat(predibase.py): support async_completion + streaming (sync + async); finishes up pr | 2024-05-09 17:41:27 -07:00
Krrish Dholakia | 186c0ec77b | feat(predibase.py): add support for predibase provider; Closes https://github.com/BerriAI/litellm/issues/1253 | 2024-05-09 16:39:43 -07:00
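For the new Predibase provider referenced in the two entries above, usage presumably follows litellm's usual provider pattern. The env var name, model string, and `tenant_id` parameter below are assumptions, not confirmed by the log:

```python
import os
import litellm

os.environ["PREDIBASE_API_KEY"] = "pb_..."  # assumed env var name, placeholder value

response = litellm.completion(
    model="predibase/llama-3-8b-instruct",        # illustrative model string
    messages=[{"role": "user", "content": "Hi"}],
    tenant_id="my-tenant-id",                     # assumed Predibase-specific param
)
print(response.choices[0].message.content)
```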
Krrish Dholakia | 5c6a382d3b | refactor(main.py): trigger new build | 2024-05-09 15:41:33 -07:00
Ishaan Jaff | 4d5b4a5293 | add stream_options to text_completion | 2024-05-09 08:35:35 -07:00
Ishaan Jaff | 0b1885ca99 | Merge pull request #3537 from BerriAI/litellm_support_stream_options_param; [Feat] support `stream_options` param for OpenAI | 2024-05-09 08:34:08 -07:00
Ishaan Jaff | dfd6361310 | fix completion vs acompletion params | 2024-05-09 07:59:37 -07:00
Krish Dholakia | 8015bc1c47 | Revert "Add support for async streaming to watsonx provider" | 2024-05-09 07:44:15 -07:00
Krish Dholakia | 8ad979cdfe | Merge branch 'main' into litellm_region_based_routing | 2024-05-08 22:19:51 -07:00
Krish Dholakia | 3f13251241 | Merge pull request #3479 from simonsanvil/feature/watsonx-integration; Add support for async streaming to watsonx provider | 2024-05-08 22:19:05 -07:00
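The watsonx async-streaming PR above presumably makes `litellm.acompletion(..., stream=True)` work for watsonx models the way it does for other providers. A sketch; the model id and credential env vars are assumptions:

```python
import asyncio
import litellm

async def main():
    # Credentials are expected via env vars (e.g. WATSONX_URL, WATSONX_APIKEY,
    # WATSONX_PROJECT_ID); names assumed from the watsonx provider pattern.
    stream = await litellm.acompletion(
        model="watsonx/ibm/granite-13b-chat-v2",  # illustrative model id
        messages=[{"role": "user", "content": "Hello"}],
        stream=True,
    )
    async for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="")

asyncio.run(main())
```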
Krrish Dholakia | 3d18897d69 | feat(router.py): enable filtering model group by 'allowed_model_region' | 2024-05-08 22:10:17 -07:00
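A rough sketch of how the 'allowed_model_region' filter above might be exercised through the Router. Everything beyond the param name itself (the `region_name` deployment field, `enable_pre_call_checks`, and passing the filter as a request kwarg) is an assumption:

```python
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {
                "model": "azure/gpt-35-turbo",
                "api_base": "https://my-eu-endpoint.openai.azure.com",  # placeholder
                "api_key": "azure-key",                                 # placeholder
                "region_name": "eu",  # assumed field used for region tagging
            },
        },
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo", "region_name": "us"},
        },
    ],
    enable_pre_call_checks=True,  # assumed prerequisite for region filtering
)

# Hypothetical request-level filter: only deployments tagged "eu" stay eligible.
response = router.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hi"}],
    allowed_model_region="eu",
)
```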
Ishaan Jaff | edb10198ef | feat - add stream_options support litellm | 2024-05-08 21:25:40 -07:00
Krrish Dholakia | 6ce13ab364 | refactor(main.py): trigger new build | 2024-05-08 09:24:01 -07:00
Krrish Dholakia | a854824c02 | fix(main.py): fix together ai text completion call | 2024-05-08 09:10:45 -07:00
Ishaan Jaff | 2725a55e7a | Merge pull request #3470 from mbektas/fix-ollama-embeddings; support sync ollama embeddings | 2024-05-07 19:21:37 -07:00
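The sync Ollama embeddings fix above presumably flows through litellm's existing `embedding()` call. A sketch assuming a local Ollama server and an illustrative embedding model:

```python
import litellm

# Requires a local Ollama server (default http://localhost:11434) with the
# embedding model already pulled; the model name is just an example.
response = litellm.embedding(
    model="ollama/nomic-embed-text",
    input=["hello world", "goodbye world"],
)
print(len(response.data), "embeddings returned")
```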
Paul Gauthier | 90eb0ea022 | Added support for the deepseek api | 2024-05-07 11:44:03 -07:00
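For the DeepSeek support above, usage likely follows the standard provider pattern; the env var name and `deepseek/` model prefix are assumptions based on how litellm names other providers:

```python
import os
import litellm

os.environ["DEEPSEEK_API_KEY"] = "sk-..."  # assumed env var name, placeholder value

response = litellm.completion(
    model="deepseek/deepseek-chat",  # assumed provider prefix and model id
    messages=[{"role": "user", "content": "Write a haiku about routers."}],
)
print(response.choices[0].message.content)
```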
Simon Sanchez Viloria | 6181d1eaad | Merge branch 'main' into feature/watsonx-integration | 2024-05-06 17:27:14 +02:00
Simon Sanchez Viloria | 83a274b54b | (feat) support for async stream to watsonx provider | 2024-05-06 17:08:40 +02:00
Lucca Zenóbio | b22517845e | Merge branch 'main' into main | 2024-05-06 09:40:23 -03:00
Mehmet Bektas | 3acad270e5 | support sync ollama embeddings | 2024-05-05 19:44:25 -07:00
Krrish Dholakia | 918367cc7b | test: skip hanging test | 2024-05-05 00:27:38 -07:00
Ishaan Jaff | 009f7c9bfc | support dynamic retry policies | 2024-05-04 18:10:15 -07:00
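"support dynamic retry policies" above most plausibly means per-exception-type retry counts at the Router level. A sketch under that reading; the `RetryPolicy` import path, its field names, and the `retry_policy` kwarg are assumptions:

```python
from litellm import Router
from litellm.router import RetryPolicy  # assumed import location

# Assumed shape: retry counts keyed by exception type, e.g. retry rate limits
# a few times but never retry bad credentials.
retry_policy = RetryPolicy(
    AuthenticationErrorRetries=0,
    TimeoutErrorRetries=2,
    RateLimitErrorRetries=4,
)

router = Router(
    model_list=[
        {"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}},
    ],
    retry_policy=retry_policy,  # assumed kwarg name
)
```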
Ishaan Jaff | df8e33739d | Revert "Add return_exceptions to litellm.batch_completion" | 2024-05-04 13:01:17 -07:00
Ishaan Jaff | d968dedd77 | Merge pull request #1530 from TanaroSch/main; change max_tokens type to int | 2024-05-04 12:47:15 -07:00
Ishaan Jaff | 7094ac9557 | Merge pull request #3397 from ffreemt/main; Add return_exceptions to litellm.batch_completion | 2024-05-04 12:41:21 -07:00
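The `return_exceptions` PR merged above (note the revert two rows up, which happened later the same day) proposes that `litellm.batch_completion` can return exceptions in place rather than aborting the whole batch. A hedged sketch of that behavior:

```python
import litellm

# With return_exceptions=True (per the PR), a failing request comes back as an
# exception object in its slot instead of raising and aborting the batch.
results = litellm.batch_completion(
    model="gpt-3.5-turbo",
    messages=[
        [{"role": "user", "content": "hi"}],
        [{"role": "user", "content": 1}],  # deliberately malformed; should surface an error
    ],
    return_exceptions=True,
)

for i, result in enumerate(results):
    if isinstance(result, Exception):
        print(f"request {i} failed: {result}")
    else:
        print(f"request {i}: {result.choices[0].message.content}")
```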
Krrish Dholakia | b7ca9a53c9 | refactor(main.py): trigger new build | 2024-05-03 21:53:51 -07:00
Krrish Dholakia | 8249c986bf | fix(main.py): support new 'supports_system_message=False' param; Fixes https://github.com/BerriAI/litellm/issues/3325 | 2024-05-03 21:31:45 -07:00
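The 'supports_system_message=False' entry above presumably tells litellm the target model cannot accept a system role, so the system prompt has to be folded into the user turn. A sketch under that assumption, with an illustrative model:

```python
import litellm

# Assumption: supports_system_message=False makes litellm rewrite or merge the
# system message for models that reject the "system" role.
response = litellm.completion(
    model="ollama/llama2",  # illustrative model; requires a local Ollama server
    messages=[
        {"role": "system", "content": "You are a terse assistant."},
        {"role": "user", "content": "What is litellm?"},
    ],
    supports_system_message=False,
)
print(response.choices[0].message.content)
```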
Krrish Dholakia | a732d8772a | fix(bedrock.py): convert httpx.timeout to boto3 valid timeout; Closes https://github.com/BerriAI/litellm/issues/3398 | 2024-05-03 16:24:21 -07:00
mikey | de7fe98556 | Merge branch 'BerriAI:main' into main | 2024-05-03 11:30:03 +08:00
ffreemt | a7ec1772b1 | Add litellm\tests\test_batch_completion_return_exceptions.py | 2024-05-03 11:28:38 +08:00
Krish Dholakia | 2200900ca2 | Merge pull request #3393 from Priva28/main; Add Llama3 tokenizer and allow custom tokenizers. | 2024-05-02 16:32:41 -07:00
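The tokenizer PR above suggests token counting can now resolve a Llama3 tokenizer from the model name, alongside the existing `encode`/`decode` helpers (whose param ordering is fixed in the Privitelli commit below). Model ids here are illustrative:

```python
import litellm

messages = [{"role": "user", "content": "How many tokens is this?"}]

# token_counter picks a tokenizer from the model name; with the PR above, a
# Llama3 model id should map to the Llama3 tokenizer (may download from HF).
print(litellm.token_counter(model="meta-llama/Meta-Llama-3-8B-Instruct", messages=messages))

# encode/decode round-trip with an OpenAI tokenizer.
tokens = litellm.encode(model="gpt-3.5-turbo", text="hello world")
print(litellm.decode(model="gpt-3.5-turbo", tokens=tokens))
```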
Lucca Zenóbio | 78303b79ee | Merge branch 'main' into main | 2024-05-02 09:46:34 -03:00
ffreemt | 64d229caaa | Add return_exceptions to litellm.batch_completion for optionally returning exceptions and partial results instead of throwing exceptions | 2024-05-02 19:30:01 +08:00
Christian Privitelli | 2d43153efa | include methods in init import, add test, fix encode/decode param ordering | 2024-05-02 15:49:22 +10:00
Krrish Dholakia | 0251543e7a | refactor(main.py): trigger new build | 2024-05-01 21:59:33 -07:00
Krrish Dholakia | 4761345311 | fix(main.py): fix mock completion response | 2024-04-30 19:30:18 -07:00
Krrish Dholakia | 1baad80c7d | fix(router.py): cooldown deployments, for 401 errors | 2024-04-30 17:54:00 -07:00
Krrish Dholakia | c9d7437d16 | fix(watsonx.py): use common litellm params for api key, api base, etc. | 2024-04-27 10:15:27 -07:00
Krish Dholakia | 2d976cfabc | Merge pull request #3270 from simonsanvil/feature/watsonx-integration; (feat) add IBM watsonx.ai as an llm provider | 2024-04-27 05:48:34 -07:00
Krrish Dholakia | caaa05703d | refactor(main.py): trigger new build | 2024-04-25 19:51:38 -07:00
Lucca Zenobio | 6127d9f488 | merge | 2024-04-25 15:00:07 -03:00
Krrish Dholakia | 0a9cdf6f9b | refactor(main.py): trigger new build | 2024-04-24 22:04:24 -07:00
Krrish Dholakia | 48c2c3d78a | fix(utils.py): fix streaming to not return usage dict; Fixes https://github.com/BerriAI/litellm/issues/3237 | 2024-04-24 08:06:07 -07:00