Krrish Dholakia | 918367cc7b | test: skip hanging test | 2024-05-05 00:27:38 -07:00
ffreemt | 2713272bba | Add return_exceptions to batch_completion (retry) | 2024-05-05 13:11:21 +08:00
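The return_exceptions behavior added in 2713272bba mirrors `asyncio.gather(return_exceptions=True)`: a failed sub-request contributes its exception object to the result list, in order, instead of aborting the whole batch. A minimal sketch of that pattern (the `batch_complete` helper here is hypothetical, not litellm's actual implementation):

```python
def batch_complete(calls, return_exceptions=False):
    """Run each zero-argument call in order. With return_exceptions=True,
    a failing call contributes its exception object to the results instead
    of raising, so partial results from the other calls survive."""
    results = []
    for call in calls:
        try:
            results.append(call())
        except Exception as exc:
            if not return_exceptions:
                raise
            results.append(exc)  # result i still corresponds to call i
    return results
```

Callers can then check each slot with `isinstance(result, Exception)` to separate successes from failures.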
Ishaan Jaff | 009f7c9bfc | support dynamic retry policies | 2024-05-04 18:10:15 -07:00
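A "dynamic retry policy" (009f7c9bfc) picks the retry count based on which exception type the call raised, rather than using one global number. A self-contained sketch of the idea, with hypothetical class and parameter names:

```python
class RetryPolicy:
    """Map exception type names to retry counts (hypothetical sketch of the
    idea behind dynamic retry policies, not litellm's actual class)."""
    def __init__(self, default_retries=0, **per_exception):
        self.default_retries = default_retries
        self.per_exception = per_exception  # e.g. TimeoutError=3

    def retries_for(self, exc):
        return self.per_exception.get(type(exc).__name__, self.default_retries)

def call_with_policy(fn, policy):
    """Retry fn according to the policy for whichever exception it raises."""
    attempt = 0
    while True:
        try:
            return fn()
        except Exception as exc:
            if attempt >= policy.retries_for(exc):
                raise
            attempt += 1
```

The benefit is that transient errors (timeouts, rate limits) can retry aggressively while permanent ones (auth failures) fail fast.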
Ishaan Jaff | df8e33739d | Revert "Add return_exceptions to litellm.batch_completion" | 2024-05-04 13:01:17 -07:00
Ishaan Jaff | d968dedd77 | Merge pull request #1530 from TanaroSch/main: change max_tokens type to int | 2024-05-04 12:47:15 -07:00
Ishaan Jaff | 7094ac9557 | Merge pull request #3397 from ffreemt/main: Add return_exceptions to litellm.batch_completion | 2024-05-04 12:41:21 -07:00
Krrish Dholakia | b7ca9a53c9 | refactor(main.py): trigger new build | 2024-05-03 21:53:51 -07:00
Krrish Dholakia | 8249c986bf | fix(main.py): support new 'supports_system_message=False' param (Fixes https://github.com/BerriAI/litellm/issues/3325) | 2024-05-03 21:31:45 -07:00
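For models flagged with `supports_system_message=False` (8249c986bf), one plausible handling is to re-tag system messages as user messages before the request goes out, so prompts still reach providers that reject the system role. A hypothetical helper sketching that mapping:

```python
def map_system_message(messages, supports_system_message=True):
    """If the target model rejects the system role, downgrade system
    messages to user messages. Illustrative sketch of the param's effect,
    not litellm's actual code."""
    if supports_system_message:
        return messages
    return [
        {**m, "role": "user"} if m.get("role") == "system" else m
        for m in messages
    ]
```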
Krrish Dholakia | a732d8772a | fix(bedrock.py): convert httpx.timeout to boto3 valid timeout (Closes https://github.com/BerriAI/litellm/issues/3398) | 2024-05-03 16:24:21 -07:00
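The Bedrock fix (a732d8772a) has to translate an httpx-style timeout into the float `connect_timeout`/`read_timeout` kwargs that botocore's `Config` accepts. A self-contained sketch of that conversion, duck-typing the `httpx.Timeout` interface (`.connect`/`.read` attributes) so it runs without either library installed; the 600-second default is an assumption for illustration:

```python
def to_boto3_timeout(timeout, default=600.0):
    """Map an httpx.Timeout-like object (or a bare number) to the float
    kwargs botocore.config.Config expects. Sketch only; attribute names
    follow httpx.Timeout (.connect, .read)."""
    if timeout is None:
        return {"connect_timeout": default, "read_timeout": default}
    if isinstance(timeout, (int, float)):
        return {"connect_timeout": float(timeout), "read_timeout": float(timeout)}
    return {
        "connect_timeout": float(timeout.connect or default),
        "read_timeout": float(timeout.read or default),
    }
```

The returned dict can be splatted into `botocore.config.Config(**to_boto3_timeout(t))`.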
mogith-pn | 723ef9963e | Clarifai - Added streaming and async completion support | 2024-05-03 14:03:38 +00:00
mikey | de7fe98556 | Merge branch 'BerriAI:main' into main | 2024-05-03 11:30:03 +08:00
ffreemt | a7ec1772b1 | Add litellm/tests/test_batch_completion_return_exceptions.py | 2024-05-03 11:28:38 +08:00
Krish Dholakia | 2200900ca2 | Merge pull request #3393 from Priva28/main: Add Llama3 tokenizer and allow custom tokenizers | 2024-05-02 16:32:41 -07:00
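Allowing custom tokenizers (PR #3393) boils down to making token counting parameterized over an encode function instead of hard-coding one tokenizer. A minimal sketch; the whitespace fallback is purely illustrative, not litellm's default tokenizer:

```python
def token_count(text, encode=None):
    """Count tokens with a pluggable encoder. The whitespace-splitting
    fallback is an illustrative stand-in for a real default tokenizer."""
    encode = encode or (lambda t: t.split())
    return len(encode(text))
```

A Llama3-specific or provider-specific tokenizer then plugs in as `encode=my_tokenizer.encode`.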
Lucca Zenóbio | 78303b79ee | Merge branch 'main' into main | 2024-05-02 09:46:34 -03:00
ffreemt | 64d229caaa | Add return_exceptions to litellm.batch_completion for optionally returning exceptions and partial results instead of throwing exceptions | 2024-05-02 19:30:01 +08:00
Christian Privitelli | 2d43153efa | include methods in init import, add test, fix encode/decode param ordering | 2024-05-02 15:49:22 +10:00
Krrish Dholakia | 0251543e7a | refactor(main.py): trigger new build | 2024-05-01 21:59:33 -07:00
Krrish Dholakia | 4761345311 | fix(main.py): fix mock completion response | 2024-04-30 19:30:18 -07:00
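A mock completion (4761345311 touches this path) returns an OpenAI-chat-shaped response without any network call, which is useful in tests and CI. A sketch of the idea; the exact field set here is illustrative, not litellm's response model:

```python
import time

def mock_completion(model, messages, mock_response="This is a mock response"):
    """Build an OpenAI-chat-shaped response dict locally, with no network
    call. Sketch of the mock-completion idea; fields are illustrative."""
    return {
        "object": "chat.completion",
        "model": model,
        "created": int(time.time()),
        "choices": [{
            "index": 0,
            "finish_reason": "stop",
            "message": {"role": "assistant", "content": mock_response},
        }],
    }
```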
Krrish Dholakia | 1baad80c7d | fix(router.py): cooldown deployments for 401 errors | 2024-04-30 17:54:00 -07:00
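Putting a deployment on cooldown after a 401 (1baad80c7d) means a deployment with bad credentials is temporarily removed from routing instead of being retried on every request. A hypothetical tracker illustrating the mechanism, not router.py's actual code:

```python
import time

class CooldownTracker:
    """Bench deployments after auth failures for a fixed window.
    Hypothetical helper sketching the cooldown idea."""
    def __init__(self, cooldown_seconds=60.0):
        self.cooldown_seconds = cooldown_seconds
        self._until = {}  # deployment id -> monotonic time when usable again

    def record_failure(self, deployment_id, status_code):
        if status_code == 401:  # bad credentials: bench this deployment
            self._until[deployment_id] = time.monotonic() + self.cooldown_seconds

    def available(self, deployment_ids):
        now = time.monotonic()
        return [d for d in deployment_ids if self._until.get(d, 0) <= now]
```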
mogith-pn | d770df2259 | Merge branch 'main' into main | 2024-04-30 22:48:52 +05:30
mogith-pn | 318b4813f2 | Clarifai-LiteLLM integration (#1): intg v1 clarifai-litellm; added more community models and test cases; updated Clarifai markdown docs | 2024-04-30 22:38:33 +05:30
Krrish Dholakia | c9d7437d16 | fix(watsonx.py): use common litellm params for api key, api base, etc. | 2024-04-27 10:15:27 -07:00
Krish Dholakia | 2d976cfabc | Merge pull request #3270 from simonsanvil/feature/watsonx-integration: (feat) add IBM watsonx.ai as an llm provider | 2024-04-27 05:48:34 -07:00
Krrish Dholakia | caaa05703d | refactor(main.py): trigger new build | 2024-04-25 19:51:38 -07:00
Lucca Zenobio | 6127d9f488 | merge | 2024-04-25 15:00:07 -03:00
Krrish Dholakia | 0a9cdf6f9b | refactor(main.py): trigger new build | 2024-04-24 22:04:24 -07:00
Krrish Dholakia | 48c2c3d78a | fix(utils.py): fix streaming to not return usage dict (Fixes https://github.com/BerriAI/litellm/issues/3237) | 2024-04-24 08:06:07 -07:00
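The streaming fix (48c2c3d78a) stops a top-level `usage` key from leaking into streamed chunks, keeping deltas consistent with the OpenAI streaming shape. A generator sketching that filter (illustrative, not utils.py's actual code):

```python
def strip_usage(chunks):
    """Yield streaming chunks without a top-level 'usage' key, so streamed
    deltas keep the OpenAI streaming shape. Illustrative sketch of the fix."""
    for chunk in chunks:
        yield {k: v for k, v in chunk.items() if k != "usage"}
```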
Simon S. Viloria | 2ef4fb2efa | Merge branch 'BerriAI:main' into feature/watsonx-integration | 2024-04-23 12:18:34 +02:00
Simon Sanchez Viloria | 74d2ba0a23 | feat: watsonx refactoring, removed dependency, and added support for embedding calls | 2024-04-23 12:01:13 +02:00
Krrish Dholakia | ec2c70e362 | fix(vertex_ai.py): fix streaming logic | 2024-04-22 19:15:20 -07:00
Krrish Dholakia | 1e9487f639 | refactor(main.py): trigger new build | 2024-04-22 10:54:35 -07:00
Simon S. Viloria | a77537ddd4 | Merge branch 'BerriAI:main' into feature/watsonx-integration | 2024-04-21 10:35:51 +02:00
Krrish Dholakia | 26579303e0 | fix(main.py): ignore max_parallel_requests as a litellm param | 2024-04-20 12:15:04 -07:00
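Ignoring `max_parallel_requests` as a litellm param (26579303e0) is an instance of a common pattern: client-internal kwargs must be split off before the remaining kwargs are forwarded to the provider, or the provider rejects them as unknown. A sketch; the set contents besides `max_parallel_requests` are assumptions for illustration:

```python
# Params consumed by the client itself and never forwarded to the provider.
# Per 26579303e0, max_parallel_requests belongs here; the other entry is
# an illustrative assumption.
INTERNAL_PARAMS = {"max_parallel_requests", "num_retries"}

def split_params(kwargs):
    """Split call kwargs into (client-internal, provider-bound) dicts."""
    internal = {k: v for k, v in kwargs.items() if k in INTERNAL_PARAMS}
    provider = {k: v for k, v in kwargs.items() if k not in INTERNAL_PARAMS}
    return internal, provider
```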
Simon Sanchez Viloria | 6edb133733 | Added support for IBM watsonx.ai models | 2024-04-20 20:06:46 +02:00
Krrish Dholakia | 3c6b6355c7 | fix(ollama_chat.py): accept api key as a param for ollama calls; allows user to call hosted ollama endpoint using bearer token for auth | 2024-04-19 13:02:13 -07:00
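Accepting an api key for Ollama calls (3c6b6355c7) amounts to attaching a bearer token to the request headers only when a key is supplied, so local unauthenticated Ollama keeps working. A sketch with a hypothetical helper name:

```python
def ollama_headers(api_key=None):
    """Headers for an Ollama endpoint; a bearer token is attached only when
    an api key is supplied, so local unauthenticated setups still work.
    Hypothetical helper sketching 3c6b6355c7's behavior."""
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers
```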
Krrish Dholakia | 9e91541b8a | refactor(main.py): trigger new build | 2024-04-18 22:17:19 -07:00
Ishaan Jaff | b66e4595e6 | fix: pass kwargs to exception_type | 2024-04-18 12:58:30 -07:00
Krrish Dholakia | 0e208b435f | refactor(main.py): trigger new build | 2024-04-18 07:34:09 -07:00
Krrish Dholakia | 060ac995d6 | fix(vertex_ai.py): accept credentials as a json string | 2024-04-16 17:34:25 -07:00
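Accepting Vertex AI credentials as a JSON string (060ac995d6, and the related 1ec7118e1f/50081479f9 commits below) lets callers pass the service-account blob directly, e.g. from an environment variable, instead of a file path. A hypothetical loader sketching how both forms can be accepted:

```python
import json
import os

def load_vertex_credentials(creds):
    """Accept either a path to a service-account JSON file or the raw JSON
    string itself. Hypothetical helper mirroring the commit's behavior,
    not vertex_ai.py's actual code."""
    if os.path.exists(creds):
        with open(creds) as f:
            return json.load(f)
    return json.loads(creds)
```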
Krrish Dholakia | c66b59a71e | refactor(main.py): trigger new build | 2024-04-15 18:36:51 -07:00
Krrish Dholakia | 1ec7118e1f | fix(vertex_ai_anthropic.py): set vertex_credentials for vertex ai anthropic calls; allows setting vertex credentials as a json string | 2024-04-15 14:16:28 -07:00
Krrish Dholakia | 50081479f9 | fix(main.py): accept vertex service account credentials as json string; allows us to dynamically set vertex ai credentials | 2024-04-15 13:28:59 -07:00
Krrish Dholakia | 26286a54b8 | fix(anthropic_text.py): add support for async text completion calls | 2024-04-15 08:15:00 -07:00
Krrish Dholakia | c08d6a961a | refactor(main.py): trigger new build | 2024-04-13 19:35:39 -07:00
Krrish Dholakia | 74aa230eac | fix(main.py): automatically infer mode for text completion models | 2024-04-12 14:16:21 -07:00
Krrish Dholakia | 623613203a | refactor(main.py): trigger new build; contains fixes for async batch get | 2024-04-10 21:45:06 -07:00
Ishaan Jaff | bc50b0a4a1 | Merge pull request #2923 from BerriAI/litellm_return_better_error_from_health: fix - return stack trace on failing /health checks - first 1000 chars | 2024-04-10 17:48:13 -07:00
Krrish Dholakia | a943f21f75 | refactor(main.py): trigger new build | 2024-04-09 21:15:33 -07:00
Krrish Dholakia | 855e7ed9d2 | fix(main.py): handle translating text completion openai to chat completion for async requests; also adds testing for this, to prevent future regressions | 2024-04-09 16:47:49 -07:00
Ishaan Jaff | c9108d43e0 | fix: return stack trace on failing /health checks | 2024-04-09 15:12:09 -07:00
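The /health improvement (c9108d43e0, merged via PR #2923) returns the failing check's stack trace truncated to the first 1000 characters, so the response stays bounded while still showing the root cause. A sketch of that formatting step (the helper name is hypothetical):

```python
import traceback

def health_error_detail(exc, limit=1000):
    """Format an exception's stack trace and keep only the first `limit`
    characters, as PR #2923 does for /health responses. Sketch only."""
    text = "".join(traceback.format_exception(type(exc), exc, exc.__traceback__))
    return text[:limit]
```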