Krrish Dholakia | cad049b6a8 | build(bump-helm-chart-app-version): bump helm chart app version to latest | 2024-05-06 10:26:01 -07:00
Krrish Dholakia | 7ebe00599b | build(host-helm-chart-on-root): allows helm repo add to work with litellm | 2024-05-06 10:11:56 -07:00
Ishaan Jaff | ec63a30095 | docs - deploy litellm on gcp cloud run | 2024-05-06 08:10:35 -07:00
Ishaan Jaff | e0001a9121 | docs - add using vertex embedding models | 2024-05-06 07:56:17 -07:00
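The vertex embedding docs commit points at litellm's embedding interface; a minimal sketch, assuming the `vertex_ai/` provider prefix and an illustrative model name (project id and location below are placeholders, not values from the commit):

```python
import litellm

# Illustrative Vertex AI settings; project/location can also come from
# the environment. The values here are placeholders.
litellm.vertex_project = "my-gcp-project"
litellm.vertex_location = "us-central1"

# Call an embedding model through the vertex_ai provider prefix.
response = litellm.embedding(
    model="vertex_ai/textembedding-gecko",  # example model name
    input=["good morning from litellm"],
)
print(response)
```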
Krish Dholakia | 9f58583888 | Merge pull request #3299 from themrzmaster/main: Allowing extra headers for bedrock | 2024-05-06 07:45:53 -07:00
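PR #3299 forwards extra headers to Bedrock; a hedged sketch of passing them through `litellm.completion`, with the model id and header name purely illustrative:

```python
import litellm

# Pass provider-specific headers through to Bedrock.
# The model id and header below are placeholders, not values from the PR.
response = litellm.completion(
    model="bedrock/anthropic.claude-v2",
    messages=[{"role": "user", "content": "Hello from litellm"}],
    extra_headers={"X-Custom-Header": "example-value"},
)
print(response.choices[0].message.content)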
Krrish Dholakia | b5f3f198f2 | fix(utils.py): anthropic error handling | 2024-05-06 07:25:12 -07:00
Krrish Dholakia | d83f0b02da | test: fix local tests | 2024-05-06 07:14:33 -07:00
Krish Dholakia | 5f119f2abb | Merge pull request #3469 from jackmpcollins/fix-ollama-streamed-tool-calls: Fix Ollama streamed tool calls. Set finish_reason to tool_calls for all tool_calls responses | 2024-05-06 07:13:37 -07:00
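PR #3469 (the Jack Collins commits further down the log) changes how streamed Ollama tool calls surface; a small sketch of the calling side, assuming a locally running Ollama model (model name and tool definition are illustrative):

```python
import litellm

tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",  # hypothetical tool for illustration
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string"}},
            "required": ["location"],
        },
    },
}]

# Streamed completion against a local Ollama model; after this fix the
# tool-call arguments arrive as a single delta and the final chunk's
# finish_reason is "tool_calls".
stream = litellm.completion(
    model="ollama/llama2",  # example model name
    messages=[{"role": "user", "content": "What is the weather in Boston?"}],
    tools=tools,
    stream=True,
)
for chunk in stream:
    choice = chunk.choices[0]
    if choice.delta.tool_calls:
        print(choice.delta.tool_calls)
    if choice.finish_reason:
        print("finish_reason:", choice.finish_reason)
```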
Ishaan Jaff | 817a77b23f | Merge pull request #3463 from RoniGurvichCycode/main: gunicorn version bump | 2024-05-06 07:11:01 -07:00
Lucca Zenóbio | b22517845e | Merge branch 'main' into main | 2024-05-06 09:40:23 -03:00
Jack Collins | 07b13ff7c5 | Remove unused ModelResponse import | 2024-05-06 00:16:58 -07:00
Jack Collins | 51c02fdadf | Add tests for ollama + ollama chat tool calls +/- stream | 2024-05-06 00:13:42 -07:00
Jack Collins | bb6132eee1 | Fix: get format from data not optional_params ollama non-stream completion | 2024-05-05 18:59:26 -07:00
Jack Collins | 81b1c46c6f | Add missing import itertools.chain | 2024-05-05 18:54:08 -07:00
Jack Collins | 03b82b78c1 | Fix: Set finish_reason to tool_calls for non-stream responses in ollama | 2024-05-05 18:52:31 -07:00
Jack Collins | 297543e3e5 | Parse streamed function calls as single delta in ollama | 2024-05-05 18:52:20 -07:00
Jack Collins | dffe616267 | Make newline same in async function | 2024-05-05 18:51:53 -07:00
Jack Collins | c217a07d5e | Fix: Set finish_reason to tool_calls for non-stream responses | 2024-05-05 18:47:58 -07:00
Jack Collins | 107a77368f | Parse streamed function calls as single delta | 2024-05-05 18:47:16 -07:00
Roni Gurvich | 32a085accf | gunicorn update | 2024-05-05 13:25:59 +03:00
Krrish Dholakia | 918367cc7b | test: skip hanging test | 2024-05-05 00:27:38 -07:00
Krrish Dholakia | e95be13f10 | fix(router.py): fix router retry policy logic | 2024-05-04 23:02:50 -07:00
Krrish Dholakia | 0529d7eaa3 | bump: version 1.35.38 → 1.36.0 | 2024-05-04 22:23:20 -07:00
Krish Dholakia | 6be20f5fc6 | Merge pull request #3455 from BerriAI/litellm_assistants_support: feat(openai.py): add support for openai assistants | 2024-05-04 22:21:44 -07:00
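PR #3455 and the feat(assistants/main.py) commits further down the log introduce an Assistants interface; a minimal sketch, assuming OpenAI as the backing provider and the call names taken from the commit messages (exact parameters may differ):

```python
import litellm

# List assistants configured on the provider account.
assistants = litellm.get_assistants(custom_llm_provider="openai")

# Create a thread, add a message to it, then read the messages back.
thread = litellm.create_thread(
    custom_llm_provider="openai",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
)
litellm.add_message(
    custom_llm_provider="openai",
    thread_id=thread.id,
    role="user",
    content="How does litellm work?",
)
messages = litellm.get_messages(custom_llm_provider="openai", thread_id=thread.id)
print(messages)
```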
Ishaan Jaff | d4c2eb1797 | fix post call rule test | 2024-05-04 22:04:53 -07:00
Krrish Dholakia | 06ae584473 | fix(types/openai.py): fix python3.8 typing issue | 2024-05-04 22:04:17 -07:00
Krrish Dholakia | 66129bc921 | fix(typing/openai.py): fix openai typing error (version-related) | 2024-05-04 22:02:43 -07:00
Krrish Dholakia | 2deac08ff1 | fix(types/openai.py): fix typing import | 2024-05-04 21:53:08 -07:00
Krrish Dholakia | 1195bf296b | fix(openai.py): fix typing import for python 3.8 | 2024-05-04 21:49:30 -07:00
Krrish Dholakia | f2bf6411d8 | fix(openai.py): fix linting error | 2024-05-04 21:48:42 -07:00
Krish Dholakia | 47078f4d84 | Merge pull request #3452 from tothandras/fix/openmeter: Fix OpenMeter sync logger | 2024-05-04 21:41:44 -07:00
Ishaan Jaff | 4d1806bc95 | fix - vertex ai exceptions | 2024-05-04 21:32:10 -07:00
Krrish Dholakia | 5406205e4b | test(test_assistants.py): cleanup tests | 2024-05-04 21:31:07 -07:00
Krrish Dholakia | 8fe6c9b401 | feat(assistants/main.py): support litellm.get_assistants() and litellm.get_messages() | 2024-05-04 21:30:28 -07:00
Krrish Dholakia | cad01fb586 | feat(assistants/main.py): support 'litellm.get_threads' | 2024-05-04 21:14:03 -07:00
Ishaan Jaff | 157b2b3a06 | fix python 3.8 install | 2024-05-04 21:00:39 -07:00
Ishaan Jaff | 4dcb4b81f1 | fix - python 3.8 error | 2024-05-04 20:44:40 -07:00
Ishaan Jaff | fc63c3f555 | bump: version 1.35.38 → 1.35.39 | 2024-05-04 20:43:05 -07:00
Ishaan Jaff | 713e04848d | Merge pull request #3460 from BerriAI/litellm_use_retry_policy_per_mg: [Feat] Set a Retry Policy per model group | 2024-05-04 20:42:40 -07:00
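PR #3460 lets each model group carry its own retry policy; a hedged sketch of wiring that into litellm.Router, where the `model_group_retry_policy` keyword and the `RetryPolicy` import path are assumptions drawn from the feature description, and the error-type field names come from PR #3456:

```python
from litellm import Router
from litellm.router import RetryPolicy  # import path assumed

router = Router(
    model_list=[
        {"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}},
        {"model_name": "claude-3", "litellm_params": {"model": "claude-3-haiku-20240307"}},
    ],
    # Assumed kwarg name: one RetryPolicy per model group.
    model_group_retry_policy={
        "gpt-3.5-turbo": RetryPolicy(RateLimitErrorRetries=3),
        "claude-3": RetryPolicy(BadRequestErrorRetries=0),
    },
)
```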
Ishaan Jaff | ba065653ca | Merge pull request #3457 from BerriAI/litellm_return_num_retries_exceptions: [Feat] return num_retries in litellm.Router exceptions | 2024-05-04 20:41:54 -07:00
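PR #3457 attaches the retry count to exceptions raised by the Router; a defensive sketch of reading it back, using getattr because the attribute name is inferred from the PR title:

```python
from litellm import Router

router = Router(
    model_list=[{"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}}],
    num_retries=2,
)

try:
    router.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "hi"}],
    )
except Exception as exc:
    # Attribute name assumed from the PR title.
    print("retries attempted:", getattr(exc, "num_retries", None))
```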
Ishaan Jaff | f09da3f14c | test - test setting retry policies per model groups | 2024-05-04 20:40:56 -07:00
Ishaan Jaff | 90ac1e3fd9 | feat - set retry policy per model group | 2024-05-04 20:39:51 -07:00
Ishaan Jaff | 95864a1d37 | fix router debug logs | 2024-05-04 20:24:15 -07:00
Krrish Dholakia | b7796c7487 | feat(assistants/main.py): add 'add_message' endpoint | 2024-05-04 19:56:11 -07:00
Krrish Dholakia | 681a95e37b | fix(assistants/main.py): support litellm.create_thread() call | 2024-05-04 19:35:37 -07:00
Ishaan Jaff | dfc22194b2 | fix - undo local dev changes | 2024-05-04 19:11:57 -07:00
Ishaan Jaff | 6b59aeb603 | fix return num retries in exceptions | 2024-05-04 19:09:34 -07:00
Ishaan Jaff | 5be8c95c6e | fix don't return num retries in utils.py | 2024-05-04 19:07:28 -07:00
Ishaan Jaff | 0f03e53348 | feat return num retries in exceptions | 2024-05-04 18:50:38 -07:00
Ishaan Jaff | 87e165e413 | Merge pull request #3456 from BerriAI/litellm_router_set_retry_policy_errors: [FEAT] router set custom num retries for ContentPolicyViolationErrorRetries, RateLimitErrorRetries, BadRequestErrorRetries etc | 2024-05-04 18:26:03 -07:00
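PR #3456 is the underlying per-error-type retry feature; a minimal sketch of a router-wide policy, using the error-type field names listed in the merge title (the `retry_policy` kwarg and import path are assumptions):

```python
from litellm import Router
from litellm.router import RetryPolicy  # import path assumed

# Different retry counts per exception type, as described in PR #3456.
retry_policy = RetryPolicy(
    ContentPolicyViolationErrorRetries=3,
    RateLimitErrorRetries=3,
    BadRequestErrorRetries=0,
)

router = Router(
    model_list=[{"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}}],
    retry_policy=retry_policy,  # assumed kwarg name
)
```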