Krrish Dholakia | c12af219af | fix: fix linting | 2024-05-10 20:23:33 -07:00
Krrish Dholakia | 94f3d361b0 | fix(vertex_ai.py): fix list tool call responses (Closes https://github.com/BerriAI/litellm/issues/3147) | 2024-05-10 20:05:58 -07:00
Ishaan Jaff | b09075da53 | Merge pull request #3577 from BerriAI/litellm_add_triton_server ([Feat] Add Triton Embeddings to LiteLLM) | 2024-05-10 19:20:23 -07:00
Ishaan Jaff | d3550379b0 | feat - triton embeddings | 2024-05-10 18:57:06 -07:00
Krrish Dholakia | d4d175030f | docs(huggingface.md): add text-classification to huggingface docs | 2024-05-10 14:39:14 -07:00
Krrish Dholakia | c17f221b89 | test(test_completion.py): reintegrate testing for huggingface tgi + non-tgi | 2024-05-10 14:07:01 -07:00
Krish Dholakia | a671046b45 | Merge pull request #3552 from BerriAI/litellm_predibase_support (feat(predibase.py): add support for predibase provider) | 2024-05-09 22:21:16 -07:00
Krrish Dholakia | 714370956f | fix(predibase.py): fix async streaming | 2024-05-09 22:18:16 -07:00
Krrish Dholakia | 76d4290591 | fix(predibase.py): fix event loop closed error | 2024-05-09 19:07:19 -07:00
Krrish Dholakia | 491e177348 | fix(predibase.py): fix async completion call | 2024-05-09 18:44:19 -07:00
Ishaan Jaff | 5eb12e30cc | Merge pull request #3547 from BerriAI/litellm_support_stream_options_text_completion ([Feat] support `stream_options` on `litellm.text_completion`) | 2024-05-09 18:05:58 -07:00
Krrish Dholakia | 9083d8e490 | fix: fix linting errors | 2024-05-09 17:55:27 -07:00
Krrish Dholakia | d7189c21fd | feat(predibase.py): support async_completion + streaming (sync + async); finishes up PR | 2024-05-09 17:41:27 -07:00
Krrish Dholakia | 186c0ec77b | feat(predibase.py): add support for predibase provider (Closes https://github.com/BerriAI/litellm/issues/1253) | 2024-05-09 16:39:43 -07:00
Ishaan Jaff | 66053f14ae | stream_options for text-completion openai | 2024-05-09 08:37:40 -07:00
Ishaan Jaff | 0b1885ca99 | Merge pull request #3537 from BerriAI/litellm_support_stream_options_param ([Feat] support `stream_options` param for OpenAI) | 2024-05-09 08:34:08 -07:00
Krish Dholakia | 8015bc1c47 | Revert "Add support for async streaming to watsonx provider" | 2024-05-09 07:44:15 -07:00
Krish Dholakia | 8ad979cdfe | Merge branch 'main' into litellm_region_based_routing | 2024-05-08 22:19:51 -07:00
Krish Dholakia | 3f13251241 | Merge pull request #3479 from simonsanvil/feature/watsonx-integration (Add support for async streaming to watsonx provider) | 2024-05-08 22:19:05 -07:00
Krrish Dholakia | 3d18897d69 | feat(router.py): enable filtering model group by 'allowed_model_region' | 2024-05-08 22:10:17 -07:00
Ishaan Jaff | 1042051602 | support stream_options for chat completion models | 2024-05-08 21:52:25 -07:00
Ishaan Jaff | 6d955ef457 | Merge branch 'main' into ui_show_spend_end_user | 2024-05-08 18:29:25 -07:00
Ishaan Jaff | faab704d28 | update global/spend/end_users | 2024-05-08 17:03:09 -07:00
Krrish Dholakia | 6575143460 | feat(proxy_server.py): return litellm version in response headers | 2024-05-08 16:00:08 -07:00
Krish Dholakia | 303e0c6226 | Revert "* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role" | 2024-05-07 21:42:18 -07:00
Krish Dholakia | a325bf2fb8 | Merge pull request #3478 from nkvch/Issue-#3474-anthropic-roles-alternation-issue (* feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role) | 2024-05-07 21:24:47 -07:00
Ishaan Jaff | 2725a55e7a | Merge pull request #3470 from mbektas/fix-ollama-embeddings (support sync ollama embeddings) | 2024-05-07 19:21:37 -07:00
nkvch | 06d1aec49e | * fix(factory.py): fix conditional statement for merging messages with previous ones | 2024-05-07 13:42:42 +02:00
nkvch | 7d7b59ff78 | * feat(factory.py): add support for merging consecutive messages of one role when separated with empty message of another role | 2024-05-07 12:51:30 +02:00
Krrish Dholakia | c6d830d377 | fix(anthropic.py): remove raise error on 'empty content' (Fixes https://github.com/BerriAI/litellm/issues/3453) | 2024-05-06 11:42:09 -07:00
Simon Sanchez Viloria | 6181d1eaad | Merge branch 'main' into feature/watsonx-integration | 2024-05-06 17:27:14 +02:00
Simon Sanchez Viloria | 83a274b54b | (feat) support for async stream to watsonx provider | 2024-05-06 17:08:40 +02:00
Simon Sanchez Viloria | 62b3f25398 | (fix) fix ibm_granite_pt and use hf_chat_template for llama-3 instruct prompts | 2024-05-06 17:08:40 +02:00
Krish Dholakia | 9f58583888 | Merge pull request #3299 from themrzmaster/main (Allowing extra headers for bedrock) | 2024-05-06 07:45:53 -07:00
Krrish Dholakia | b5f3f198f2 | fix(utils.py): anthropic error handling | 2024-05-06 07:25:12 -07:00
Lucca Zenóbio | b22517845e | Merge branch 'main' into main | 2024-05-06 09:40:23 -03:00
Mehmet Bektas | 3acad270e5 | support sync ollama embeddings | 2024-05-05 19:44:25 -07:00
Jack Collins | bb6132eee1 | Fix: get format from data not optional_params ollama non-stream completion | 2024-05-05 18:59:26 -07:00
Jack Collins | 81b1c46c6f | Add missing import itertools.chain | 2024-05-05 18:54:08 -07:00
Jack Collins | 03b82b78c1 | Fix: Set finish_reason to tool_calls for non-stream responses in ollama | 2024-05-05 18:52:31 -07:00
Jack Collins | 297543e3e5 | Parse streamed function calls as single delta in ollama | 2024-05-05 18:52:20 -07:00
Jack Collins | dffe616267 | Make newline same in async function | 2024-05-05 18:51:53 -07:00
Jack Collins | c217a07d5e | Fix: Set finish_reason to tool_calls for non-stream responses | 2024-05-05 18:47:58 -07:00
Jack Collins | 107a77368f | Parse streamed function calls as single delta | 2024-05-05 18:47:16 -07:00
Krish Dholakia | 6be20f5fc6 | Merge pull request #3455 from BerriAI/litellm_assistants_support (feat(openai.py): add support for openai assistants) | 2024-05-04 22:21:44 -07:00
Krrish Dholakia | 1195bf296b | fix(openai.py): fix typing import for python 3.8 | 2024-05-04 21:49:30 -07:00
Krrish Dholakia | f2bf6411d8 | fix(openai.py): fix linting error | 2024-05-04 21:48:42 -07:00
Krrish Dholakia | 8fe6c9b401 | feat(assistants/main.py): support litellm.get_assistants() and litellm.get_messages() | 2024-05-04 21:30:28 -07:00
Krrish Dholakia | cad01fb586 | feat(assistants/main.py): support 'litellm.get_threads' | 2024-05-04 21:14:03 -07:00
Krrish Dholakia | b7796c7487 | feat(assistants/main.py): add 'add_message' endpoint | 2024-05-04 19:56:11 -07:00