Krish Dholakia | ad81e19282 | Merge pull request #2013 from afbarbaro/litellm_gemini_safety_settings: Add safety_settings parameter to gemini generate_content calls | 2024-02-17 16:47:29 -08:00
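PR #2013 forwards a `safety_settings` list through to Gemini's generate_content call. A minimal sketch of the payload shape (category and threshold names follow Google's Generative AI API; the commented litellm call and model name are illustrative, not taken from this log):

```python
# Sketch: the safety_settings list forwarded to Gemini's generate_content
# call. Each entry pairs one harm category with one blocking threshold.
safety_settings = [
    {"category": "HARM_CATEGORY_HARASSMENT", "threshold": "BLOCK_NONE"},
    {"category": "HARM_CATEGORY_HATE_SPEECH", "threshold": "BLOCK_ONLY_HIGH"},
]

# With litellm this would presumably be passed straight through, e.g.:
# litellm.completion(model="gemini/gemini-pro", messages=msgs,
#                    safety_settings=safety_settings)

categories = [s["category"] for s in safety_settings]
```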
Krrish Dholakia | 2a4a6995ac | feat(llama_guard.py): add llama guard support for content moderation + new async_moderation_hook endpoint | 2024-02-16 18:45:25 -08:00
Krrish Dholakia | f57483ea70 | fix(utils.py): support image gen logging to langfuse | 2024-02-16 16:12:52 -08:00
Krrish Dholakia | 5f9e141d1e | fix(huggingface_restapi.py): return streamed response correctly | 2024-02-16 13:25:13 -08:00
Andres Barbaro | 1f054203bf | Add safety_settings parameter to gemini generate_content calls | 2024-02-16 12:22:18 -06:00
Krish Dholakia | 999fab82f7 | Merge branch 'main' into litellm_moderations_improvements | 2024-02-15 23:08:25 -08:00
Krish Dholakia | 9b60ef9a3c | Merge pull request #1916 from RenaLu/main: Add support for Vertex AI custom models deployed on private endpoint | 2024-02-15 22:47:36 -08:00
Krrish Dholakia | 1b844aafdc | fix(huggingface_restapi.py): fix hf streaming to raise exceptions | 2024-02-15 21:25:12 -08:00
Krrish Dholakia | 0bf35b4a91 | fix(huggingface_restapi.py): catch streaming errors | 2024-02-15 20:55:21 -08:00
Krrish Dholakia | eb45df16f1 | fix(test_streaming.py): handle hf tgi zephyr not loading for streaming issue | 2024-02-15 19:24:02 -08:00
Krish Dholakia | 57654f4533 | Merge branch 'main' into litellm_aioboto3_sagemaker | 2024-02-14 21:46:58 -08:00
Rena Lu | 02c58a9760 | update request strings | 2024-02-14 17:15:16 -05:00
Rena Lu | 8c39a631d3 | update request string | 2024-02-14 22:11:55 +00:00
Krrish Dholakia | fe1fe70c64 | fix(vertex_ai.py): map finish reason | 2024-02-14 11:42:13 -08:00
Krrish Dholakia | cb5a13ed49 | fix(bedrock.py): fix amazon titan prompt formatting | 2024-02-13 22:02:25 -08:00
Krrish Dholakia | 3ef391800a | fix(sagemaker.py): fix token iterator default flag | 2024-02-13 21:41:09 -08:00
Rena Lu | 22cca2c106 | rm ipynb checkpoints | 2024-02-13 20:57:07 +00:00
Rena Lu | ad366438c6 | Merge branch 'BerriAI:main' into main | 2024-02-13 15:55:48 -05:00
Rena Lu | ce7ce3b719 | fix optional params | 2024-02-13 20:50:26 +00:00
Rena Lu | 3fc0d8fda9 | delete print | 2024-02-13 18:49:56 +00:00
Rena Lu | e011f8022a | update private async | 2024-02-13 18:33:52 +00:00
Krrish Dholakia | f09c09ace4 | docs(pii_masking.md): fix presidio tutorial | 2024-02-13 07:42:27 -08:00
Krrish Dholakia | 2f815705ca | fix(sagemaker.py): use __anext__ | 2024-02-12 22:13:35 -08:00
Krrish Dholakia | b1bc30ee16 | feat(sagemaker.py): aioboto3 streaming support | 2024-02-12 21:18:34 -08:00
Krrish Dholakia | 460b48914e | feat(sagemaker.py): initial commit of working sagemaker with aioboto3 | 2024-02-12 17:25:57 -08:00
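The sagemaker commits above (aioboto3 streaming, the `__anext__` fix) revolve around consuming an async token iterator. A minimal sketch of that pattern, with a stand-in generator rather than real SageMaker payloads:

```python
# Sketch: consuming an async token stream with explicit __anext__ calls,
# as in "fix(sagemaker.py): use __anext__". The token values are
# illustrative, not real SageMaker response chunks.
import asyncio

async def fake_stream():
    for token in ["Hello", ", ", "world"]:
        yield token

async def collect():
    stream = fake_stream()
    out = []
    while True:
        try:
            # explicit __anext__ instead of `async for`
            out.append(await stream.__anext__())
        except StopAsyncIteration:
            # the iterator signals exhaustion with StopAsyncIteration
            break
    return "".join(out)

result = asyncio.run(collect())
```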
Rena Lu | 60c0bec7b3 | refactor to separate private mode with custom mode | 2024-02-12 18:20:48 -05:00
ishaan-jaff | 896fd393db | (feat) support bedrock timeout | 2024-02-09 14:36:43 -08:00
Rena Lu | 6833f37986 | remove prints | 2024-02-09 16:25:29 -05:00
Rena Lu | ae0ede4190 | Merge branch 'BerriAI:main' into main | 2024-02-09 16:20:14 -05:00
Rena Lu | 0e8a0aefd5 | add vertex ai private endpoint support | 2024-02-09 16:19:26 -05:00
Krish Dholakia | 51c07e294a | Merge pull request #1902 from BerriAI/litellm_mistral_message_list_fix: fix(factory.py): mistral message input fix | 2024-02-08 23:01:39 -08:00
Krrish Dholakia | 841639333b | fix(bedrock.py): raise exception for amazon titan null response | 2024-02-08 21:12:25 -08:00
Krrish Dholakia | c9e5c796ad | fix(factory.py): mistral message input fix | 2024-02-08 20:54:26 -08:00
David Leen | 140d915adf | Add support for AWS credentials from profile file (https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#aws-config-file) | 2024-02-08 15:10:50 -08:00
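"AWS credentials from profile file" refers to the shared credentials file (`~/.aws/credentials`) that boto3 reads, an INI file of named profiles. A sketch of that file's shape, parsed with stdlib configparser; the profile name and key values are illustrative:

```python
# Sketch: the INI shape of the AWS shared credentials file that boto3
# resolves named profiles from. Profile name and values are made up.
import configparser

sample = """
[bedrock-dev]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = example-secret
"""

config = configparser.ConfigParser()
config.read_string(sample)
# boto3.Session(profile_name="bedrock-dev") would resolve this section
key_id = config["bedrock-dev"]["aws_access_key_id"]
```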
Krrish Dholakia | 73d8e3e640 | fix(ollama_chat.py): fix token counting | 2024-02-06 22:18:46 -08:00
Krrish Dholakia | d1db67890c | fix(ollama.py): support format for ollama | 2024-02-06 10:11:52 -08:00
Krrish Dholakia | 9e091a0624 | fix(ollama_chat.py): explicitly state if ollama call is streaming or not | 2024-02-06 07:43:47 -08:00
Krrish Dholakia | 2e3748e6eb | fix(ollama_chat.py): fix ollama chat completion token counting | 2024-02-06 07:30:26 -08:00
Ishaan Jaff | 14c9e239a1 | Merge pull request #1750 from vanpelt/patch-2: Re-raise exception in async ollama streaming | 2024-02-05 08:12:17 -08:00
Krish Dholakia | 28df60b609 | Merge pull request #1809 from BerriAI/litellm_embedding_caching_updates: Support caching individual items in embedding list (Async embedding only) | 2024-02-03 21:04:23 -08:00
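PR #1809's "caching individual items in embedding list" means: for a batch of embedding inputs, only cache-miss items are sent to the provider, and results are merged back in input order. A sketch of that logic; the cache dict and the stand-in embed function are illustrative, not litellm's actual implementation:

```python
# Sketch: per-item embedding cache. Only cache misses hit the (fake)
# embedding call; results are returned in the original input order.
cache = {"hello": [0.1, 0.2]}  # pre-populated cached vector

def fake_embed(texts):
    # stand-in for the real provider call; one vector per input text
    return [[float(len(t)), 0.0] for t in texts]

def embed_with_cache(inputs):
    misses = [t for t in inputs if t not in cache]
    if misses:
        for text, vec in zip(misses, fake_embed(misses)):
            cache[text] = vec
    # merge: every input is now in the cache
    return [cache[t] for t in inputs]

vectors = embed_with_cache(["hello", "world"])
```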
Krrish Dholakia | 312c7462c8 | refactor(ollama.py): trigger rebuild | 2024-02-03 20:23:43 -08:00
Krrish Dholakia | 01cef1fe9e | fix(ollama.py): fix api connection error (https://github.com/BerriAI/litellm/issues/1735) | 2024-02-03 20:22:33 -08:00
Krrish Dholakia | c49c88c8e5 | fix(utils.py): route together ai calls to openai client (together ai is now openai-compatible) | 2024-02-03 19:22:48 -08:00
Krish Dholakia | 6408af11b6 | Merge pull request #1799 from BerriAI/litellm_bedrock_stable_diffusion_support: feat(bedrock.py): add stable diffusion image generation support | 2024-02-03 12:59:00 -08:00
Krrish Dholakia | 36416360c4 | feat(bedrock.py): add stable diffusion image generation support | 2024-02-03 12:08:38 -08:00
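The Bedrock stable diffusion work implies serializing a Stability-style request body for Bedrock's InvokeModel API. A sketch of that body's shape (field names follow the Stability AI schema on Bedrock; the prompt and parameter values are illustrative):

```python
# Sketch: JSON body for a Stability stable-diffusion model on Bedrock.
# text_prompts / cfg_scale / steps follow the Stability request schema;
# values here are made up for illustration.
import json

body = json.dumps({
    "text_prompts": [{"text": "a watercolor fox", "weight": 1.0}],
    "cfg_scale": 7,
    "steps": 30,
})

# round-trip to show the serialized shape
payload = json.loads(body)
prompt = payload["text_prompts"][0]["text"]
```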
Krrish Dholakia | 0ffdf57dec | fix(vertex_ai.py): add async embedding support for vertex ai | 2024-02-03 10:35:17 -08:00
Krrish Dholakia | d9ba8668f4 | feat(vertex_ai.py): vertex ai gecko text embedding support | 2024-02-03 09:48:29 -08:00
Krrish Dholakia | 0072d796f6 | fix(vertex_ai.py): fix params | 2024-02-01 18:09:49 -08:00
Krrish Dholakia | 0f9e793daf | feat(vertex_ai.py): add support for custom models via vertex ai model garden | 2024-02-01 17:47:34 -08:00
Chris Van Pelt | 1568b162f5 | Re-raise exception in async ollama streaming | 2024-02-01 16:14:07 -08:00
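"Re-raise exception in async ollama streaming" (PR #1750, merged above) describes surfacing errors from inside an async generator to the consumer instead of swallowing them. A sketch of the pattern; the error type and message are illustrative:

```python
# Sketch: an exception raised mid-stream inside an async generator
# propagates out of `async for`, so the caller sees it instead of a
# silently truncated stream. Error type/message are made up.
import asyncio

async def stream_with_error():
    yield "ok"
    raise RuntimeError("connection dropped")

async def consume():
    chunks, caught = [], None
    try:
        async for chunk in stream_with_error():
            chunks.append(chunk)
    except RuntimeError as e:
        # the re-raised error reaches the consumer
        caught = str(e)
    return chunks, caught

chunks, caught = asyncio.run(consume())
```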