Author | Commit | Message | Date
Ishaan Jaff | 6feb83eb51 | Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro | 2025-03-20 21:54:43 -07:00
    [Feat] OpenAI o1-pro Responses API streaming support
Krish Dholakia | bc03378fef | Merge pull request #9260 from Grizzly-jobs/fix/voyage-ai-token-usage-tracking | 2025-03-20 14:00:51 -07:00
    fix: VoyageAI `prompt_token` always empty
Ishaan Jaff | 3088204ac2 | fix code quality checks | 2025-03-20 13:57:35 -07:00
Krish Dholakia | f5f92bf6ae | Merge pull request #9366 from JamesGuthrie/jg/vertex-output-dimensionality | 2025-03-20 13:55:33 -07:00
    fix: VertexAI outputDimensionality configuration
Ishaan Jaff | 6d4cf6581d | MockResponsesAPIStreamingIterator | 2025-03-20 12:30:09 -07:00
Ishaan Jaff | 435a89dd79 | transform_responses_api_request | 2025-03-20 12:28:55 -07:00
Ishaan Jaff | 6608770e64 | add fake_stream to llm http handler | 2025-03-20 09:55:59 -07:00
Ishaan Jaff | 1567e52185 | add should_fake_stream | 2025-03-20 09:54:26 -07:00
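The fake_stream, should_fake_stream, and MockResponsesAPIStreamingIterator commits above, together with PR #9419 ([Feat] OpenAI o1-pro Responses API streaming support), describe simulated streaming: when a model's Responses API cannot stream natively, the complete response is fetched once and replayed through a mock streaming iterator. A minimal sketch of that pattern follows; the class name, helper names, and model list are illustrative, not litellm's actual implementation.

```python
import asyncio
from typing import Any, Dict

# Hypothetical set of models with no native streaming endpoint.
NON_STREAMING_MODELS = {"o1-pro"}


def should_fake_stream(model: str) -> bool:
    """Decide whether streaming has to be simulated for this model."""
    return model in NON_STREAMING_MODELS


class MockResponsesStreamingIterator:
    """Replays an already-complete response as an async stream of chunks."""

    def __init__(self, complete_response: Dict[str, Any]) -> None:
        # A real implementation might split the response into deltas;
        # here the whole payload is yielded as a single chunk.
        self._chunks = [complete_response]
        self._index = 0

    def __aiter__(self) -> "MockResponsesStreamingIterator":
        return self

    async def __anext__(self) -> Dict[str, Any]:
        if self._index >= len(self._chunks):
            raise StopAsyncIteration
        chunk = self._chunks[self._index]
        self._index += 1
        return chunk


async def call_responses_api(model: str, request_body: Dict[str, Any], stream: bool):
    # Stand-in for the real non-streaming HTTP call.
    full_response = {"model": model, "input": request_body, "output_text": "hello"}
    if stream and should_fake_stream(model):
        return MockResponsesStreamingIterator(full_response)
    return full_response


async def main() -> None:
    result = await call_responses_api("o1-pro", {"input": "hi"}, stream=True)
    async for chunk in result:
        print(chunk)


if __name__ == "__main__":
    asyncio.run(main())
```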
Krrish Dholakia | b228456b67 | feat(azure/gpt_transformation.py): add azure audio model support | 2025-03-19 22:57:49 -07:00
    Closes https://github.com/BerriAI/litellm/issues/6305
Ishaan Jaff | 08cb68c8fb | fix import hashlib | 2025-03-19 21:08:19 -07:00
Ishaan Jaff | c15e38a148 | Merge branch 'main' into litellm_fix_ssl_verify | 2025-03-19 21:03:06 -07:00
James Guthrie | 4044bca614 | fix: VertexAI outputDimensionality configuration | 2025-03-19 11:07:36 +01:00
    VertexAI's API documentation [1] is an absolute mess. In it, they
    describe the parameter to configure output dimensionality as
    `output_dimensionality`. In the API example, they switch to camel case
    `outputDimensionality`, which is the correct variant.
    [1]: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/text-embeddings-api#generative-ai-get-text-embedding-drest
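For reference, a minimal sketch of the request this fix is about, assuming the Vertex AI text-embeddings REST endpoint documented at the link above; the project, region, and model name are placeholders, and the camel-case `outputDimensionality` key is the point of the fix.

```python
import json

# Placeholders; a real call needs a GCP project, region, and an OAuth token.
PROJECT_ID = "my-project"
REGION = "us-central1"
MODEL = "text-embedding-005"

# Standard Vertex AI predict endpoint for publisher models.
endpoint = (
    f"https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}"
    f"/locations/{REGION}/publishers/google/models/{MODEL}:predict"
)

# The camel-case `outputDimensionality` key is the variant the API accepts.
payload = {
    "instances": [{"content": "example text to embed"}],
    "parameters": {"outputDimensionality": 256},
}

print(endpoint)
print(json.dumps(payload, indent=2))
```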
Krish Dholakia | 887648a364 | Merge pull request #9363 from BerriAI/litellm_dev_03_18_2025_p3 | 2025-03-18 23:36:12 -07:00
    fix(common_utils.py): handle cris only model
Krrish Dholakia | db3a65d52a | fix(common_utils.py): handle cris only model | 2025-03-18 23:35:43 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/9161#issuecomment-2734905153
Ishaan Jaff | 55e669d7d8 | get_openai_client_cache_key | 2025-03-18 18:35:50 -07:00
Ishaan Jaff | 4bac8f53a5 | fix common utils | 2025-03-18 17:59:46 -07:00
Ishaan Jaff | 9f31177a20 | use common caching logic for openai/azure clients | 2025-03-18 17:57:03 -07:00
Ishaan Jaff | ef91a0c72b | use common logic for re-using openai clients | 2025-03-18 17:56:32 -07:00
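The get_openai_client_cache_key and client re-use commits above describe caching initialized SDK clients under a key derived from their connection settings, so repeated requests with the same settings reuse one client instead of constructing a new one. A rough sketch of that idea using the `openai` package; the key fields, the Azure/OpenAI switch, and the cache shape are illustrative rather than litellm's exact logic.

```python
import hashlib
from typing import Dict, Optional

from openai import AzureOpenAI, OpenAI

# In-memory cache of initialized clients, keyed by a hash of their settings.
_CLIENT_CACHE: Dict[str, object] = {}


def get_openai_client_cache_key(
    api_key: str,
    api_base: Optional[str] = None,
    api_version: Optional[str] = None,
    timeout: Optional[float] = None,
) -> str:
    """Hash the settings that would otherwise force a new client."""
    raw = f"{api_key}|{api_base}|{api_version}|{timeout}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()


def get_cached_openai_client(
    api_key: str,
    api_base: Optional[str] = None,
    api_version: Optional[str] = None,
    timeout: Optional[float] = None,
):
    """Return a cached OpenAI/Azure client, creating it only on a cache miss."""
    key = get_openai_client_cache_key(api_key, api_base, api_version, timeout)
    if key in _CLIENT_CACHE:
        return _CLIENT_CACHE[key]
    if api_version is not None:
        # Presence of an api_version is used here as a stand-in for "Azure".
        client = AzureOpenAI(
            api_key=api_key,
            azure_endpoint=api_base or "https://example.openai.azure.com",
            api_version=api_version,
            timeout=timeout,
        )
    else:
        client = OpenAI(api_key=api_key, base_url=api_base, timeout=timeout)
    _CLIENT_CACHE[key] = client
    return client
```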
Ishaan Jaff | caca5a1b58 | Union[TranscriptionResponse, Coroutine[Any, Any, TranscriptionResponse]]: | 2025-03-18 14:23:14 -07:00
Ishaan Jaff | 7e2d383885 | fix code quality | 2025-03-18 12:58:59 -07:00
Ishaan Jaff | 0173ca3ad3 | test_azure_instruct | 2025-03-18 12:56:11 -07:00
Ishaan Jaff | 94ac105c23 | fix test_ensure_initialize_azure_sdk_client_always_used | 2025-03-18 12:46:55 -07:00
Ishaan Jaff | 70e80d2149 | fix azure chat logic | 2025-03-18 12:42:24 -07:00
Ishaan Jaff | 069d14ad91 | test_azure_embedding_max_retries_0 | 2025-03-18 12:35:34 -07:00
Ishaan Jaff | b3bc8a3231 | :test_completion_azure_ad_toke | 2025-03-18 12:25:32 -07:00
Ishaan Jaff | 57bb599b47 | fix azure embedding test | 2025-03-18 12:19:12 -07:00
Ishaan Jaff | 9d23214e87 | fix embedding issue on ssl azure | 2025-03-18 11:42:11 -07:00
Ishaan Jaff | 40736c3263 | fix linting error | 2025-03-18 11:38:31 -07:00
Ishaan Jaff | e3752cd32a | fix common utils | 2025-03-18 11:04:02 -07:00
Ishaan Jaff | 6bd6a2065a | fix using azure openai clients | 2025-03-18 10:47:29 -07:00
Ishaan Jaff | 77d14fa038 | use get_azure_openai_client | 2025-03-18 10:28:39 -07:00
Ishaan Jaff | 1ede6080ef | fix logic for initializing openai clients | 2025-03-18 10:23:30 -07:00
Ishaan Jaff | 7bb90a15be | use ssl on initialize_azure_sdk_client | 2025-03-18 10:14:51 -07:00
Ishaan Jaff | cb55069a70 | _init_azure_client_for_cloudflare_ai_gateway | 2025-03-18 10:11:54 -07:00
Ishaan Jaff | 860d96a01e | fix re-using azure openai client | 2025-03-18 10:06:56 -07:00
Ishaan Jaff | 8b54873e9f | fix - correctly re-use azure openai client | 2025-03-18 09:51:28 -07:00
Ishaan Jaff | d991b3c398 | _get_azure_openai_client | 2025-03-18 09:38:27 -07:00
Ishaan Jaff | 9e00fa8221 | rename to _get_azure_openai_client | 2025-03-18 09:25:26 -07:00
Ishaan Jaff | 3aa1e78ec3 | handle _get_async_http_client for OpenAI | 2025-03-18 08:56:08 -07:00
Krish Dholakia | 58d8d4ca3b | Merge pull request #9326 from andjsmi/main | 2025-03-17 22:16:43 -07:00
    Modify completion handler for SageMaker to use payload from `prepared_request`
Krish Dholakia | 8ee4eead7f | Merge pull request #9333 from BerriAI/litellm_dev_03_17_2025_p2 | 2025-03-17 21:48:30 -07:00
    fix(ollama/completions/transformation.py): pass prompt, untemplated o…
Krrish Dholakia | 4d56992407 | fix(ollama/completions/transformation.py): pass prompt, untemplated on /completions request | 2025-03-17 18:35:44 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/6900
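The ollama transformation fix above concerns the text-completion route: a prompt sent to /completions should be forwarded untemplated, rather than first being wrapped in a chat template the way chat messages are. A toy sketch of that distinction; the function names and payload shape are illustrative only.

```python
from typing import Any, Dict, List


def apply_chat_template(messages: List[Dict[str, str]]) -> str:
    """Toy chat template; real providers use model-specific templates."""
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages) + "\nassistant:"


def transform_chat_request(model: str, messages: List[Dict[str, str]]) -> Dict[str, Any]:
    # Chat route: messages are templated into a single prompt string.
    return {"model": model, "prompt": apply_chat_template(messages)}


def transform_text_completion_request(model: str, prompt: str) -> Dict[str, Any]:
    # /completions route: the prompt is passed through untemplated.
    return {"model": model, "prompt": prompt}


if __name__ == "__main__":
    print(transform_chat_request("llama3", [{"role": "user", "content": "hi"}]))
    print(transform_text_completion_request("llama3", "Once upon a time"))
```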
Andrew Smith | a96c31c924 | Update handler.py to use prepared_request.body for input | 2025-03-18 11:07:38 +11:00
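PR #9326 and the two handler.py commits in this range have the SageMaker completion handler send `prepared_request.body`, i.e. the exact bytes that were SigV4-signed, instead of re-serializing the original payload. A minimal sketch of that pattern with botocore signing; the endpoint construction, region, and credential handling are placeholders, not litellm's handler.

```python
import json

import boto3
import httpx
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest


def invoke_sagemaker_endpoint(endpoint_name: str, payload: dict, region: str = "us-east-1") -> bytes:
    url = (
        f"https://runtime.sagemaker.{region}.amazonaws.com"
        f"/endpoints/{endpoint_name}/invocations"
    )
    # Build and sign the request; the signature covers the serialized body.
    request = AWSRequest(
        method="POST",
        url=url,
        data=json.dumps(payload),
        headers={"Content-Type": "application/json"},
    )
    credentials = boto3.Session().get_credentials()
    SigV4Auth(credentials, "sagemaker", region).add_auth(request)
    prepared_request = request.prepare()

    # Send prepared_request.body (the exact signed bytes), not a re-serialized
    # copy of `payload`, so the signature stays valid.
    response = httpx.post(
        prepared_request.url,
        headers=dict(prepared_request.headers),
        content=prepared_request.body,
    )
    response.raise_for_status()
    return response.content
```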
Krrish Dholakia | 911b053095 | fix(http_handler.py): fix typing error | 2025-03-17 16:42:32 -07:00
Andrew Smith | ddfbb6b7b8 | Update handler.py to use prepared_request.body | 2025-03-18 10:23:32 +11:00
Krrish Dholakia | 425585f25c | fix(http_handler.py): support reading ssl security level from env var | 2025-03-17 15:48:31 -07:00
    Allows user to specify lower security settings
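The ssl security level change above lets users relax OpenSSL's security level through an environment variable, for example to reach endpoints that only offer older ciphers or key sizes. A minimal sketch of the mechanism, assuming a hypothetical SSL_SECURITY_LEVEL variable; the variable name and client wiring are illustrative, not litellm's exact configuration.

```python
import os
import ssl

import httpx


def build_ssl_context() -> ssl.SSLContext:
    """Create a default TLS context, optionally relaxing OpenSSL's security level."""
    ctx = ssl.create_default_context()
    # Hypothetical env var; e.g. SSL_SECURITY_LEVEL=1 permits weaker ciphers
    # and key sizes than OpenSSL's default level of 2.
    security_level = os.getenv("SSL_SECURITY_LEVEL")
    if security_level and security_level.isdigit():
        ctx.set_ciphers(f"DEFAULT@SECLEVEL={security_level}")
    return ctx


# The context is handed to the HTTP client used for outbound provider calls.
client = httpx.Client(verify=build_ssl_context())
```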
Krish Dholakia | 85a2d1e920 | Merge branch 'main' into litellm_dev_03_16_2025_p1 | 2025-03-17 10:02:53 -07:00
Krrish Dholakia | aafc224802 | fix(converse_transformation.py): fix linting error | 2025-03-15 19:33:17 -07:00
Krrish Dholakia | 8e7363acf5 | fix(converse_transformation.py): fix encoding model | 2025-03-15 14:03:37 -07:00
Krrish Dholakia | 424f51cc06 | fix(utils.py): Prevents final chunk w/ usage from being ignored | 2025-03-15 09:12:14 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/7112
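The final fix concerns streamed responses where token usage arrives on a trailing chunk with an empty `choices` list (as with OpenAI's `stream_options={"include_usage": True}`); a consumer that skips chunks without choices silently drops the usage. A small sketch of handling that case, with the chunk shape simplified to plain dicts.

```python
from typing import Any, Dict, Iterable, Optional, Tuple


def consume_stream(chunks: Iterable[Dict[str, Any]]) -> Tuple[str, Optional[Dict[str, int]]]:
    """Accumulate streamed text and keep the usage from the final chunk."""
    text_parts = []
    usage: Optional[Dict[str, int]] = None
    for chunk in chunks:
        # The final chunk may carry usage with an empty `choices` list;
        # it must not be skipped just because there is no delta to append.
        if chunk.get("usage") is not None:
            usage = chunk["usage"]
        for choice in chunk.get("choices", []):
            text_parts.append(choice.get("delta", {}).get("content") or "")
    return "".join(text_parts), usage


if __name__ == "__main__":
    stream = [
        {"choices": [{"delta": {"content": "Hel"}}], "usage": None},
        {"choices": [{"delta": {"content": "lo"}}], "usage": None},
        {"choices": [], "usage": {"prompt_tokens": 5, "completion_tokens": 2, "total_tokens": 7}},
    ]
    print(consume_stream(stream))
```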