Krrish Dholakia | a34cc2031d | fix(response_metadata.py): log the litellm_model_name (make it easier to track the model sent to the provider) | 2025-03-18 17:46:33 -07:00
Krrish Dholakia | 453729b6c9 | fix(litellm_logging.py): fix update | 2025-03-18 17:18:58 -07:00
Krrish Dholakia | 39ac9e3eca | fix(lowest_tpm_rpm_v2.py): fix updating limits | 2025-03-18 17:10:17 -07:00
Ishaan Jaff | 78a5dde31f | fix code qa | 2025-03-18 17:07:44 -07:00
Ishaan Jaff | bd122f631e | fix arize config | 2025-03-18 16:54:31 -07:00
Krrish Dholakia | cfe94c86cc | fix(lowest_tpm_rpm_routing_v2.py): fix deployment update to use correct keys | 2025-03-18 16:28:37 -07:00
Krrish Dholakia | 9bf6028f14 | fix(lowest_tpm_rpm_v2.py): update key to use model name | 2025-03-18 16:19:47 -07:00
Ishaan Jaff | de97cda445 | refactor create_litellm_proxy_request_started_spen | 2025-03-18 16:12:16 -07:00
Ishaan Jaff | f4b7b49e9a | fix arize logging | 2025-03-18 15:51:58 -07:00
Ishaan Jaff | 7a5726fc88 | fix - Arize - only log LLM I/O | 2025-03-18 15:50:38 -07:00
Ishaan Jaff | f8c49175ec | fix _get_span_processor | 2025-03-18 14:59:13 -07:00
Ishaan Jaff | b940c969fd | use _get_headers_dictionary | 2025-03-18 14:55:39 -07:00
Ishaan Jaff | 57d08531a1 | add arize_api_key to StandardCallbackDynamicParams | 2025-03-18 14:46:55 -07:00
Ishaan Jaff | 48663a0920 | use safe dumps for arize ai | 2025-03-18 14:30:00 -07:00
Ishaan Jaff | 959299c8ea | Merge pull request #9338 from nate-mar/arize-integration-fixes (Arize integration Fix) | 2025-03-18 14:28:25 -07:00
Ishaan Jaff | 55ea2370ba | Union[TranscriptionResponse, Coroutine[Any, Any, TranscriptionResponse]]: | 2025-03-18 14:23:14 -07:00
Ishaan Jaff | 7384d45ef0 | fix type errors on transcription azure | 2025-03-18 14:22:30 -07:00
Ishaan Jaff | b20a69f9fc | fix code quality | 2025-03-18 12:58:59 -07:00
Emerson Gomes | 1cc832f53e | Fix TTFT prioritization for streaming in LowestLatencyLoggingHandler | 2025-03-18 14:58:55 -05:00
Ishaan Jaff | dc3d7b3afc | test_azure_instruct | 2025-03-18 12:56:11 -07:00
Ishaan Jaff | 2cd49ef096 | fix test_ensure_initialize_azure_sdk_client_always_used | 2025-03-18 12:46:55 -07:00
Ishaan Jaff | b60178f534 | fix azure chat logic | 2025-03-18 12:42:24 -07:00
Ishaan Jaff | 80a5cfa01d | test_azure_embedding_max_retries_0 | 2025-03-18 12:35:34 -07:00
Ishaan Jaff | b316911120 | fix typing errors | 2025-03-18 12:31:44 -07:00
Ishaan Jaff | 842625a6f0 | :test_completion_azure_ad_toke | 2025-03-18 12:25:32 -07:00
Ishaan Jaff | d4b3082ca2 | fix azure embedding test | 2025-03-18 12:19:12 -07:00
Ishaan Jaff | 5e892a1e92 | ui new build | 2025-03-18 11:46:23 -07:00
Ishaan Jaff | 38e2dd00cc | fix amebedding issue on ssl azure | 2025-03-18 11:42:11 -07:00
Ishaan Jaff | dfd7a7d547 | fix linting error | 2025-03-18 11:38:31 -07:00
Ishaan Jaff | 3458c69eb0 | fix common utils | 2025-03-18 11:04:02 -07:00
Krrish Dholakia | 2c094c2356 | fix(get_llm_provider.py): Fixes https://github.com/BerriAI/litellm/issues/9291 | 2025-03-18 10:48:28 -07:00
Ishaan Jaff | c1e0cb136e | fix using azure openai clients | 2025-03-18 10:47:29 -07:00
Ishaan Jaff | e34be5a3b6 | use get_azure_openai_client | 2025-03-18 10:28:39 -07:00
Ishaan Jaff | a0c5fb81b8 | fix logic for intializing openai clients | 2025-03-18 10:23:30 -07:00
Ishaan Jaff | 0601768bb8 | use ssl on initialize_azure_sdk_client | 2025-03-18 10:14:51 -07:00
Ishaan Jaff | 34142a1b62 | _init_azure_client_for_cloudflare_ai_gateway | 2025-03-18 10:11:54 -07:00
Ishaan Jaff | edfbf21c39 | fix re-using azure openai client | 2025-03-18 10:06:56 -07:00
Ishaan Jaff | f2026ef907 | fix - correctly re-use azure openai client | 2025-03-18 09:51:28 -07:00
Ishaan Jaff | b74f3cb76c | _get_azure_openai_client | 2025-03-18 09:38:27 -07:00
Ishaan Jaff | 26be805ad3 | rename to _get_azure_openai_client | 2025-03-18 09:25:26 -07:00
Ishaan Jaff | 6e351136d7 | handle _get_async_http_client for OpenAI | 2025-03-18 08:56:08 -07:00
Nate Mar | 6b821fbbc0 | restore previous formatting | 2025-03-18 04:02:43 -07:00
Nate Mar | a1d188ba5e | Fix test and add comments | 2025-03-18 03:46:53 -07:00
Nate Mar | 434e262b8c | revert space_key change and add tests for arize integration | 2025-03-18 01:40:10 -07:00
Ishaan Jaff | b5c32c913b | Merge pull request #9331 from BerriAI/litellm_patch_disable_spend_updates ([Patch] - Allow disabling all spend updates / writes to DB) | 2025-03-17 22:22:09 -07:00
Ishaan Jaff | 0c1d00864d | fix ProxyUpdateSpend | 2025-03-17 22:17:56 -07:00
Krish Dholakia | bcbb88d802 | Merge pull request #9326 from andjsmi/main (Modify completion handler for SageMaker to use payload from `prepared_request`) | 2025-03-17 22:16:43 -07:00
Ishaan Jaff | d42f84dd4e | Merge pull request #9328 from BerriAI/litellm_control_internal_user_settings_ui ([UI] - Allow controlling default internal user settings on ui) | 2025-03-17 22:07:47 -07:00
Krish Dholakia | cd5024f3b1 | Merge pull request #9333 from BerriAI/litellm_dev_03_17_2025_p2 (fix(ollama/completions/transformation.py): pass prompt, untemplated o…) | 2025-03-17 21:48:30 -07:00
Ishaan Jaff | 5400615ce8 | Merge pull request #9329 from BerriAI/litellm_fix_reset_budget_job ([Bug fix] Reset Budget Job) | 2025-03-17 21:46:08 -07:00