c44fe8bd90 | Ishaan Jaff | 2025-03-20 21:54:43 -07:00
    Merge pull request #9419 from BerriAI/litellm_streaming_o1_pro
    [Feat] OpenAI o1-pro Responses API streaming support

647187db11 | Krrish Dholakia | 2025-03-20 15:07:14 -07:00
    docs(release_cycle.md): clarify release cycle for stable releases on docs

15048de5e2 | Ishaan Jaff | 2025-03-20 14:50:00 -07:00
    test_prepare_fake_stream_request

c82b17013f | Ishaan Jaff | 2025-03-20 14:42:06 -07:00
    Merge pull request #9422 from BerriAI/dependabot/pip/cookbook/litellm-ollama-docker-image/litellm-1.61.15
    build(deps): bump litellm from 1.55.3 to 1.61.15 in /cookbook/litellm-ollama-docker-image

46d68a61c8 | Krrish Dholakia | 2025-03-20 14:37:58 -07:00
    fix: fix testing

8f35bdffb0 | dependabot[bot] | 2025-03-20 21:03:29 +00:00
    build(deps): bump litellm in /cookbook/litellm-ollama-docker-image
    Bumps [litellm](https://github.com/BerriAI/litellm) from 1.55.3 to 1.61.15.
    - [Release notes](https://github.com/BerriAI/litellm/releases)
    - [Commits](https://github.com/BerriAI/litellm/commits)
    ---
    updated-dependencies:
    - dependency-name: litellm
      dependency-type: direct:production
    ...
    Signed-off-by: dependabot[bot] <support@github.com>

ab385848c1 | Krish Dholakia | 2025-03-20 14:00:51 -07:00
    Merge pull request #9260 from Grizzly-jobs/fix/voyage-ai-token-usage-tracking
    fix: VoyageAI `prompt_token` always empty

45b9498608 | Krish Dholakia | 2025-03-20 13:58:45 -07:00
    Merge pull request #9290 from jollyolliel/patch-2
    Update perplexity.md

1829cc2042 | Ishaan Jaff | 2025-03-20 13:57:35 -07:00
    fix code quality checks

c53a8fe7bb | Krish Dholakia | 2025-03-20 13:56:49 -07:00
    Merge pull request #9325 from Shiti/main
    docs(vertexai): fix typo in required env variables

706bcf4432 | Krish Dholakia | 2025-03-20 13:55:33 -07:00
    Merge pull request #9366 from JamesGuthrie/jg/vertex-output-dimensionality
    fix: VertexAI outputDimensionality configuration

4b4a0b2612 | Ishaan Jaff | 2025-03-20 13:52:30 -07:00
    supports_native_streaming

76c3957140 | Krrish Dholakia | 2025-03-20 13:52:28 -07:00
    fix(main.py): fix OR import

cb4155fb16 | Krish Dholakia | 2025-03-20 13:52:03 -07:00
    Merge pull request #9369 from graysonchen/feature/add_openrouter_api_base
    feat: Add support for custom OPENROUTER_API_BASE via get_secret in co…

0b399c391c | Krish Dholakia | 2025-03-20 13:46:44 -07:00
    Merge pull request #9402 from BerriAI/litellm_openai_file_types
    fix(types/utils.py): support openai 'file' message type

768dc79df9 | Ishaan Jaff | 2025-03-20 13:35:29 -07:00
    Merge pull request #9332 from SrilakshmiC/patch-1
    docs: Include Phoenix Page into sidebar under observability

0352559c66 | Ishaan Jaff | 2025-03-20 13:34:57 -07:00
    supports_native_streaming

b04cf226aa | Ishaan Jaff | 2025-03-20 13:04:49 -07:00
    test_openai_o1_pro_response_api_streaming

a29587e178 | Ishaan Jaff | 2025-03-20 12:30:09 -07:00
    MockResponsesAPIStreamingIterator

55115bf520 | Ishaan Jaff | 2025-03-20 12:28:55 -07:00
    transform_responses_api_request

af8e917eb0 | Ishaan Jaff | 2025-03-20 12:25:58 -07:00
    MockResponsesAPIStreamingIterator

c64f834272 | Ishaan Jaff | 2025-03-20 12:06:36 -07:00
    add AsyncMockResponsesAPIStreamingIterator

0cd671785d | Ishaan Jaff | 2025-03-20 09:55:59 -07:00
    add fake_stream to llm http handler

bc174adcd0 | Ishaan Jaff | 2025-03-20 09:54:26 -07:00
    add should_fake_stream

c58aff01e0 | Ishaan Jaff | 2025-03-20 09:26:05 -07:00
    bump to openai==1.67.0

d915ab3f07 | Ishaan Jaff | 2025-03-20 09:18:38 -07:00
    test_openai_o1_pro_response_api

7fee847ffc | Ishaan Jaff | 2025-03-20 09:14:59 -07:00
    test_openai_o1_pro_incomplete_response

b868509ddb | Ishaan Jaff | 2025-03-20 09:06:13 -07:00
    bump to openai==1.67.0

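The commits above (`add fake_stream to llm http handler`, `add should_fake_stream`, `MockResponsesAPIStreamingIterator`) implement "fake streaming": when a model such as o1-pro does not stream natively, the handler fetches the complete response and replays it as chunks so callers using `stream=True` still receive an iterator. A minimal sketch of the idea, with hypothetical names (this is not LiteLLM's actual implementation):

```python
# Sketch of the fake-streaming pattern: wrap a complete, non-streaming
# response and yield it in fixed-size chunks. Names are illustrative,
# not LiteLLM's real classes.

class MockResponseStream:
    """Replay a complete response text as a chunked iterator."""

    def __init__(self, full_text: str, chunk_size: int = 8):
        self.full_text = full_text
        self.chunk_size = chunk_size

    def __iter__(self):
        # Slice the full response into chunk_size pieces, preserving order.
        for i in range(0, len(self.full_text), self.chunk_size):
            yield self.full_text[i : i + self.chunk_size]


def should_fake_stream(model: str, native_streaming_models: set) -> bool:
    # Fall back to fake streaming only when the model lacks
    # native streaming support.
    return model not in native_streaming_models


if __name__ == "__main__":
    stream = MockResponseStream("The answer is 42.", chunk_size=4)
    print("".join(stream))  # chunks reassemble the original response
```

Joining the yielded chunks reproduces the original response byte-for-byte, which is what lets the streaming and non-streaming code paths share one consumer.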
d9abbdc6b5 | Krrish Dholakia | 2025-03-19 23:28:28 -07:00
    docs(response_api.md): update docs to use new o1-pro model example

8ef9129556 | Krrish Dholakia | 2025-03-19 23:13:51 -07:00
    fix(types/utils.py): support openai 'file' message type
    Closes https://github.com/BerriAI/litellm/issues/9365

7cf7e80c5a | Krish Dholakia | 2025-03-19 22:58:32 -07:00
    Merge pull request #9395 from BerriAI/litellm_ui_fixes_03_19_2025
    fix(internal_user_endpoints.py): re-introduce upsert on user not found

fe24b9d90b | Krrish Dholakia | 2025-03-19 22:57:49 -07:00
    feat(azure/gpt_transformation.py): add azure audio model support
    Closes https://github.com/BerriAI/litellm/issues/6305

ef1bde153e | Ishaan Jaff | 2025-03-19 21:19:50 -07:00
    Merge pull request #9399 from BerriAI/litellm_icons_mar_19
    [UI Improvement] Use local icons for model providers instead of downloading them

8bb47cb97c | Ishaan Jaff | 2025-03-19 21:14:34 -07:00
    Merge pull request #9374 from azdolinski/litellm_fix_icons
    Litellm fix icons

9203910ab6 | Ishaan Jaff | 2025-03-19 21:08:19 -07:00
    fix import hashlib

1bd7443c25 | Ishaan Jaff | 2025-03-19 21:06:41 -07:00
    Merge pull request #9384 from BerriAI/litellm_prompt_management_custom
    [Feat] - Allow building custom prompt management integration

23a09f1359 | Ishaan Jaff | 2025-03-19 21:03:24 -07:00
    Merge pull request #9341 from BerriAI/litellm_fix_ssl_verify
    [Bug Fix] - Azure OpenAI - ensure SSL verification runs

247e4d09ee | Ishaan Jaff | 2025-03-19 21:03:06 -07:00
    Merge branch 'main' into litellm_fix_ssl_verify

dae5c24b55 | Ishaan Jaff | 2025-03-19 20:50:15 -07:00
    get_chat_completion_prompt

90366b5ca1 | Ishaan Jaff | 2025-03-19 20:36:42 -07:00
    docs custom prompt management

4f6c998de9 | Ishaan Jaff | 2025-03-19 20:35:21 -07:00
    docs custom prompt management

17e7718efc | Ishaan Jaff | 2025-03-19 20:25:09 -07:00
    docs custom prompt management

056d34742f | Krrish Dholakia | 2025-03-19 19:53:19 -07:00
    build(model_prices_and_context_window.json): fix native streaming flag

3200a8f983 | Krish Dholakia | 2025-03-19 19:52:09 -07:00
    Merge pull request #9397 from BerriAI/litellm_add_o1_pro
    build(model_prices_and_context_window.json): add o1-pro pricing

78781a0a55 | Krrish Dholakia | 2025-03-19 19:35:35 -07:00
    build(model_prices_and_context_window.json): add o1-pro pricing

07b1e6aa1c | Krrish Dholakia | 2025-03-19 19:28:11 -07:00
    fix(internal_user_endpoints.py): re-introduce upsert on user not found
    Fixes https://github.com/BerriAI/litellm/issues/9243

220aa66f98 | Krish Dholakia | 2025-03-19 18:11:35 -07:00
    Merge pull request #9388 from BerriAI/revert-9387-fix_latency_redis
    Revert "Fix latency redis"

0de015b4c5 | Krish Dholakia | 2025-03-19 18:11:22 -07:00
    Revert "Fix latency redis"

57503954fe | Krish Dholakia | 2025-03-19 18:11:18 -07:00
    Merge pull request #9387 from emerzon/fix_latency_redis
    Fix latency redis

c91aafeed8 | Emerson Gomes | 2025-03-19 19:56:57 -05:00
    Handle empty valid_deployments in LowestLatencyLoggingHandler