Krish Dholakia | a2414d09c1 | Merge pull request #8356 from hakasecurity/feature/aim-post-call: Aim Security post-call guardrails support | 2025-03-13 10:48:42 -07:00
Krish Dholakia | 2c011d9a93 | Merge pull request #9123 from omrishiv/8911-fix-model-encoding: Fixes bedrock modelId encoding for Inference Profiles | 2025-03-13 10:42:32 -07:00
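The modelId-encoding fix merged above (#9123) addresses a URL-escaping problem: Bedrock inference-profile identifiers are ARNs containing `/` and `:`, which corrupt the request path unless percent-encoded. A minimal sketch of the idea, with a hypothetical helper name and example ARN (not litellm's actual code):

```python
from urllib.parse import quote

def encode_model_id(model_id: str) -> str:
    """Percent-encode a Bedrock modelId so ARN characters like '/' and ':'
    survive as a single URL path segment (hypothetical helper)."""
    return quote(model_id, safe="")

# An inference-profile ARN contains '/' and ':', which a plain
# f"/model/{model_id}/invoke" path would mangle:
arn = "arn:aws:bedrock:us-east-1:123456789012:application-inference-profile/abc123"
print(encode_model_id(arn))
```

With `safe=""`, even `/` is encoded (as `%2F`), so the whole ARN stays one path segment.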
Tomer Bin | 6ebf11e4b3 | ruff format | 2025-03-13 08:58:29 +02:00
Tomer Bin | 32dedbe551 | CR | 2025-03-13 08:54:00 +02:00
Tomer Bin | 4a31b32a88 | Support post-call guards for stream and non-stream responses | 2025-03-13 08:53:54 +02:00
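The commit above covers both response shapes a post-call guard must handle. One common pattern (a sketch with hypothetical names and a toy policy, not the Aim Security integration itself) is to run the guard directly on a non-streaming response, and for streams to assemble the chunks, guard the full text once, then replay the chunks:

```python
import asyncio
from typing import AsyncIterator

BLOCKED = {"secret-token"}  # hypothetical policy: substrings the guard rejects

def run_guard(text: str) -> str:
    """Hypothetical post-call guard: raise if the response violates policy."""
    for needle in BLOCKED:
        if needle in text:
            raise ValueError(f"guardrail blocked response (matched {needle!r})")
    return text

async def guarded_stream(chunks: AsyncIterator[str]) -> AsyncIterator[str]:
    """Buffer the stream, run the guard once over the assembled text,
    then replay the chunks to the caller."""
    buffered = [c async for c in chunks]
    run_guard("".join(buffered))
    for c in buffered:
        yield c

async def demo() -> str:
    async def fake_stream():
        for c in ("Hello", ", ", "world"):
            yield c
    return "".join([c async for c in guarded_stream(fake_stream())])

print(asyncio.run(demo()))  # Hello, world
```

Buffering trades time-to-first-token for the guarantee that no violating content is ever emitted; scanning incrementally as chunks arrive is the usual refinement.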
Krrish Dholakia | be35c9a663 | docs(reasoning_content.md): clarify 'thinking' param support is from v1.63.0+ (Fixes https://github.com/BerriAI/litellm/issues/9171) | 2025-03-12 22:30:57 -07:00
Krish Dholakia | 58e5f3e0c9 | Merge pull request #9193 from youngchannelforyou/feat/gemini_response_status_code: (gemini) Handle HTTP 201 status code in Vertex AI response | 2025-03-12 22:24:01 -07:00
Krish Dholakia | cff1c1f7d8 | Merge branch 'main' into litellm_dev_03_12_2025_p1 | 2025-03-12 22:14:02 -07:00
Krish Dholakia | 1f2bddbc21 | Merge pull request #9186 from BerriAI/litellm_dev_03_12_2025_p2: feat: initial commit adding support for credentials on proxy ui | 2025-03-12 22:12:49 -07:00
sven | 8d053930e9 | (gemini) Handle HTTP 201 status code in Vertex AI response | 2025-03-13 13:44:38 +09:00
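The HTTP 201 fix above illustrates a narrow but recurring bug class: a strict `status_code == 200` success check misclassifies other 2xx responses as errors. A minimal sketch of the fix's idea (hypothetical helper, not the actual Vertex AI handler):

```python
def is_success(status_code: int) -> bool:
    """Treat the full 2xx range as success; per the commit message,
    Vertex AI can return 201 where a strict `== 200` check would fail."""
    return 200 <= status_code < 300

assert is_success(200) and is_success(201)
assert not is_success(404)
```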
Ishaan Jaff | 1d31e25816 | Merge pull request #9183 from BerriAI/litellm_router_responses_api_2: [Feat] - Add Responses API on LiteLLM Proxy | 2025-03-12 21:28:16 -07:00
Cole McIntosh | 501014414b | chore(init): update Azure default API version to 2025-02-01-preview | 2025-03-12 22:02:48 -06:00
Krrish Dholakia | 3dae3fa571 | feat(add_model_tab.tsx): working e2e add model flow on UI | 2025-03-12 20:42:40 -07:00
Ishaan Jaff | db1f48bbfb | responses_api | 2025-03-12 20:38:05 -07:00
Cole McIntosh | 0ea102f9bb | chore(init): update Azure default API version to 2024-12-01-preview | 2025-03-12 21:33:49 -06:00
Krrish Dholakia | d604f52884 | feat: refactor add models tab on UI to enable setting credentials | 2025-03-12 20:32:01 -07:00
Ishaan Jaff | da2669154a | _update_kwargs_with_default_litellm_params | 2025-03-12 19:26:12 -07:00
Ishaan Jaff | bcf8ecc9fc | _update_kwargs_with_default_litellm_params | 2025-03-12 19:10:19 -07:00
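The `_update_kwargs_with_default_litellm_params` commits above revolve around one pattern: merging router-level default params into per-request kwargs without clobbering what the caller passed explicitly. A sketch under that assumption (hypothetical function, not litellm's exact signature):

```python
from typing import Any, Dict

def update_kwargs_with_defaults(kwargs: Dict[str, Any],
                                defaults: Dict[str, Any]) -> Dict[str, Any]:
    """Fill in router-level default litellm params; explicit per-request
    kwargs win on any conflict (hypothetical reconstruction)."""
    merged = dict(kwargs)
    for key, value in defaults.items():
        merged.setdefault(key, value)  # only fills keys the caller omitted
    return merged

defaults = {"timeout": 600, "metadata": {"team": "infra"}}
print(update_kwargs_with_defaults({"timeout": 30}, defaults))
# {'timeout': 30, 'metadata': {'team': 'infra'}}
```

`setdefault` makes the precedence explicit: the caller's `timeout=30` survives, while the router's `metadata` default is filled in.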
Krrish Dholakia | 52926408cd | feat(credential_accessor.py): fix upserting new credentials via accessor | 2025-03-12 19:03:37 -07:00
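The credential_accessor fix above concerns upsert semantics: updating an existing credential should replace it in place, not append a duplicate. A generic sketch of that pattern (hypothetical store shape and field names, not litellm's accessor):

```python
from typing import Any, Dict, List

def upsert_credential(store: List[Dict[str, Any]], cred: Dict[str, Any]) -> None:
    """Replace an existing credential with the same name in place,
    otherwise append it. Insert-only accessors are the bug class here:
    every update silently grows the list with stale duplicates."""
    for i, existing in enumerate(store):
        if existing["credential_name"] == cred["credential_name"]:
            store[i] = cred
            return
    store.append(cred)

store: List[Dict[str, Any]] = []
upsert_credential(store, {"credential_name": "azure-prod", "api_key": "old"})
upsert_credential(store, {"credential_name": "azure-prod", "api_key": "new"})
print(len(store), store[0]["api_key"])  # 1 new
```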
Ishaan Jaff | 31a470b7f3 | get_custom_headers | 2025-03-12 18:57:41 -07:00
Ishaan Jaff | e7e44694cc | fix response api handling | 2025-03-12 18:49:17 -07:00
Ishaan Jaff | be25b298e6 | fix async_moderation_hook | 2025-03-12 18:45:54 -07:00
Ishaan Jaff | ed68ad7775 | fix linting | 2025-03-12 18:44:51 -07:00
Ishaan Jaff | a28ac57233 | fix linting error | 2025-03-12 18:42:42 -07:00
Ishaan Jaff | 9e821c915c | _update_kwargs_with_default_litellm_params | 2025-03-12 18:33:56 -07:00
Ishaan Jaff | 28bcbe0fb1 | LITELLM_METADATA_ROUTES | 2025-03-12 18:20:07 -07:00
Ishaan Jaff | c82ef41dc4 | test_openai_responses_litellm_router_no_metadata | 2025-03-12 18:18:07 -07:00
Ishaan Jaff | 8689a25396 | _handle_llm_api_exception | 2025-03-12 17:47:11 -07:00
Ishaan Jaff | 2ca2ecacb5 | fix responses_api | 2025-03-12 17:39:07 -07:00
Ishaan Jaff | d4ed384984 | working spend tracking + logging for response api | 2025-03-12 17:29:25 -07:00
Krrish Dholakia | c039a1e948 | feat: initial commit adding support for credentials on proxy ui | 2025-03-12 17:28:18 -07:00
Ishaan Jaff | 98718f6f09 | working responses_api | 2025-03-12 17:26:12 -07:00
Ishaan Jaff | 7546dfde41 | use correct get custom headers | 2025-03-12 17:16:51 -07:00
Ishaan Jaff | f2041762e6 | add responses_api | 2025-03-12 17:08:16 -07:00
Ishaan Jaff | b2d9935567 | use ProxyBaseLLMRequestProcessing | 2025-03-12 16:54:33 -07:00
Krrish Dholakia | 1250ae6645 | build(model_prices_and_context_window.json): add data zone pricing | 2025-03-12 16:40:11 -07:00
Krrish Dholakia | 84b4d1509a | build(model_prices_and_context_window.json): add gpt-4o-mini-realtime-preview azure model cost | 2025-03-12 16:27:27 -07:00
Krrish Dholakia | c5256d3da6 | build(model_prices_and_context_window.json): add o1 data zone pricing | 2025-03-12 16:14:12 -07:00
Ishaan Jaff | d808fa3c23 | test_openai_responses_litellm_router | 2025-03-12 16:13:48 -07:00
Krrish Dholakia | a14ac0505c | build(model_prices_and_context_window.json): add azure o1 eu/us pricing | 2025-03-12 16:09:49 -07:00
Krrish Dholakia | 1606e72440 | build(model_prices_and_context_window.json): add azure eu/us data zone pricing | 2025-03-12 16:04:04 -07:00
Ishaan Jaff | 89d30d39f6 | factory_function | 2025-03-12 15:27:34 -07:00
Ishaan Jaff | 32688df0c2 | _generic_api_call_with_fallbacks | 2025-03-12 15:26:37 -07:00
Krrish Dholakia | 88e9edf7db | refactor: update method signature | 2025-03-12 15:23:38 -07:00
Krrish Dholakia | 738c0b873d | fix(azure_ai/transformation.py): support passing api version to azure ai services endpoint (Fixes https://github.com/BerriAI/litellm/issues/7275) | 2025-03-12 15:16:42 -07:00
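The azure_ai fix above is about threading an `api-version` value through to the request URL, since Azure endpoints select API behavior via that query parameter. A sketch of the URL side of that change (hypothetical helper and example endpoint, not the transformation.py code):

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_api_version(url: str, api_version: str) -> str:
    """Append an api-version query parameter to an Azure AI services
    endpoint, preserving any existing query string and never
    duplicating the parameter (hypothetical sketch)."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.setdefault("api-version", api_version)
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_api_version(
    "https://example.services.ai.azure.com/models/chat/completions",
    "2024-05-01-preview",
))
```

`setdefault` means a caller-supplied `api-version` already present in the URL wins over the injected default.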
Krish Dholakia | 2d957a0ed9 | Merge branch 'main' into litellm_dev_03_10_2025_p3 | 2025-03-12 14:56:01 -07:00
Ishaan Jaff | 2460f3cbab | test_validate_environment | 2025-03-12 12:57:40 -07:00
Krrish Dholakia | b8d1166e0c | fix(llm_caching_handler.py): handle no current event loop error | 2025-03-12 12:29:25 -07:00
Ishaan Jaff | 39d391d8e7 | Optional[Dict] | 2025-03-12 12:29:13 -07:00
Krrish Dholakia | 7a8165eaba | fix(llm_caching_handler.py): Add event loop to llm client cache info (Fixes https://github.com/BerriAI/litellm/issues/7667) | 2025-03-12 12:24:24 -07:00
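The two llm_caching_handler commits above (b8d1166e0c and 7a8165eaba) pair up: keying cached async clients by event loop avoids reusing a client bound to a dead loop, but the lookup must tolerate sync callers where no loop is running. A sketch of that idea (hypothetical helper, not the handler's actual code):

```python
import asyncio
from typing import Optional

def get_loop_key() -> Optional[int]:
    """Key cached async clients by the running event loop, but fall back
    to None for sync callers, where asyncio.get_running_loop() raises
    RuntimeError ("no running event loop")."""
    try:
        return id(asyncio.get_running_loop())
    except RuntimeError:
        return None

assert get_loop_key() is None  # sync context: no loop, no crash

async def main() -> Optional[int]:
    return get_loop_key()

assert isinstance(asyncio.run(main()), int)  # async context: loop id
```

Using `get_running_loop()` inside a try/except is the documented replacement for the deprecated `get_event_loop()` pattern that raised the "no current event loop" error in the first place.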