Author | Commit | Message | Date
Ishaan Jaff | 90366b5ca1 | docs custom prompt management | 2025-03-19 20:36:42 -07:00
Ishaan Jaff | 4f6c998de9 | docs custom prompt management | 2025-03-19 20:35:21 -07:00
Ishaan Jaff | 17e7718efc | docs custom prompt management | 2025-03-19 20:25:09 -07:00
Krrish Dholakia | 056d34742f | build(model_prices_and_context_window.json): fix native streaming flag | 2025-03-19 19:53:19 -07:00
Krish Dholakia | 3200a8f983 | Merge pull request #9397 from BerriAI/litellm_add_o1_pro | 2025-03-19 19:52:09 -07:00
    build(model_prices_and_context_window.json): add o1-pro pricing
Krrish Dholakia | 78781a0a55 | build(model_prices_and_context_window.json): add o1-pro pricing | 2025-03-19 19:35:35 -07:00
Krrish Dholakia | 07b1e6aa1c | fix(internal_user_endpoints.py): re-introduce upsert on user not found | 2025-03-19 19:28:11 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/9243
Krish Dholakia | 220aa66f98 | Merge pull request #9388 from BerriAI/revert-9387-fix_latency_redis | 2025-03-19 18:11:35 -07:00
    Revert "Fix latency redis"
Krish Dholakia | 0de015b4c5 | Revert "Fix latency redis" | 2025-03-19 18:11:22 -07:00
Krish Dholakia | 57503954fe | Merge pull request #9387 from emerzon/fix_latency_redis | 2025-03-19 18:11:18 -07:00
    Fix latency redis
Emerson Gomes | c91aafeed8 | Handle empty valid_deployments in LowestLatencyLoggingHandler | 2025-03-19 19:56:57 -05:00
Ishaan Jaff | e9d9c526fa | docs custom prompt management | 2025-03-19 17:49:00 -07:00
Ishaan Jaff | 30fdd934a4 | TestCustomPromptManagement | 2025-03-19 17:40:15 -07:00
Ishaan Jaff | a49bd460e0 | get_custom_logger_for_prompt_management | 2025-03-19 17:38:24 -07:00
Ishaan Jaff | 805542005a | CustomPromptManagement | 2025-03-19 16:38:59 -07:00
Krish Dholakia | 08e4614842 | Merge pull request #9382 from BerriAI/litellm_dev_03_19_2025_p2 | 2025-03-19 16:37:47 -07:00
    Support 'prisma migrate' for db schema changes
Ishaan Jaff | 730f61f476 | get_custom_loggers_for_type | 2025-03-19 16:36:41 -07:00
Ishaan Jaff | f3d850c79a | example X42PromptManagement | 2025-03-19 16:24:41 -07:00
Ishaan Jaff | 5d3bb86f07 | define CustomPromptManagement | 2025-03-19 16:22:23 -07:00
Ishaan Jaff | 6b4b93caa0 | docs prompt mgtm | 2025-03-19 15:51:26 -07:00
Krish Dholakia | 9432d1a865 | Merge pull request #9357 from BerriAI/litellm_dev_03_18_2025_p2 | 2025-03-19 15:45:10 -07:00
    fix(lowest_tpm_rpm_v2.py): support batch writing increments to redis
Krrish Dholakia | 61cd02200c | fix: remove unused import | 2025-03-19 15:33:04 -07:00
Ishaan Jaff | 973ad94550 | docs prompt management | 2025-03-19 14:37:32 -07:00
Artur Zdolinski | f3e782e3aa | remove linter changes to match old code style | 2025-03-19 22:35:14 +01:00
Ishaan Jaff | 560518cc44 | custom Prompt Management - Overview | 2025-03-19 14:31:56 -07:00
Krrish Dholakia | 665fdfc788 | feat(prisma_client.py): initial commit add prisma migration support to proxy | 2025-03-19 14:26:59 -07:00
Krish Dholakia | 0d5abefe5f | Merge pull request #9376 from BerriAI/litellm_fix_cred_check | 2025-03-19 12:47:44 -07:00
    Feature flag checking LiteLLM_CredentialsTable
Krrish Dholakia | 8464f06df7 | docs(config_settings.md): cleanup | 2025-03-19 12:06:07 -07:00
Krrish Dholakia | 041d5391eb | test(test_proxy_server.py): make test work on ci/cd | 2025-03-19 12:01:37 -07:00
Krrish Dholakia | 79db96dd13 | docs(config_settings.md): update env var information | 2025-03-19 11:47:14 -07:00
Krrish Dholakia | 858da57b3c | test(test_proxy_server.py): add unit test to ensure get credentials only called behind feature flag | 2025-03-19 11:44:00 -07:00
Krrish Dholakia | c13f6514bf | fix(proxy_server.py): move db credential check to only run behind feature flag | 2025-03-19 11:35:14 -07:00
Ishaan Jaff | 08c362e1b1 | Merge pull request #9375 from colesmcintosh/fix-litellm-proxy-docs | 2025-03-19 11:14:19 -07:00
    docs(litellm_proxy): correct parameter assignment in litellm proxy docs
Cole McIntosh | c5687ce94b | docs(litellm_proxy): update parameter assignment for streaming call | 2025-03-19 11:13:17 -06:00
Cole McIntosh | e2caf21131 | docs(litellm_proxy): correct parameter assignment in sample litellm proxy call docs | 2025-03-19 11:08:34 -06:00
Krrish Dholakia | edc38280ea | docs(enterprise.md): clarify sla's | 2025-03-19 09:09:02 -07:00
azdolinski | 95c762e572 | sync_latest_changes | 2025-03-19 14:47:34 +00:00
Artur Zdolinski | 3be8e07d46 | Merge branch 'BerriAI:main' into litellm_fix_icons | 2025-03-19 15:40:49 +01:00
azdolinski | 044508e075 | set_local_icons | 2025-03-19 14:37:57 +00:00
Grayson Chen | f3a0261bb4 | feat: Add support for custom OPENROUTER_API_BASE via get_secret in completion function | 2025-03-19 21:09:03 +08:00
James Guthrie | 437dbe7246 | fix: VertexAI outputDimensionality configuration | 2025-03-19 11:07:36 +01:00
    VertexAI's API documentation [1] is an absolute mess. In it, they describe the parameter to configure output dimensionality as `output_dimensionality`. In the API example, they switch to the camel-case `outputDimensionality`, which is the correct variant.
    [1]: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/text-embeddings-api#generative-ai-get-text-embedding-drest
Krish Dholakia | 01c6cbd270 | Merge pull request #9363 from BerriAI/litellm_dev_03_18_2025_p3 | 2025-03-18 23:36:12 -07:00
    fix(common_utils.py): handle cris only model
Krrish Dholakia | 9adad381b4 | fix(common_utils.py): handle cris only model | 2025-03-18 23:35:43 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/9161#issuecomment-2734905153
Ishaan Jaff | e32aee9124 | Merge pull request #9353 from BerriAI/litellm_arize_dynamic_logging | 2025-03-18 23:35:28 -07:00
    [Feat] - API - Allow using dynamic Arize AI Spaces on LiteLLM
Krish Dholakia | 6347b694ee | Merge pull request #9335 from BerriAI/litellm_dev_03_17_2025_p3 | 2025-03-18 23:24:07 -07:00
    Contributor PR: Fix sagemaker too little data for content error
Ishaan Jaff | 8690873488 | test_arize_dynamic_params | 2025-03-18 23:22:55 -07:00
Ishaan Jaff | 8568caf532 | test_arize_dynamic_params | 2025-03-18 23:18:07 -07:00
Krrish Dholakia | 2c69b5d221 | fix(base_routing_strategy.py): refactor for cleaner code | 2025-03-18 22:57:05 -07:00
Krish Dholakia | c83e19e8c7 | Merge pull request #9361 from BerriAI/litellm_fix_azure_gpt_4o_pricing | 2025-03-18 22:42:38 -07:00
    build(model_prices_and_context_window.json): fix azure gpt-4o pricing
Krrish Dholakia | 084e8c425c | refactor(base_routing_strategy.py): fix function names | 2025-03-18 22:41:02 -07:00