| Author | Commit | Message | Date |
|---|---|---|---|
| Krrish Dholakia | 9d02d51a17 | docs(pass_through.md): cleanup docs | 2024-07-13 09:56:06 -07:00 |
| Ishaan Jaff | c7f74b0297 | test - test_completion_bedrock_invalid_role_exception | 2024-07-13 09:54:32 -07:00 |
| Ishaan Jaff | 23cccba070 | fix str from BadRequestError | 2024-07-13 09:54:32 -07:00 |
| Ishaan Jaff | 03933de775 | fix exception raised in factory.py | 2024-07-13 09:54:32 -07:00 |
| Krish Dholakia | 66cedccd6b | Merge pull request #4686 from BerriAI/litellm_custom_chat_endpoints; docs(pass_through.md): Creating custom chat endpoints on proxy | 2024-07-13 09:45:17 -07:00 |
| Ishaan Jaff | 8203174faf | ci/cd run again | 2024-07-12 19:08:59 -07:00 |
| Ishaan Jaff | 2758a9165b | test_async_response_azure | 2024-07-12 19:04:05 -07:00 |
| Krrish Dholakia | 0decc36bed | fix(factory.py): handle message content being a list instead of string; Fixes https://github.com/BerriAI/litellm/issues/4679 | 2024-07-12 19:00:39 -07:00 |
| Ishaan Jaff | 70b96d12e9 | Merge pull request #4685 from BerriAI/litellm_return_type_expired_key; [Fix] Proxy Return type=expire_key on expired Key errors | 2024-07-12 18:52:51 -07:00 |
| Krrish Dholakia | 667fd2b376 | docs(pass_through.md): add doc on creating custom chat endpoints on proxy; Allows developers to call proxy with anthropic sdk/boto3/etc. | 2024-07-12 18:48:40 -07:00 |
| Ishaan Jaff | 7918f41aca | test expired key raises correct exception | 2024-07-12 18:45:01 -07:00 |
| Ishaan Jaff | 57ced1d25e | raise ProxyErrorTypes.expired_key on expired key | 2024-07-12 18:41:39 -07:00 |
| Ishaan Jaff | 34ff0a7e57 | raise expired_key error | 2024-07-12 18:39:00 -07:00 |
| Ishaan Jaff | 92bf98b30f | Merge pull request #4684 from BerriAI/litellm_safe_memory_mode; [Feat] Allow safe memory mode | 2024-07-12 18:32:16 -07:00 |
| Ishaan Jaff | eb342bbe2c | Merge pull request #4683 from BerriAI/litellm_dealloc_in_mem_cache; [Fix] Mem Util - De Reference when removing from in-memory cache | 2024-07-12 18:31:56 -07:00 |
| Ishaan Jaff | 24918c5041 | Merge pull request #4682 from BerriAI/litellm_mem_leak_debug; show stack trace of 10 files taking up memory | 2024-07-12 18:31:41 -07:00 |
| Ishaan Jaff | cf5f11cc84 | Merge pull request #4681 from BerriAI/litellm_mem_usage; [Fix] Reduce Mem Usage - only set ttl for requests to 2 mins | 2024-07-12 18:31:19 -07:00 |
| Ishaan Jaff | 08efef5316 | feat add safe_memory_mode | 2024-07-12 18:18:39 -07:00 |
| Ishaan Jaff | 0099bf7859 | de-ref unused cache items | 2024-07-12 16:38:36 -07:00 |
| Krrish Dholakia | fd743aaefd | feat(opentelemetry.py): support logging call metadata to otel | 2024-07-12 15:41:34 -07:00 |
| Ishaan Jaff | 1a8fce8edb | show stack trace of 10 files taking up memory | 2024-07-12 15:33:03 -07:00 |
| Ishaan Jaff | 8c8dcdbdb1 | reduce ttl for update_request_status | 2024-07-12 15:14:54 -07:00 |
| Krrish Dholakia | b74095deca | bump: version 1.41.19 → 1.41.20 | 2024-07-12 09:54:26 -07:00 |
| Krrish Dholakia | f5b3cc6c02 | fix(litellm_logging.py): fix condition check; Fixes https://github.com/BerriAI/litellm/issues/4633 | 2024-07-12 09:22:19 -07:00 |
| Krrish Dholakia | 88eb25da5c | fix(bedrock_httpx.py): handle user error - malformed system prompt; if user passes in system prompt as a list of content blocks, handle that | 2024-07-12 08:28:50 -07:00 |
| Krish Dholakia | 905abab526 | Merge pull request #4673 from andreaponti5/fix-langfuse-prompt-logging; Fix: Langfuse prompt logging | 2024-07-12 07:47:24 -07:00 |
| Andrea Ponti | 496445481d | Rollback to metadata deepcopy | 2024-07-12 11:25:23 +02:00 |
| Krrish Dholakia | cff66d6151 | fix(proxy_server.py): fix linting errors | 2024-07-11 22:12:33 -07:00 |
| Krrish Dholakia | 5b40b62079 | bump: version 1.41.18 → 1.41.19 | 2024-07-11 22:04:05 -07:00 |
| Krish Dholakia | d72bcdbce3 | Merge pull request #4669 from BerriAI/litellm_logging_only_masking; Flag for PII masking on Logging only | 2024-07-11 22:03:37 -07:00 |
| Krish Dholakia | f0b8c0e7fb | Merge pull request #4588 from Manouchehri/vertex-seed-2973; feat(vertex_httpx.py): Add seed parameter | 2024-07-11 22:02:13 -07:00 |
| Krish Dholakia | 5ad341d0ff | Merge pull request #4607 from maamalama/helicone-cohere; Helicone Headers & Cohere support | 2024-07-11 22:01:44 -07:00 |
| Krish Dholakia | 1362a91d66 | Merge pull request #4612 from colegottdank/main; Update Helicone Docs | 2024-07-11 22:00:30 -07:00 |
| Krish Dholakia | 533d2dba0b | Merge pull request #4650 from msabramo/litellm_call_id_in_response; Proxy: Add `x-litellm-call-id` HTTP response header | 2024-07-11 21:57:03 -07:00 |
| Krish Dholakia | 72f1c9181d | Merge branch 'main' into litellm_call_id_in_response | 2024-07-11 21:54:49 -07:00 |
| Krish Dholakia | 79d6b69d1c | Merge pull request #4651 from msabramo/docs-logging-cleanup; Docs: Miscellaneous cleanup of `docs/my-website/docs/proxy/logging.md` | 2024-07-11 21:52:20 -07:00 |
| Krrish Dholakia | ff3f3ed933 | docs(health.md): add audio transcription to health check endpoints | 2024-07-11 21:46:15 -07:00 |
| Krrish Dholakia | 8d4e7f9967 | test(test_assistants.py): handle openai api instability | 2024-07-11 21:32:43 -07:00 |
| Ishaan Jaff | 4b8d33e6a8 | ci/cd run again | 2024-07-11 21:16:23 -07:00 |
| Krrish Dholakia | 28a07ee1a4 | docs(pii_masking.md): update docs | 2024-07-11 21:14:46 -07:00 |
| Ishaan Jaff | 8dbf0a634a | fix supports vision test | 2024-07-11 21:14:25 -07:00 |
| Krrish Dholakia | 5f5c925efd | fix(guardrails.py): fix guardrail item typing | 2024-07-11 21:01:56 -07:00 |
| Krrish Dholakia | b2e46086dd | fix(utils.py): fix recreating model response object when stream usage is true | 2024-07-11 21:01:12 -07:00 |
| Krrish Dholakia | 1300223f51 | test: fix test | 2024-07-11 20:09:24 -07:00 |
| Ishaan Jaff | e112379d2f | ci/cd run again | 2024-07-11 19:26:20 -07:00 |
| Ishaan Jaff | 1760ee5dba | bump: version 1.41.17 → 1.41.18 | 2024-07-11 19:13:21 -07:00 |
| Ishaan Jaff | aec468c0e9 | ui new build | 2024-07-11 19:13:08 -07:00 |
| Ishaan Jaff | d3f51a88e5 | Merge pull request #4666 from BerriAI/dependabot/pip/azure-identity-1.16.1; Bump azure-identity from 1.16.0 to 1.16.1 | 2024-07-11 19:10:57 -07:00 |
| Ishaan Jaff | a59cdeaa6d | Merge pull request #4668 from BerriAI/litellm_fix_setting_router_settings_ui; [UI-Fix] Setting router settings on ui | 2024-07-11 19:10:37 -07:00 |
| Ishaan Jaff | f65e1814f8 | Merge pull request #4667 from BerriAI/litell_ui_fix; [Fix] UI Allow setting custom model names for OpenAI compatible endpoints | 2024-07-11 19:10:29 -07:00 |