Commit graph

3617 commits

Author | SHA1 | Message | Date
Ishaan Jaff
c1a9881d5c
Merge pull request #4697 from BerriAI/litellm_fix_sso_bug
[Fix] Bug - Clear user_id from cache when /user/update is called
2024-07-13 14:39:47 -07:00
Krrish Dholakia
7e769f3b89 fix: fix linting errors 2024-07-13 14:39:42 -07:00
Krrish Dholakia
55e153556a test(test_pass_through_endpoints.py): add test for rpm limit support 2024-07-13 13:49:20 -07:00
Ishaan Jaff
bba748eaf4 fix test rules 2024-07-13 13:23:23 -07:00
Ishaan Jaff
56b69eba18 test updating user role 2024-07-13 13:13:40 -07:00
Krrish Dholakia
6b78e39600 feat(guardrails.py): allow setting logging_only in guardrails_config for presidio pii masking integration 2024-07-13 12:22:17 -07:00
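The commit above adds a `logging_only` flag so Presidio PII masking can be applied only to what gets logged, not to the request sent to the model. Below is a minimal, library-agnostic sketch of that idea, not litellm's actual implementation; the regex-based masker stands in for Presidio and all names are illustrative:

```python
import copy
import re

def mask_pii(text: str) -> str:
    # Stand-in for a Presidio analyzer/anonymizer: redact email-like strings.
    return re.sub(r"\S+@\S+\.\S+", "<EMAIL>", text)

def build_logged_payload(messages: list[dict]) -> list[dict]:
    """Return the copy of the request that should be written to logs.

    In "logging only" mode the original messages are left untouched for the
    LLM call; only this logged copy is masked.
    """
    logged = copy.deepcopy(messages)
    for m in logged:
        if isinstance(m.get("content"), str):
            m["content"] = mask_pii(m["content"])
    return logged

messages = [{"role": "user", "content": "Contact me at jane.doe@example.com"}]
print(build_logged_payload(messages))  # masked copy for the logger
print(messages)                        # original payload still intact for the model
```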
Krrish Dholakia
caa01d20cb build: re-run ci/cd 2024-07-13 11:41:35 -07:00
Ishaan Jaff
d0dbc0742b fix exception raised in factory.py 2024-07-13 09:55:04 -07:00
Ishaan Jaff
c7f74b0297 test - test_completion_bedrock_invalid_role_exception 2024-07-13 09:54:32 -07:00
Ishaan Jaff
03933de775 fix exception raised in factory.py 2024-07-13 09:54:32 -07:00
Krish Dholakia
66cedccd6b
Merge pull request #4686 from BerriAI/litellm_custom_chat_endpoints
docs(pass_through.md): Creating custom chat endpoints on proxy
2024-07-13 09:45:17 -07:00
Ishaan Jaff
8203174faf ci/cd run again 2024-07-12 19:08:59 -07:00
Ishaan Jaff
2758a9165b test_async_response_azure 2024-07-12 19:04:05 -07:00
Krrish Dholakia
0decc36bed fix(factory.py): handle message content being a list instead of string
Fixes https://github.com/BerriAI/litellm/issues/4679
2024-07-12 19:00:39 -07:00
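The fix above is about OpenAI-style messages whose `content` is a list of content blocks rather than a plain string. Both shapes below are valid chat inputs; the model name is arbitrary and the example assumes credentials are set in the environment:

```python
import litellm

# content as a plain string (assumes e.g. OPENAI_API_KEY is set)
resp = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)

# content as a list of content blocks - the shape the fix above handles
resp = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": [{"type": "text", "text": "Hello"}]}],
)
```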
Krrish Dholakia
667fd2b376 docs(pass_through.md): add doc on creating custom chat endpoints on proxy
Allows developers to call proxy with anthropic sdk/boto3/etc.
2024-07-12 18:48:40 -07:00
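The pass-through doc referenced above is about exposing custom chat endpoints on the proxy so existing SDKs can point at it. A rough sketch of calling a litellm proxy with the Anthropic SDK; the base URL, port, key, and model name are placeholders, and the exact route depends on how the pass-through endpoint is configured:

```python
import anthropic

# Point the Anthropic SDK at the proxy instead of api.anthropic.com.
client = anthropic.Anthropic(
    base_url="http://0.0.0.0:4000",   # assumed local litellm proxy address
    api_key="sk-1234",                # assumed proxy virtual key
)

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello from the proxy"}],
)
print(message.content)
```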
Ishaan Jaff
7918f41aca test expired key raises correct exception 2024-07-12 18:45:01 -07:00
Krrish Dholakia
f5b3cc6c02 fix(litellm_logging.py): fix condition check
Fixes https://github.com/BerriAI/litellm/issues/4633
2024-07-12 09:22:19 -07:00
Krrish Dholakia
88eb25da5c fix(bedrock_httpx.py): handle user error - malformed system prompt
if user passes in system prompt as a list of content blocks, handle that
2024-07-12 08:28:50 -07:00
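The fix above covers callers who send the system prompt as a list of content blocks (as the Anthropic API allows) to a Bedrock model that expects a single string. A simplified illustration of that normalization, independent of litellm's internals:

```python
def flatten_system_prompt(system) -> str:
    """Accept either a plain string or a list of {"type": "text", ...} blocks."""
    if isinstance(system, str):
        return system
    if isinstance(system, list):
        return "\n".join(
            block.get("text", "") for block in system if block.get("type") == "text"
        )
    raise TypeError(f"Unsupported system prompt type: {type(system)}")

print(flatten_system_prompt([{"type": "text", "text": "You are a helpful assistant."}]))
```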
Krish Dholakia
d72bcdbce3
Merge pull request #4669 from BerriAI/litellm_logging_only_masking
Flag for PII masking on Logging only
2024-07-11 22:03:37 -07:00
Krish Dholakia
5ad341d0ff
Merge pull request #4607 from maamalama/helicone-cohere
Helicone Headers & Cohere support
2024-07-11 22:01:44 -07:00
Krrish Dholakia
8d4e7f9967 test(test_assistants.py): handle openai api instability 2024-07-11 21:32:43 -07:00
Ishaan Jaff
4b8d33e6a8 ci/cd run again 2024-07-11 21:16:23 -07:00
Krrish Dholakia
b2e46086dd fix(utils.py): fix recreating model response object when stream usage is true 2024-07-11 21:01:12 -07:00
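The fix above concerns rebuilding the model response object when usage is requested on a streamed call. A minimal example of asking for stream usage via litellm; it assumes an OpenAI-style model with credentials in the environment and the `stream_options` passthrough:

```python
import litellm

chunks = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hi"}],
    stream=True,
    stream_options={"include_usage": True},  # request a final usage chunk
)

for chunk in chunks:
    print(chunk)  # the last chunk should carry the usage block
```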
Krrish Dholakia
1300223f51 test: fix test 2024-07-11 20:09:24 -07:00
Ishaan Jaff
e112379d2f ci/cd run again 2024-07-11 19:26:20 -07:00
Krrish Dholakia
9d918d2ac7 fix(presidio_pii_masking.py): support logging_only pii masking 2024-07-11 18:04:12 -07:00
Krrish Dholakia
9deb9b4e3f feat(guardrails): Flag for PII Masking on Logging
Fixes https://github.com/BerriAI/litellm/issues/4580
2024-07-11 16:09:34 -07:00
Ishaan Jaff
8bf50ac5db
Merge pull request #4661 from BerriAI/litellm_fix_mh
[Fix] Model Hub - Show supports vision correctly
2024-07-11 15:03:37 -07:00
Ishaan Jaff
46493303ed test get model info for gemini/gemini-1.5-flash 2024-07-11 13:04:18 -07:00
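The Model Hub fix merged above is about reporting `supports_vision` correctly, and the test commit checks model info for gemini/gemini-1.5-flash. A quick way to inspect that metadata from Python, assuming litellm's `get_model_info` helper (field names may vary by version):

```python
import litellm

info = litellm.get_model_info("gemini/gemini-1.5-flash")
print(info.get("supports_vision"))   # expected to be True for this model
print(info.get("max_input_tokens"))
```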
Krrish Dholakia
1ba3fcc3fb feat(utils.py): accept 'api_key' as param for validate_environment
Closes https://github.com/BerriAI/litellm/issues/4375
2024-07-11 12:02:23 -07:00
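The commit above lets `validate_environment` take an explicit `api_key` instead of only reading environment variables. A hedged usage sketch; the return-value shape shown in the comment follows litellm's documented behavior but may differ between versions:

```python
import litellm

# Without api_key: reports whether the right env vars are set for the model.
print(litellm.validate_environment(model="gpt-3.5-turbo"))

# With an explicit key (the feature added above): the key counts as provided,
# so it should not appear under missing_keys.
print(litellm.validate_environment(model="gpt-3.5-turbo", api_key="sk-placeholder"))
# Expected shape: {"keys_in_environment": bool, "missing_keys": [...]}
```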
Krrish Dholakia
2163434ff3 fix(llm_cost_calc/google.py): fix google embedding cost calculation
Fixes https://github.com/BerriAI/litellm/issues/4630
2024-07-11 11:55:48 -07:00
Ishaan Jaff
e3470d8e91
Merge pull request #4658 from BerriAI/litellm_check_otel_spans
[Test-Proxy] Otel Traces
2024-07-11 10:41:51 -07:00
Ishaan Jaff
8d7db56deb fix Local only test. WIP 2024-07-11 10:30:40 -07:00
Krrish Dholakia
57607dfc47 test(test_alangfuse.py): fix test to expect correct response object 2024-07-11 09:00:31 -07:00
Ishaan Jaff
cb6ddaf1f9 test - otel spans 2024-07-11 08:01:18 -07:00
Krish Dholakia
dacce3d78b
Merge pull request #4635 from BerriAI/litellm_anthropic_adapter
Anthropic `/v1/messages` endpoint support
2024-07-10 22:41:53 -07:00
Krrish Dholakia
31829855c0 feat(proxy_server.py): working /v1/messages with config.yaml
Adds async router support for adapter_completion call
2024-07-10 18:53:54 -07:00
Krrish Dholakia
2f8dbbeb97 feat(proxy_server.py): working /v1/messages endpoint
Works with claude engineer
2024-07-10 18:15:38 -07:00
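The merged work above exposes an Anthropic-style `/v1/messages` route on the proxy. A raw-HTTP sketch of what a request might look like; the host, port, and auth header are assumptions, while the JSON body follows the public Anthropic Messages format:

```python
import requests

resp = requests.post(
    "http://0.0.0.0:4000/v1/messages",          # assumed local proxy address
    headers={
        "Authorization": "Bearer sk-1234",      # assumed proxy virtual key
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-5-sonnet-20240620",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello"}],
    },
    timeout=30,
)
print(resp.json())
```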
Ishaan Jaff
f49837df19 ci/cd run again 2024-07-10 18:06:09 -07:00
Ishaan Jaff
ca76d2fd72 fix test_completion_bedrock_httpx_models 2024-07-10 17:42:40 -07:00
Ishaan Jaff
7efe9beac5 fix test_bedrock_httpx_streaming 2024-07-10 17:14:53 -07:00
Ishaan Jaff
de0dacc42d fix test proxy routes 2024-07-10 16:58:53 -07:00
Ishaan Jaff
a741586519 test openai files endpoints 2024-07-10 15:54:55 -07:00
Ishaan Jaff
f18754b6ed test - delete file 2024-07-10 15:42:15 -07:00
Ishaan Jaff
5187569e11 test retrieve file 2024-07-10 15:00:27 -07:00
Ishaan Jaff
7e82d98299 test assistants endpoint 2024-07-10 11:15:28 -07:00
Ishaan Jaff
3480382495 test - delete assistants 2024-07-10 10:35:30 -07:00
Krrish Dholakia
5d6e172d5c feat(anthropic_adapter.py): support for translating anthropic params to openai format 2024-07-10 00:32:28 -07:00
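The adapter commit above translates Anthropic Messages parameters into OpenAI chat-completion parameters. A simplified illustration of that mapping, not litellm's actual adapter; it only covers text content and a few common fields:

```python
def anthropic_to_openai(params: dict) -> dict:
    """Map a subset of Anthropic /v1/messages params to OpenAI chat params."""
    messages = []
    if params.get("system"):
        messages.append({"role": "system", "content": params["system"]})
    messages.extend(params.get("messages", []))
    translated = {
        "model": params["model"],
        "messages": messages,
        "max_tokens": params.get("max_tokens"),
    }
    if "stop_sequences" in params:
        translated["stop"] = params["stop_sequences"]
    if "temperature" in params:
        translated["temperature"] = params["temperature"]
    return translated

print(anthropic_to_openai({
    "model": "claude-3-5-sonnet-20240620",
    "system": "Be brief.",
    "max_tokens": 128,
    "messages": [{"role": "user", "content": "Hi"}],
}))
```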
Krrish Dholakia
43d86528c1 style: initial commit 2024-07-09 13:38:33 -07:00
Krrish Dholakia
a1986fab60 fix(vertex_httpx.py): add sync vertex image gen support
Fixes https://github.com/BerriAI/litellm/issues/4623
2024-07-09 13:33:54 -07:00
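The fix above adds a synchronous path for Vertex AI image generation. A hedged call sketch; the model string follows litellm's `vertex_ai/` prefix convention, but the exact model identifier and the required GCP credential setup are assumptions:

```python
import litellm

# Assumes Vertex AI credentials are configured (e.g. GOOGLE_APPLICATION_CREDENTIALS
# plus project/location settings for litellm).
image = litellm.image_generation(
    prompt="a watercolor fox",
    model="vertex_ai/imagegeneration@006",   # assumed Imagen model identifier
)
print(image)
```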