2cfafdc7c3 | Ishaan Jaff | 2024-07-15 21:40:06 -07:00 | fix test tg - they are a very unstable provider
74e263b8de | Krish Dholakia | 2024-07-15 21:35:22 -07:00 | Merge pull request #4723 from BerriAI/litellm_add_dynamic_api_base
    fix(utils.py): allow passing dynamic api base for openai-compatible endpoints (Fireworks AI, etc.)
d136f2b8a7 | Krrish Dholakia | 2024-07-15 21:28:33 -07:00 | fix(litellm_logging.py): fix circular reference
155ee7e99c | Krrish Dholakia | 2024-07-15 21:11:53 -07:00 | fix(factory.py): allow converting pdf url to base64
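The 155ee7e99c commit above concerns converting a fetched PDF into base64. A minimal sketch of that conversion step, assuming a data-URL output shape — the function name and format here are illustrative assumptions, not litellm's actual factory.py code:

```python
import base64

def pdf_bytes_to_data_url(pdf_bytes: bytes) -> str:
    """Encode raw PDF bytes as a base64 data URL (hypothetical helper)."""
    encoded = base64.b64encode(pdf_bytes).decode("utf-8")
    return f"data:application/pdf;base64,{encoded}"

# In practice the bytes would come from fetching the PDF URL first
# (e.g. via urllib.request.urlopen); placeholder bytes are used here.
print(pdf_bytes_to_data_url(b"%PDF-1.4 minimal")[:40])
```

The base64 form lets a URL-only PDF be inlined into providers that accept only embedded documents.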
4aa98dcbe9 | Ishaan Jaff | 2024-07-15 20:53:13 -07:00 | test_amazing_sync_embedding
4baa48ba4b | Ishaan Jaff | 2024-07-15 20:51:29 -07:00 | fix test_sync_embedding
a15ba2592a | Krrish Dholakia | 2024-07-15 20:00:44 -07:00 | fix(utils.py): allow passing dynamic api base for openai-compatible endpoints
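The a15ba2592a change above allows a per-request api base for openai-compatible endpoints. A rough sketch of the precedence such a feature implies (per-request value over configured value over provider default) — the function and parameter names are assumptions for illustration, not litellm's utils.py code:

```python
from typing import Optional

def resolve_api_base(dynamic_api_base: Optional[str],
                     configured_api_base: Optional[str],
                     default_api_base: str) -> str:
    """Pick the effective endpoint: per-request wins, then config, then default."""
    return dynamic_api_base or configured_api_base or default_api_base

# A request targeting an openai-compatible provider endpoint:
print(resolve_api_base("https://api.fireworks.ai/inference/v1", None,
                       "https://api.openai.com/v1"))
```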
023f10cf1c | Krrish Dholakia | 2024-07-15 19:43:37 -07:00 | fix(vertex_httpx.py): return grounding metadata
959c627dd3 | Krrish Dholakia | 2024-07-15 19:25:56 -07:00 | fix(litellm_logging.py): log response_cost=0 for failed calls
    Fixes https://github.com/BerriAI/litellm/issues/4604
0c98cc6a86 | Pamela Fox | 2024-07-15 11:13:14 -07:00 | New line
d43dbc756b | Pamela Fox | 2024-07-15 11:07:52 -07:00 | Count tokens for tools
82ca7af6df | Krrish Dholakia | 2024-07-14 08:06:17 -07:00 | fix(vertex_httpx.py): google search grounding fix
6bf60d773e | Krish Dholakia | 2024-07-13 21:50:43 -07:00 | Merge pull request #4696 from BerriAI/litellm_guardrail_logging_only
    Allow setting `logging_only` in guardrails config
7bc9a189e7 | Krish Dholakia | 2024-07-13 21:50:26 -07:00 | Merge branch 'main' into litellm_add_azure_ai_pricing
d475311eb3 | Krrish Dholakia | 2024-07-13 21:44:22 -07:00 | test(test_presidio_pii_masking.py): fix presidio test
fde434be66 | Krrish Dholakia | 2024-07-13 17:15:20 -07:00 | feat(proxy_server.py): return 'retry-after' param for rate limited requests
    Closes https://github.com/BerriAI/litellm/issues/4695
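The fde434be66 feature above returns a retry-after value for rate-limited (HTTP 429) requests. A minimal sketch of how such a value can be derived from the limit window's reset time — this is a generic illustration, not the proxy_server.py implementation:

```python
import time
from typing import Optional

def retry_after_seconds(window_reset_epoch: float,
                        now: Optional[float] = None) -> int:
    """Seconds a rate-limited client should wait before retrying, floored at 1."""
    now = now if now is not None else time.time()
    return max(1, int(window_reset_epoch - now))

# A 429 response would then carry a header like:
#   {"Retry-After": str(retry_after_seconds(reset_epoch))}
print(retry_after_seconds(1000.0, now=990.0))  # 10
```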
b1be355d42 | Krrish Dholakia | 2024-07-13 16:34:31 -07:00 | build(model_prices_and_context_window.json): add azure ai jamba instruct pricing + token details
    Adds jamba instruct, mistral, llama3 pricing + token info for azure_ai
bc58e44d8f | Krish Dholakia | 2024-07-13 15:22:29 -07:00 | Merge pull request #4701 from BerriAI/litellm_rpm_support_passthrough
    Support key-rpm limits on pass-through endpoints
1206b0b6a9 | Ishaan Jaff | 2024-07-13 15:05:54 -07:00 | Merge pull request #4693 from BerriAI/litellm_bad_req_error_mapping
    fix - Raise `BadRequestError` when passing the wrong role
da4bd47e3e | Krrish Dholakia | 2024-07-13 15:04:13 -07:00 | test: test fixes
77325358b4 | Krrish Dholakia | 2024-07-13 14:46:56 -07:00 | fix(pass_through_endpoints.py): fix client init
c1a9881d5c | Ishaan Jaff | 2024-07-13 14:39:47 -07:00 | Merge pull request #4697 from BerriAI/litellm_fix_sso_bug
    [Fix] Bug - Clear user_id from cache when /user/update is called
7e769f3b89 | Krrish Dholakia | 2024-07-13 14:39:42 -07:00 | fix: fix linting errors
55e153556a | Krrish Dholakia | 2024-07-13 13:49:20 -07:00 | test(test_pass_through_endpoints.py): add test for rpm limit support
bba748eaf4 | Ishaan Jaff | 2024-07-13 13:23:23 -07:00 | fix test rules
56b69eba18 | Ishaan Jaff | 2024-07-13 13:13:40 -07:00 | test updating user role
6b78e39600 | Krrish Dholakia | 2024-07-13 12:22:17 -07:00 | feat(guardrails.py): allow setting logging_only in guardrails_config for presidio pii masking integration
caa01d20cb | Krrish Dholakia | 2024-07-13 11:41:35 -07:00 | build: re-run ci/cd
d0dbc0742b | Ishaan Jaff | 2024-07-13 09:55:04 -07:00 | fix exception raised in factory.py
c7f74b0297 | Ishaan Jaff | 2024-07-13 09:54:32 -07:00 | test - test_completion_bedrock_invalid_role_exception
03933de775 | Ishaan Jaff | 2024-07-13 09:54:32 -07:00 | fix exception raised in factory.py
66cedccd6b | Krish Dholakia | 2024-07-13 09:45:17 -07:00 | Merge pull request #4686 from BerriAI/litellm_custom_chat_endpoints
    docs(pass_through.md): Creating custom chat endpoints on proxy
8203174faf | Ishaan Jaff | 2024-07-12 19:08:59 -07:00 | ci/cd run again
2758a9165b | Ishaan Jaff | 2024-07-12 19:04:05 -07:00 | test_async_response_azure
0decc36bed | Krrish Dholakia | 2024-07-12 19:00:39 -07:00 | fix(factory.py): handle message content being a list instead of string
    Fixes https://github.com/BerriAI/litellm/issues/4679
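The 0decc36bed fix above handles message content arriving as a list of content blocks rather than a plain string. A minimal sketch of that kind of normalization, assuming the OpenAI-style `[{"type": "text", "text": ...}]` block shape — illustrative only, not litellm's factory.py code:

```python
def normalize_content(content) -> str:
    """Flatten string-or-list message content into a plain string (hypothetical helper)."""
    if isinstance(content, str):
        return content
    # List form: [{"type": "text", "text": "..."}, ...] -- keep only text blocks.
    return "".join(
        block.get("text", "") for block in content
        if isinstance(block, dict) and block.get("type") == "text"
    )

print(normalize_content([{"type": "text", "text": "hi "},
                         {"type": "text", "text": "there"}]))  # hi there
```

Non-text blocks (e.g. image parts) would need provider-specific handling and are simply skipped in this sketch.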
667fd2b376 | Krrish Dholakia | 2024-07-12 18:48:40 -07:00 | docs(pass_through.md): add doc on creating custom chat endpoints on proxy
    Allows developers to call proxy with anthropic sdk/boto3/etc.
7918f41aca | Ishaan Jaff | 2024-07-12 18:45:01 -07:00 | test expired key raises correct exception
f5b3cc6c02 | Krrish Dholakia | 2024-07-12 09:22:19 -07:00 | fix(litellm_logging.py): fix condition check
    Fixes https://github.com/BerriAI/litellm/issues/4633
88eb25da5c | Krrish Dholakia | 2024-07-12 08:28:50 -07:00 | fix(bedrock_httpx.py): handle user error - malformed system prompt
    if user passes in system prompt as a list of content blocks, handle that
d72bcdbce3 | Krish Dholakia | 2024-07-11 22:03:37 -07:00 | Merge pull request #4669 from BerriAI/litellm_logging_only_masking
    Flag for PII masking on Logging only
5ad341d0ff | Krish Dholakia | 2024-07-11 22:01:44 -07:00 | Merge pull request #4607 from maamalama/helicone-cohere
    Helicone Headers & Cohere support
8d4e7f9967 | Krrish Dholakia | 2024-07-11 21:32:43 -07:00 | test(test_assistants.py): handle openai api instability
4b8d33e6a8 | Ishaan Jaff | 2024-07-11 21:16:23 -07:00 | ci/cd run again
b2e46086dd | Krrish Dholakia | 2024-07-11 21:01:12 -07:00 | fix(utils.py): fix recreating model response object when stream usage is true
1300223f51 | Krrish Dholakia | 2024-07-11 20:09:24 -07:00 | test: fix test
e112379d2f | Ishaan Jaff | 2024-07-11 19:26:20 -07:00 | ci/cd run again
9d918d2ac7 | Krrish Dholakia | 2024-07-11 18:04:12 -07:00 | fix(presidio_pii_masking.py): support logging_only pii masking
9deb9b4e3f | Krrish Dholakia | 2024-07-11 16:09:34 -07:00 | feat(guardrails): Flag for PII Masking on Logging
    Fixes https://github.com/BerriAI/litellm/issues/4580
8bf50ac5db | Ishaan Jaff | 2024-07-11 15:03:37 -07:00 | Merge pull request #4661 from BerriAI/litellm_fix_mh
    [Fix] Model Hub - Show supports vision correctly
46493303ed | Ishaan Jaff | 2024-07-11 13:04:18 -07:00 | test get mode info for gemini/gemini-1.5-flash