Commit graph

15012 commits

Author SHA1 Message Date
Krrish Dholakia
80e7310c5c feat(lakera_ai.py): support running prompt injection detection lakera check pre-api call 2024-07-22 20:16:05 -07:00
Krrish Dholakia
99a5436ed5 feat(lakera_ai.py): control running prompt injection between pre-call and in_parallel 2024-07-22 20:04:42 -07:00
Krish Dholakia
a32a7af215 Merge pull request #4819 from BerriAI/revert-4613-main: Revert "Fix: use Bedrock region from environment variables before other region definitions" 2024-07-22 07:41:10 -07:00
Krish Dholakia
227f55f370 Revert "Fix: use Bedrock region from environment variables before other region definitions" 2024-07-22 07:40:24 -07:00
Ishaan Jaff
48c365976f fix bedrock embedding test 2024-07-20 20:05:22 -07:00
Ishaan Jaff
b901a65572 fix make_sync_openai_audio_transcriptions_request 2024-07-20 20:03:12 -07:00
Krish Dholakia
bb6b2c6872 Merge pull request #4613 from petermuller/main: Fix: use Bedrock region from environment variables before other region definitions 2024-07-20 19:19:05 -07:00
Ishaan Jaff
66c73e4425 fix merge conflicts 2024-07-20 19:14:20 -07:00
Krrish Dholakia
f10af7596c fix(utils.py): allow dropping extra_body in additional_drop_params (Fixes https://github.com/BerriAI/litellm/issues/4769) 2024-07-20 19:12:58 -07:00
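A minimal sketch of what the fix above enables, assuming `additional_drop_params` can be passed directly to `litellm.completion`; the model route and the `extra_body` payload are placeholders, not values taken from the commit or the linked issue.

```python
import litellm

# Sketch only: per commit f10af7596c, "extra_body" can now be listed in
# additional_drop_params so litellm strips it before sending the request.
# The model name and extra_body contents below are hypothetical.
response = litellm.completion(
    model="azure_ai/my-deployment",          # hypothetical model route
    messages=[{"role": "user", "content": "hello"}],
    extra_body={"unsupported_flag": True},   # param the provider may reject
    additional_drop_params=["extra_body"],   # ask litellm to drop it client-side
)
print(response.choices[0].message.content)
```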
Ishaan Jaff
00d431ea42 Merge pull request #4807 from BerriAI/litellm_return-response_headers: [Feat] Return response headers on `litellm.completion`, `litellm.embedding` 2024-07-20 19:06:03 -07:00
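A minimal sketch of the feature referenced in PR #4807, assuming the raw provider headers land on the response object under the `_response_headers` attribute named in the later "rename to _response_headers" commit; the exact shape is an assumption, so the access below is defensive.

```python
import litellm

# Sketch only: PR #4807 returns provider response headers on
# litellm.completion / litellm.embedding. The attribute name comes from
# the "rename to _response_headers" commit further down; treat it as an
# assumption rather than a documented contract.
response = litellm.completion(
    model="gpt-3.5-turbo",   # placeholder model
    messages=[{"role": "user", "content": "hello"}],
)
headers = getattr(response, "_response_headers", None)  # e.g. rate-limit headers
print(headers)
```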
Ishaan Jaff
f6225623e9 Merge branch 'main' into litellm_return-response_headers 2024-07-20 19:05:56 -07:00
Ishaan Jaff
36cb63cb82 Merge pull request #4811 from BerriAI/revert-4802-litellm_add_tg_models_ui: Revert "[Ui] add together AI, Mistral, PerplexityAI, OpenRouter models on Admin UI" 2024-07-20 19:04:35 -07:00
Ishaan Jaff
9a545c1ff8 Revert "[Ui] add together AI, Mistral, PerplexityAI, OpenRouter models on Admin UI" 2024-07-20 19:04:22 -07:00
Ishaan Jaff
28bb2919b6 fix - test router debug logs 2024-07-20 18:45:31 -07:00
Ishaan Jaff
8eb839863d bump: version 1.41.25 → 1.41.26 2024-07-20 18:20:54 -07:00
Ishaan Jaff
cf8ec4a888 ui new build 2024-07-20 18:18:54 -07:00
Ishaan Jaff
27e25fb229 Merge pull request #4809 from BerriAI/litellm_refactor_router_prints: router - use verbose logger when using litellm.Router 2024-07-20 18:17:48 -07:00
Ishaan Jaff
df05b11913 Merge pull request #4802 from BerriAI/litellm_add_tg_models_ui: [Ui] add together AI, Mistral, PerplexityAI, OpenRouter models on Admin UI 2024-07-20 18:17:33 -07:00
Ishaan Jaff
82764d2cec fix make_sync_openai_audio_transcriptions_request 2024-07-20 18:17:21 -07:00
Ishaan Jaff
c44cdc3f4b ui - support adding mistral, tg ai, perplexity, open router 2024-07-20 18:14:26 -07:00
Krish Dholakia
fee4f3385d Merge pull request #4806 from BerriAI/litellm_drop_invalid_params: fix(openai.py): drop invalid params if `drop_params: true` for azure ai 2024-07-20 17:45:46 -07:00
Ishaan Jaff
2513b64ed4 ci/cd run tests again 2024-07-20 17:44:12 -07:00
Ishaan Jaff
4038b3dcea router - use verbose logger when using litellm.Router 2024-07-20 17:36:25 -07:00
Ishaan Jaff
4e301658ca docs _response_headers 2024-07-20 17:32:34 -07:00
Ishaan Jaff
5e4d291244 rename to _response_headers 2024-07-20 17:31:16 -07:00
Ishaan Jaff
40ee954e8b Merge pull request #4808 from BerriAI/litellm_Add_mistral_models: feat - add mistral `open-codestral-mamba` `open-mistral-nemo` 2024-07-20 17:23:28 -07:00
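A minimal sketch of calling one of the models added in PR #4808, assuming litellm's usual `provider/model` naming and a `MISTRAL_API_KEY` in the environment; nothing here beyond the model names comes from the commits.

```python
import litellm

# Sketch only: PR #4808 adds open-codestral-mamba and open-mistral-nemo.
# The "mistral/" prefix and the API-key requirement are assumptions based
# on litellm's usual provider conventions.
response = litellm.completion(
    model="mistral/open-mistral-nemo",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```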
Ishaan Jaff
106ff31c4d docs - using mistral models with litellm proxy 2024-07-20 17:08:35 -07:00
Ishaan Jaff
756a5f94a7 docs add mistral models 2024-07-20 17:02:34 -07:00
Ishaan Jaff
48ae2603e4 feat - add mistral models 2024-07-20 16:59:22 -07:00
Ishaan Jaff
5e52f50a82 return response headers 2024-07-20 15:26:44 -07:00
Krrish Dholakia
a27454b8e3 fix(openai.py): support completion, streaming, async_streaming 2024-07-20 15:23:42 -07:00
Ishaan Jaff
2e9f1e8de2 docs - response headers 2024-07-20 15:19:15 -07:00
Ishaan Jaff
6039e0b2a7 test - response_headers 2024-07-20 15:08:54 -07:00
Krrish Dholakia
86c9e05c10 fix(openai.py): drop invalid params if drop_params: true for azure ai (Fixes https://github.com/BerriAI/litellm/issues/4800) 2024-07-20 15:08:15 -07:00
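A minimal sketch of the behavior described in the commit above, assuming `drop_params=True` is passed per call; the deployment name and the choice of `parallel_tool_calls` as the rejected parameter are illustrative guesses, not details from issue #4800.

```python
import litellm

# Sketch only: commit 86c9e05c10 drops OpenAI params the azure_ai endpoint
# rejects when drop_params is true, instead of raising. The specific
# parameter shown is a hypothetical example.
response = litellm.completion(
    model="azure_ai/my-deployment",   # hypothetical Azure AI route
    messages=[{"role": "user", "content": "hello"}],
    parallel_tool_calls=False,        # a param some azure_ai models reject
    drop_params=True,                 # drop unsupported params rather than fail
)
```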
Ishaan Jaff
3427838ce5 openai - return response headers 2024-07-20 15:04:27 -07:00
Ishaan Jaff
46cf4f69ae return response headers in response 2024-07-20 14:59:08 -07:00
Ishaan Jaff
ca8012090c return response_headers in response 2024-07-20 14:58:14 -07:00
Krrish Dholakia
576cccaade fix(main.py): check for ANTHROPIC_BASE_URL in environment (Fixes https://github.com/BerriAI/litellm/issues/4803) 2024-07-20 14:38:31 -07:00
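A minimal sketch of the fix above, assuming no explicit `api_base` is passed so the environment variable is picked up; the URL and key are placeholders.

```python
import os
import litellm

# Sketch only: per commit 576cccaade, litellm checks ANTHROPIC_BASE_URL in
# the environment. Both values below are placeholders.
os.environ["ANTHROPIC_BASE_URL"] = "https://my-anthropic-proxy.example.com"
os.environ.setdefault("ANTHROPIC_API_KEY", "sk-ant-placeholder")

response = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "hello"}],
)
```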
Krish Dholakia
9dc10f97a6 Merge pull request #4799 from BerriAI/litellm_proxy_team_cache_update: fix(user_api_key_auth.py): update valid token cache with updated team object cache 2024-07-20 14:33:18 -07:00
Ishaan Jaff
64dbe07593 openai return response headers 2024-07-20 14:07:41 -07:00
Ishaan Jaff
725cd91064 Merge pull request #4805 from BerriAI/docs_show_tags_js: docs - show how to do spend tracking with OpenAI JS + Proxy 2024-07-20 13:15:38 -07:00
Ishaan Jaff
89922ab71f docs js spend tracking with tags 2024-07-20 13:15:20 -07:00
Ishaan Jaff
b9fab4bd4d docs - using tags OpenAI JS 2024-07-20 13:02:10 -07:00
Krrish Dholakia
305e884174 docs(bedrock.md): add jamba instruct to bedrock docs 2024-07-19 21:14:34 -07:00
Krish Dholakia
156d445597 Merge pull request #4796 from BerriAI/litellm_refactor_requests_factory: fix(factory.py): refactor factory to use httpx client 2024-07-19 21:07:41 -07:00
Krish Dholakia
f797597202 Merge branch 'main' into litellm_proxy_team_cache_update 2024-07-19 21:07:26 -07:00
Krish Dholakia
3053f52c43 Merge pull request #4801 from BerriAI/litellm_dynamic_params_oai_compatible_endpoints: fix(utils.py): support dynamic params for openai-compatible providers 2024-07-19 21:07:06 -07:00
Ishaan Jaff
3bb66ab939 ci/cd run again 2024-07-19 20:08:50 -07:00
Ishaan Jaff
d35aeb3f4f fix test_vertex_ai_medlm_completion_cost 2024-07-19 19:58:44 -07:00
Ishaan Jaff
3708355f91 router fix init openai compatible providers 2024-07-19 19:42:04 -07:00
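A minimal sketch of a `litellm.Router` initialized with an OpenAI-compatible provider, the kind of setup the router fix above touches; the model alias, endpoint, and key are placeholders.

```python
import litellm

# Sketch only: a Router entry for an OpenAI-compatible endpoint, the kind
# of configuration commit 3708355f91 concerns. All values are placeholders.
router = litellm.Router(
    model_list=[
        {
            "model_name": "my-openai-compatible-model",  # alias callers use
            "litellm_params": {
                "model": "openai/my-served-model",       # openai-compatible route
                "api_base": "http://localhost:8000/v1",  # placeholder endpoint
                "api_key": "sk-placeholder",
            },
        }
    ]
)
resp = router.completion(
    model="my-openai-compatible-model",
    messages=[{"role": "user", "content": "hi"}],
)
```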