Commit graph

16807 commits

Author SHA1 Message Date
Ishaan Jaff
c23cf18a70 Merge branch 'main' into litellm_add_bedrock_guardrails 2024-08-22 17:28:49 -07:00
Ishaan Jaff
550da1153e test bedrock guardrails 2024-08-22 17:24:42 -07:00
Krrish Dholakia
849cfa9bde docs(configs.md): add global_max_parallel_requests to docs 2024-08-22 17:12:52 -07:00
Krrish Dholakia
b0706a6f8f fix(proxy_server.py): expose flag to disable retries when max parallel request limit is hit 2024-08-22 16:49:52 -07:00
Krrish Dholakia
73a5921262 feat(auth_checks.py): allow team to call all models, when explicitly set via /* 2024-08-22 16:38:56 -07:00
Ishaan Jaff
0a74738112 add async_post_call_success_hook 2024-08-22 16:34:43 -07:00
Ishaan Jaff
37142373ff doc bedrock guardrails 2024-08-22 16:25:22 -07:00
Krish Dholakia
e9928a01ad Merge pull request #5325 from BerriAI/litellm_redis_cluster
feat(caching.py): redis cluster support
2024-08-22 16:13:45 -07:00
Ishaan Jaff
2f01a22ef7 add bedrock guardrails support 2024-08-22 16:09:55 -07:00
Krish Dholakia
c1c611284a Merge pull request #5336 from micpst/docs-dbally
docs(projects): add dbally to sidebar
2024-08-22 15:49:16 -07:00
Ishaan Jaff
5f1a03e897 add types for BedrockMessage 2024-08-22 15:40:58 -07:00
Michał Pstrąg
02bd326109 Merge branch 'main' into docs-dbally 2024-08-22 23:25:57 +02:00
Michał Pstrąg
2568b5536d add dbally project 2024-08-22 23:21:40 +02:00
Ishaan Jaff
4d1f40211a Merge pull request #5335 from BerriAI/litellm_add_metrics_latency
[Feat-Proxy] Prometheus Metrics to Track request latency, track llm api latency
2024-08-22 14:10:19 -07:00
Ishaan Jaff
55d35c2ea1 add prom docs for Request Latency Metrics 2024-08-22 14:06:14 -07:00
Ishaan Jaff
9476582fb7 update prometheus metric names 2024-08-22 14:03:00 -07:00
Ishaan Jaff
c719c375f7 track litellm_request_latency_metric 2024-08-22 13:58:10 -07:00
Ishaan Jaff
8162208a5c track api_call_start_time 2024-08-22 13:52:03 -07:00
Ishaan Jaff
0ccb1c17f7 fix init correct prometheus metrics 2024-08-22 13:29:35 -07:00
Krrish Dholakia
2f9f01e72c docs(azure_ai.md): add azure ai jamba instruct to docs
Closes https://github.com/BerriAI/litellm/issues/5333
2024-08-22 11:34:52 -07:00
Krish Dholakia
f87f3987bd Merge branch 'main' into litellm_redis_cluster 2024-08-22 11:06:14 -07:00
Krrish Dholakia
2c5fc1ffb4 docs(utils.py): cleanup docstring 2024-08-22 11:05:25 -07:00
Krrish Dholakia
900d8ecbf0 feat(factory.py): enable 'user_continue_message' for interweaving user/assistant messages when provider requires it
allows bedrock to be used with autogen
2024-08-22 11:03:33 -07:00
Krrish Dholakia
8f306f8e41 fix(cohere_chat.py): support passing 'extra_headers'
Fixes https://github.com/BerriAI/litellm/issues/4709
2024-08-22 10:17:36 -07:00
Ishaan Jaff
7d10451bc8 Merge pull request #5331 from BerriAI/dependabot/npm_and_yarn/litellm-js/spend-logs/hono-4.5.8
build(deps): bump hono from 4.2.7 to 4.5.8 in /litellm-js/spend-logs
2024-08-22 10:08:41 -07:00
Ishaan Jaff
d2dd40e1d2 fix allow setting LiteLLM license as .env 2024-08-22 10:05:00 -07:00
Krrish Dholakia
2dd616bad0 fix(ollama_chat.py): fix passing assistant message with tool call param
Fixes https://github.com/BerriAI/litellm/issues/5319
2024-08-22 10:00:03 -07:00
Ishaan Jaff
e45ec0ef46 fix test_vertexai_multimodal_embedding to use MagicMock requests 2024-08-22 09:56:24 -07:00
Ishaan Jaff
468bf7c615 fix allow setting license in config.yaml 2024-08-22 09:45:15 -07:00
dependabot[bot]
6489d982ba build(deps): bump hono from 4.2.7 to 4.5.8 in /litellm-js/spend-logs
Bumps [hono](https://github.com/honojs/hono) from 4.2.7 to 4.5.8.
- [Release notes](https://github.com/honojs/hono/releases)
- [Commits](https://github.com/honojs/hono/compare/v4.2.7...v4.5.8)

---
updated-dependencies:
- dependency-name: hono
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-08-22 16:44:08 +00:00
Ishaan Jaff
a067f0f926 add docstring for /embeddings and /completions 2024-08-22 09:30:47 -07:00
Ishaan Jaff
2caee891e0 add doc string for /chat/completions swagger 2024-08-22 09:27:40 -07:00
Krrish Dholakia
6fd4c7fd54 test(test_custom_callback_input.py): skip flaky ci/cd test 2024-08-22 09:19:10 -07:00
Ishaan Jaff
0de3f615ca fix /user/delete doc string 2024-08-22 09:09:51 -07:00
Krrish Dholakia
8f3f044aa2 test(test_custom_callback_input.py): fix test 2024-08-22 08:54:59 -07:00
Krrish Dholakia
e117a041ff test: fix test 2024-08-21 22:30:41 -07:00
Krrish Dholakia
6ad4207328 fix: rerun ci/cd 2024-08-21 22:28:35 -07:00
Krrish Dholakia
f4e14ece50 docs(enterprise.md): add key/team level spend tags to docs 2024-08-21 22:10:18 -07:00
Krrish Dholakia
98d4f458aa test(test_function_calling.py): remove redundant gemini test (causing ratelimit errors) 2024-08-21 21:48:14 -07:00
Krrish Dholakia
ea247bbc4d test(test_image_generation.py): handle azure api error 2024-08-21 21:46:00 -07:00
Krrish Dholakia
3fa4e5971f test: test_function_calling.py 2024-08-21 21:12:15 -07:00
Ishaan Jaff
7236c3defb docs vertex 2024-08-21 19:15:23 -07:00
Ishaan Jaff
b0b5400cde fix team_member_add 2024-08-21 19:10:37 -07:00
Ishaan Jaff
5ffee5875c fix test_master_key_hashing 2024-08-21 17:56:09 -07:00
Ishaan Jaff
2baa6eb5e1 use litellm proxy with vertex ai sdk 2024-08-21 17:47:01 -07:00
Krrish Dholakia
22099c255c docs(vertex.md): add vertex global safety settings to doc 2024-08-21 17:41:49 -07:00
Krrish Dholakia
d87e8f5b30 feat(utils.py): support global vertex ai safety settings param 2024-08-21 17:37:50 -07:00
Ishaan Jaff
317936660d docs add example using litellm with vertex python sdk 2024-08-21 17:35:34 -07:00
Ishaan Jaff
cfd2eb37c2 Merge pull request #5327 from BerriAI/litellm_pass_through_vtx_multi_modal
[Feat-Proxy] Make LiteLLM Proxy (Gateway) compatible with VertexAI SDK 🔥
2024-08-21 17:25:23 -07:00
Ishaan Jaff
56057f278a Merge branch 'main' into litellm_pass_through_vtx_multi_modal 2024-08-21 17:23:22 -07:00