Commit graph

8982 commits

Author SHA1 Message Date
Ishaan Jaff
19a1d999ec (feat) update docs to not include gunicorn usage 2024-03-23 17:40:22 -07:00
Ishaan Jaff
61d2e91632 (docs) update gunicorn usage 2024-03-23 17:39:07 -07:00
Krrish Dholakia
9b951b906d test(test_completion.py): fix claude multi-turn conversation test 2024-03-23 00:56:41 -07:00
Krrish Dholakia
b9143a0a00 fix(factory.py): fix anthropic check 2024-03-23 00:27:24 -07:00
Ishaan Jaff
68fd487c2e bump: version 1.33.6 → 1.33.7 2024-03-22 22:16:14 -07:00
Ishaan Jaff
3639b93d9f bump: version 1.33.5 → 1.33.6 2024-03-22 22:16:12 -07:00
Ishaan Jaff
5ae84d13ba Merge pull request #2657 from BerriAI/litellm_improve_perf: (feat) remove litellm.telemetry - improve perf on EC2 tiny machines by 90% 2024-03-22 22:15:41 -07:00
Ishaan Jaff
f39f606e02 (feat) remove litellm.telemetry 2024-03-22 20:58:14 -07:00
Krrish Dholakia
42a7588b04 fix(anthropic.py): support async claude 3 tool calling + streaming (https://github.com/BerriAI/litellm/issues/2644) 2024-03-22 19:57:01 -07:00
Ishaan Jaff
2e284a0cfe Merge pull request #2653 from BerriAI/litellm_add_example_kub_yamls: [FEAT] Add example Kubernetes + Service YAML Files 2024-03-22 19:49:11 -07:00
Ishaan Jaff
aca6ec85e2 (fix) add some better load testing 2024-03-22 19:48:54 -07:00
Ishaan Jaff
28e62af4e1 (fix) update load test used 2024-03-22 19:48:54 -07:00
Ishaan Jaff
9c483dbae4 (feat) add sample kubernetes for litellm 2024-03-22 19:47:44 -07:00
Ishaan Jaff
311918b99c (fix) add some better load testing 2024-03-22 19:45:24 -07:00
Ishaan Jaff
48b9250a3d (fix) update load test used 2024-03-22 19:44:16 -07:00
Krish Dholakia
e2d81722d2 Merge pull request #2650 from BerriAI/litellm_jwt_auth_fixes: feat(handle_jwt.py): enable jwt-project based auth 2024-03-22 19:32:46 -07:00
Krrish Dholakia
90465ff00a bump: version 1.33.4 → 1.33.5 2024-03-22 18:15:42 -07:00
Krrish Dholakia
691a83b7dc fix(anthropic.py): handle multiple system prompts 2024-03-22 18:14:15 -07:00
Krrish Dholakia
265dd5cd4f docs(token_auth.md): add project based auth to docs 2024-03-22 17:27:40 -07:00
Krrish Dholakia
d06b9a5a47 fix(proxy_server.py): enable jwt-auth for users (allow a user to auth into the proxy via jwt's and call allowed routes) 2024-03-22 17:08:10 -07:00
Krrish Dholakia
9bf086386e fix(handle_jwt.py): add more logging for jwt header 2024-03-22 16:33:32 -07:00
Ishaan Jaff
de1f348453 Merge pull request #2646 from BerriAI/litellm_bump_python: (feat) bump to python 3.11 - Improve Proxy perf 7% 2024-03-22 15:01:11 -07:00
Ishaan Jaff
ff57887b70 (feat) bump to python 3.11 2024-03-22 14:44:41 -07:00
Krrish Dholakia
858fa07e07 docs(call_hooks.md): fix dead link 2024-03-22 09:06:01 -07:00
Krrish Dholakia
211a6887f3 docs(enterprise.md): fix llm guard api link 2024-03-22 09:03:11 -07:00
Krrish Dholakia
566d48d51b docs(prompt_injection.md): fix dead link on docs 2024-03-22 08:24:47 -07:00
Krrish Dholakia
dfcc0c9ff0 fix(ollama_chat.py): don't pop from dictionary while iterating through it 2024-03-22 08:18:22 -07:00
Krrish Dholakia
93a1a865f0 bump: version 1.33.3 → 1.33.4 2024-03-21 21:55:19 -07:00
Krrish Dholakia
66e7345296 docs(virtual_keys.md): simplify virtual keys docs 2024-03-21 21:49:50 -07:00
Krish Dholakia
db7974f9f2 Merge pull request #2617 from RoniGurvich/main: Bump fastapi version `0.104.1` to `0.109.1` 2024-03-21 20:56:28 -07:00
Krish Dholakia
c980093ca4 Merge pull request #2620 from BerriAI/litellm_fix_retry_logic: [ fix ] retry logic - when using router/proxy - don't retry on the litellm.completion level too 2024-03-21 20:56:05 -07:00
Krish Dholakia
8c45986e72 Merge pull request #2619 from BerriAI/litellm_install_tenacity: (fix) include tenacity in req.txt 2024-03-21 20:55:56 -07:00
Krish Dholakia
5a086392e2 Merge pull request #2630 from BerriAI/litellm_bedrock_function_calling_streaming: fix(bedrock.py): support bedrock claude 3 function calling when stream=true 2024-03-21 20:40:44 -07:00
Krrish Dholakia
94f55aa6d9 fix(bedrock.py): support claude 3 function calling when stream=true (https://github.com/BerriAI/litellm/issues/2615) 2024-03-21 18:39:03 -07:00
Krrish Dholakia
425165dda9 docs(gemini.md): fix string for calling gemini 1.5 2024-03-21 18:04:11 -07:00
Krrish Dholakia
d57995a73d build(schema.prisma): use jsonProtocol to fix db connection issues (https://github.com/prisma/prisma/discussions/19978) 2024-03-21 18:01:45 -07:00
Krrish Dholakia
bc17404055 bump: version 1.33.2 → 1.33.3 2024-03-21 17:56:26 -07:00
Krrish Dholakia
4ac14a4e85 build(networking.tsx): fix trailing slash 2024-03-21 17:50:59 -07:00
Krrish Dholakia
d9577c3e2b build(networking.tsx): modify url to prevent redirects 2024-03-21 17:43:41 -07:00
Krrish Dholakia
abe8d7c921 docs(configs.md): add disable swagger ui env tutorial to docs 2024-03-21 17:16:52 -07:00
Krrish Dholakia
33964233a5 fix(proxy_server.py): allow user to disable swagger ui docs via env (user can disable swagger ui docs by setting 'NO_DOCS="True"' in their env) 2024-03-21 17:15:18 -07:00
Krrish Dholakia
44a91fe43a bump: version 1.33.1 → 1.33.2 2024-03-21 16:55:52 -07:00
Krrish Dholakia
b5457beba6 fix(llm_guard.py): await moderation check 2024-03-21 16:55:28 -07:00
Krrish Dholakia
d7b502bf64 bump: version 1.33.0 → 1.33.1 2024-03-21 11:23:02 -07:00
Krrish Dholakia
c4dad3f34f fix(llm_guard.py): more logging for llm guard.py 2024-03-21 11:22:52 -07:00
Krrish Dholakia
550c9508d3 bump: version 1.32.9 → 1.33.0 2024-03-21 10:56:58 -07:00
Krrish Dholakia
af27a61d76 refactor(main.py): trigger new build 2024-03-21 10:56:44 -07:00
Krish Dholakia
4df79ff4db Merge pull request #2614 from BerriAI/litellm_llm_api_prompt_injection_check: feat(proxy_server.py): enable llm api based prompt injection checks 2024-03-21 09:57:16 -07:00
Krish Dholakia
33a433eb0a Merge branch 'main' into litellm_llm_api_prompt_injection_check 2024-03-21 09:57:10 -07:00
Roni Gurvich
f623d9b0bf Merge branch 'BerriAI:main' into main 2024-03-21 18:45:26 +02:00