f3209b63cd | ishaan-jaff | 2024-03-15 12:38:37 -07:00
    (fix) locustfile used in load test

32ca306123 | Krish Dholakia | 2024-03-15 10:02:53 -07:00
    Merge pull request #2535 from BerriAI/litellm_fireworks_ai_support
    feat(utils.py): add native fireworks ai support

4e1dc7d62e | Krrish Dholakia | 2024-03-15 10:00:22 -07:00
    fix(cohere.py): return usage as a pydantic object not dict

860b06d273 | Krrish Dholakia | 2024-03-15 09:42:23 -07:00
    refactor(main.py): trigger new build

f7cf90a636 | Krrish Dholakia | 2024-03-15 09:41:40 -07:00
    fix(bedrock.py): add all supported bedrock / anthropic messages api params

9909f44015 | Krrish Dholakia | 2024-03-15 09:09:59 -07:00
    feat(utils.py): add native fireworks ai support
    addresses - https://github.com/BerriAI/litellm/issues/777, https://github.com/BerriAI/litellm/issues/2486

31dcc6acf2 | Ishaan Jaff | 2024-03-15 08:51:45 -07:00
    Merge pull request #2524 from BerriAI/litellm_fix_update_user
    (fix) - update user error

7f0cebe756 | ishaan-jaff | 2024-03-15 08:21:16 -07:00
    (ci/cd) check triggers

fd33eda29d | ishaan-jaff | 2024-03-15 08:17:55 -07:00
    (ci/cd) check linked triggers

a5450125c5 | ishaan-jaff | 2024-03-15 08:11:11 -07:00
    bump 1.31.13 -> 1.31.14

fa1676b253 | ishaan-jaff | 2024-03-15 08:10:45 -07:00
    (fix) error cli users see when importing enterprise folder

5a2e024576 | Krrish Dholakia | 2024-03-15 07:55:29 -07:00
    fix(factory.py): raise exception on invalid message being received

82e44e4962 | ishaan-jaff | 2024-03-14 20:58:22 -07:00
    (ci/cd) check actions run

e7240bb5c1 | ishaan-jaff | 2024-03-14 20:50:02 -07:00
    (ci/cd) fix litellm triggers on commits

eab08caa18 | Krish Dholakia | 2024-03-14 20:04:59 -07:00
    Merge pull request #2520 from BerriAI/litellm_max_tokens_fix
    fix(utils.py): move to using `litellm.modify_params` to enable max output token trimming fix

634e0227f3 | ishaan-jaff | 2024-03-14 19:50:26 -07:00
    (fix) - update user error

04ef2f2023 | ishaan-jaff | 2024-03-14 19:44:31 -07:00
    (fix) load test run over 300s

1b63748831 | ishaan-jaff | 2024-03-14 18:53:16 -07:00
    (temp) test build without using argon2

517e453adf | ishaan-jaff | 2024-03-14 15:20:36 -07:00
    (docs) load testing proxy

8f4a99e35e | Krrish Dholakia | 2024-03-14 15:05:59 -07:00
    build(model_prices_and_context_window.json): add mistral and mixtral bedrock pricing

1ba21a8c58 | Krrish Dholakia | 2024-03-14 14:25:30 -07:00
    fix(router.py): add no-proxy support for router

7c72cc0ec9 | Krrish Dholakia | 2024-03-14 13:35:17 -07:00
    fix(caching.py): support redis caching with namespaces

0b6cf3d5cf | Krrish Dholakia | 2024-03-14 13:01:18 -07:00
    refactor(main.py): trigger new build

d6537a05ca | Krrish Dholakia | 2024-03-14 12:58:34 -07:00
    fix(caching.py): fix print statements

a634424fb2 | Krrish Dholakia | 2024-03-14 12:17:56 -07:00
    fix(utils.py): move to using litellm.modify_params to enable max output token trimming fix

bdd2004691 | Krrish Dholakia | 2024-03-14 12:10:39 -07:00
    refactor(main.py): trigger new build

7876aa2d75 | Krrish Dholakia | 2024-03-14 10:02:41 -07:00
    fix(parallel_request_limiter.py): handle metadata being none

704573c3f6 | Krrish Dholakia | 2024-03-14 10:02:41 -07:00
    fix(proxy_server.py): improve error message on ui login error

ebfefe61ea | ishaan-jaff | 2024-03-13 22:05:16 -07:00
    (fix-ci-cd) skip deep infra 429 errors

e3cc0da5f1 | ishaan-jaff | 2024-03-13 21:47:56 -07:00
    (ci/cd) run testing again

4006d10b7b | ishaan-jaff | 2024-03-13 21:24:37 -07:00
    (fix) importing PromptInjectionDetection

d8eff53ebe | Krish Dholakia | 2024-03-13 20:55:40 -07:00
    Merge pull request #2506 from BerriAI/litellm_update_db_perf_improvements
    fix(proxy_server.py): move to using UPDATE + SET for track_cost_callback

0269cddd55 | ishaan-jaff | 2024-03-13 20:24:06 -07:00
    (feat) add claude-3-haiku

1b807fa3f5 | Krrish Dholakia | 2024-03-13 19:10:24 -07:00
    fix(proxy_server.py): fix key caching logic

10a8eb223a | ishaan-jaff | 2024-03-13 17:53:51 -07:00
    (fix) update load test result

acc672a78f | Krrish Dholakia | 2024-03-13 17:04:51 -07:00
    fix(proxy_server.py): maintain support for model specific budgets

cf090acb25 | Krrish Dholakia | 2024-03-13 16:13:37 -07:00
    fix(proxy_server.py): move to using UPDATE + SET for track_cost_callback

8a886c6e93 | Ishaan Jaff | 2024-03-13 14:26:21 -07:00
    Merge pull request #2501 from BerriAI/litellm_fix_using_enterprise_docker
    (fix) using enterprise folder on litellm Docker

d82be720d2 | Ishaan Jaff | 2024-03-13 13:33:43 -07:00
    Merge pull request #2493 from BerriAI/litellm_return_429_no_models_available
    [Proxy] return 429 when no models available

16e3aaced5 | Krrish Dholakia | 2024-03-13 12:37:32 -07:00
    docs(enterprise.md): add prompt injection detection to docs

3e66b50602 | Krish Dholakia | 2024-03-13 12:28:19 -07:00
    Merge pull request #2498 from BerriAI/litellm_prompt_injection_detection
    feat(prompt_injection_detection.py): support simple heuristic similarity check for prompt injection attacks

82246d8e30 | ishaan-jaff | 2024-03-13 12:16:58 -07:00
    (fix) using enterprise folder on litellm

fd99f9f64a | Ishaan Jaff | 2024-03-13 11:55:29 -07:00
    Merge pull request #2500 from BerriAI/litellm_fix_using_enterpise
    (fix) issue with using litellm enterprise on Admin UI

dfac742324 | Krrish Dholakia | 2024-03-13 11:34:45 -07:00
    fix(factory.py): fix mistral api prompt formatting

234cdbbfef | Krrish Dholakia | 2024-03-13 10:32:21 -07:00
    feat(prompt_injection_detection.py): support simple heuristic similarity check for prompt injection attacks

771d09312e | ishaan-jaff | 2024-03-13 10:30:31 -07:00
    (fix) issue with using litellm enterprise license

4c526ade27 | ishaan-jaff | 2024-03-13 08:05:32 -07:00
    (fix) errors fro litellm proxy

3aeada232e | ishaan-jaff | 2024-03-13 08:03:28 -07:00
    (fix) return 429 error

aaa008ecde | ishaan-jaff | 2024-03-13 08:00:56 -07:00
    (fix) raising No healthy deployment

9f2d540ebf | Krish Dholakia | 2024-03-12 21:36:01 -07:00
    Merge pull request #2472 from BerriAI/litellm_anthropic_streaming_tool_calling
    fix(anthropic.py): support claude-3 streaming with function calling