Commit graph

11876 commits

Author  SHA1  Message  Date

Ishaan Jaff  dabaf5f297  fix python 3.8 Tuple  2024-05-20 12:21:02 -07:00
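The "fix python 3.8 Tuple" commit most likely addresses the fact that built-in generics (`tuple[str, int]`) are not subscriptable until Python 3.9. The actual diff is not shown in this log, so the following is only a minimal sketch of the usual compatibility fix, using `typing.Tuple` (the function name here is illustrative):

```python
from typing import Tuple  # works on Python 3.8; bare tuple[...] requires 3.9+

# On Python 3.8 this annotation would raise at import time:
#     def parse(pair: tuple[str, int]) -> str: ...
#     TypeError: 'type' object is not subscriptable
# typing.Tuple keeps the same annotation valid on 3.8:
def parse(pair: Tuple[str, int]) -> str:
    name, count = pair
    return f"{name}={count}"

print(parse(("retries", 3)))  # retries=3
```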
Ishaan Jaff  aa0ed8238b  docs - image generation vertex  2024-05-20 12:18:31 -07:00
Ishaan Jaff  571d4cf569  test - fix test_aimage_generation_vertex_ai  2024-05-20 12:10:28 -07:00
Krrish Dholakia  f11f207ae6  feat(proxy_server.py): refactor returning rejected message, to work with error logging
    log the rejected request as a failed call to langfuse/slack alerting
    2024-05-20 11:14:36 -07:00
Ishaan Jaff  6ddc9873e5  test vertex image gen test  2024-05-20 11:14:16 -07:00
Ishaan Jaff  91f8443381  fix add debug to vertex httpx image  2024-05-20 11:11:14 -07:00
Ishaan Jaff  d50d552e5a  fix python 3.8 import  2024-05-20 11:03:28 -07:00
Ishaan Jaff  655478e8dc  fix python3.8 error  2024-05-20 10:55:10 -07:00
Ishaan Jaff  2da89a0c8e  fix vertex test  2024-05-20 10:51:25 -07:00
Ishaan Jaff  2519879e67  add ImageObject  2024-05-20 10:45:37 -07:00
Krrish Dholakia  372323c38a  feat(proxy_server.py): allow admin to return rejected response as string to user
    Closes https://github.com/BerriAI/litellm/issues/3671
    2024-05-20 10:30:23 -07:00
Ishaan Jaff  a4f906b464  feat - add litellm.ImageResponse  2024-05-20 10:09:41 -07:00
Ishaan Jaff  24951d44a4  feat - working httpx requests vertex ai image gen  2024-05-20 09:51:15 -07:00
Krrish Dholakia  dbaeb8ff53  bump: version 1.37.16 → 1.37.17  2024-05-20 09:47:03 -07:00
Krish Dholakia  626e556e70  Merge pull request #3737 from BerriAI/litellm_json_logs
    feat(proxy_cli.py): support json logs on proxy
    2024-05-20 09:46:37 -07:00
Krrish Dholakia  89ad1ce9b5  docs(debugging.md): add json logs to proxy docs  2024-05-20 09:29:30 -07:00
Krrish Dholakia  52acd3955d  fix(_logging.py): support all logs being in json mode, if enabled  2024-05-20 09:22:59 -07:00
Krrish Dholakia  058bfb101d  feat(proxy_cli.py): support json logs on proxy
    allow user to enable 'json logs' for proxy server
    2024-05-20 09:18:12 -07:00
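The JSON-logs commits above (058bfb101d, 52acd3955d) enable structured, one-object-per-line logging on the proxy. litellm's actual `_logging.py` is not reproduced in this log, so the following is a generic stdlib sketch of the technique — a custom `logging.Formatter` that serializes each record as JSON; all names here are illustrative, not litellm's API:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object per line."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "name": record.name,
            "message": record.getMessage(),
        })

# Attach the formatter to a handler, as a CLI flag like `--json-logs` might do:
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("proxy")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("server started")
# emits: {"level": "INFO", "name": "proxy", "message": "server started"}
```

One-line JSON records are convenient because log aggregators (Datadog, CloudWatch, Loki, etc.) can parse fields without custom regexes.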
Krrish Dholakia  25df95ab10  feat(proxy_server.py): new 'supported_openai_params' endpoint
    get supported openai params for a given model
    2024-05-20 08:39:50 -07:00
Ishaan Jaff  5ba5f15b56  test - test_aimage_generation_vertex_ai  2024-05-20 08:14:43 -07:00
Krish Dholakia  bb85a5e6b2  Merge pull request #3729 from BerriAI/litellm_vertex_history
    fix(vertex_ai.py): support passing in result of tool call to vertex
    2024-05-20 07:33:05 -07:00
Krrish Dholakia  49b71c8118  fix(vertex_ai.py): revert system instructions - unable to find supported vertex version  2024-05-20 06:30:11 -07:00
Krrish Dholakia  45c46a84d0  fix(vertex_ai.py): support passing system instructions to vertex ai  2024-05-20 06:18:19 -07:00
Krrish Dholakia  6216c3639f  fix(types/vertex_ai.py): fix typing  2024-05-19 12:36:05 -07:00
Krish Dholakia  c94ce069f7  Merge pull request #3724 from BerriAI/litellm_gpt_4_response_format_drop_params
    fix(utils.py): drop response_format if 'drop_params=True' for gpt-4
    2024-05-19 12:35:13 -07:00
Krrish Dholakia  65aacc0c1a  fix(vertex_ai.py): use chat_messages_with_history for async + streaming calls  2024-05-19 12:30:24 -07:00
Krrish Dholakia  f9ab72841a  fix(vertex_ai.py): passing all tests on 'test_amazing_vertex_completion.py'  2024-05-19 12:22:21 -07:00
Krrish Dholakia  a2c66ed4fb  fix(vertex_ai.py): support passing in result of tool call to vertex
    Fixes https://github.com/BerriAI/litellm/issues/3709
    2024-05-19 11:34:07 -07:00
alisalim17  9ed4f84b50  chore: update traceloop-sdk to version 0.18.2  2024-05-19 20:28:02 +04:00
Krish Dholakia  35c48c7c24  Update README.md  2024-05-18 19:06:09 -07:00
Krrish Dholakia  12942c39db  fix(utils.py): drop response_format if 'drop_params=True' for gpt-4  2024-05-18 13:02:48 -07:00
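Commit 12942c39db drops `response_format` from the request when `drop_params=True` and the target model doesn't accept it. litellm's real `utils.py` logic is not shown in this log; the sketch below only illustrates the general pattern of filtering unsupported parameters before a provider call — the helper name `prepare_request_params` and the `supported` set are hypothetical:

```python
def prepare_request_params(params: dict, supported: set, drop_params: bool = False) -> dict:
    """Return a copy of params; if drop_params is set, silently filter out
    any key the target provider/model does not support."""
    if not drop_params:
        # Without the flag, pass everything through and let the provider error.
        return dict(params)
    return {k: v for k, v in params.items() if k in supported}

# e.g. a model build that accepts temperature/max_tokens but not response_format:
supported = {"temperature", "max_tokens"}
kwargs = {"temperature": 0.2, "response_format": {"type": "json_object"}}
print(prepare_request_params(kwargs, supported, drop_params=True))
# {'temperature': 0.2}
```

The trade-off is the usual one: dropping params keeps calls from failing across heterogeneous providers, at the cost of silently ignoring a caller's request.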
Krish Dholakia  a43cfa1770  Update README.md  2024-05-18 12:58:22 -07:00
Krish Dholakia  8d25a7b9dc  Merge pull request #3715 from BerriAI/litellm_model_id_fix
    fix(proxy_server.py): fix setting model id for db models
    2024-05-17 22:36:23 -07:00
Krish Dholakia  5e5179e476  Merge branch 'main' into litellm_model_id_fix  2024-05-17 22:36:17 -07:00
Krrish Dholakia  5d3fe52a08  test: fix test  2024-05-17 22:35:34 -07:00
Krrish Dholakia  1cecdc4690  fix(utils.py): fix replicate completion cost calculation  2024-05-17 22:18:57 -07:00
Ishaan Jaff  c8a1cf6ce2  (ci/cd) run again  2024-05-17 22:07:21 -07:00
Ishaan Jaff  7af7610929  fix - test num callbacks  2024-05-17 22:06:51 -07:00
Krrish Dholakia  a75b865ebc  test(test_config.py): fix test  2024-05-17 22:00:44 -07:00
Ishaan Jaff  25920a7396  bump: version 1.37.15 → 1.37.16  2024-05-17 21:58:30 -07:00
Ishaan Jaff  6708a1adaa  ui - new build  2024-05-17 21:58:10 -07:00
Ishaan Jaff  60b9bc2764  Merge pull request #3714 from BerriAI/litellm_show_max_input_tokens_ui
    [Admin UI] show max input tokens on UI
    2024-05-17 21:55:35 -07:00
Ishaan Jaff  8281c150f0  Merge pull request #3713 from BerriAI/litellm_ui_infer_azure_prefix
    [Feat] Admin UI - use `base_model` for Slack Alerts
    2024-05-17 21:55:23 -07:00
Ishaan Jaff  f7a1675337  fix - cooldown based on exception header  2024-05-17 18:52:45 -07:00
Ishaan Jaff  6368d5a725  feat - read cooldown time from exception header  2024-05-17 18:50:33 -07:00
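Commits f7a1675337 and 6368d5a725 read a cooldown period for a deployment from the headers of a provider exception instead of using a fixed value. The exact header litellm reads is not visible in this log; assuming the conventional `Retry-After` header, a minimal sketch (function name and default are illustrative):

```python
def cooldown_from_headers(headers: dict, default: float = 60.0) -> float:
    """Derive how long to cool down a deployment from an exception's
    response headers, falling back to a default when absent/unparseable."""
    value = headers.get("Retry-After")
    try:
        return float(value)
    except (TypeError, ValueError):
        # Header missing, or an HTTP-date form we don't parse here.
        return default

print(cooldown_from_headers({"Retry-After": "30"}))  # 30.0
print(cooldown_from_headers({}))                     # 60.0
```

Honoring the provider's own backoff hint avoids hammering a rate-limited deployment sooner than it will recover.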
Krrish Dholakia  4b3551abfc  fix(slack_alerting.py): show langfuse traces on error messages  2024-05-17 18:42:30 -07:00
Krish Dholakia  2695989e49  Merge pull request #3708 from BerriAI/litellm_bedrock_anthropic_fix
    fix(bedrock_httpx.py): move anthropic bedrock calls to httpx
    2024-05-17 17:47:44 -07:00
Krish Dholakia  3a06fe2818  Merge branch 'main' into litellm_bedrock_anthropic_fix  2024-05-17 17:47:32 -07:00
Krrish Dholakia  b137cea230  fix(proxy_server.py): fix setting model id for db models
    get model_id and use that as its id in the router; this enables `/model/delete` to work with the id returned by `/model/info`
    2024-05-17 17:45:05 -07:00
Ishaan Jaff  7add1ef42e  ui - show max input tokens  2024-05-17 17:18:20 -07:00