Commit graph

7473 commits

Author SHA1 Message Date
Krrish Dholakia
6710c2ee5d fix(proxy_server.py): fix /spend/logs endpoint 2024-02-09 18:11:33 -08:00
Krrish Dholakia
d1d4568dc5 fix(usage.tsx): show key spend per day 2024-02-09 18:09:55 -08:00
ishaan-jaff
640011379c (feat) set timeout on proxy config 2024-02-09 17:42:35 -08:00
ishaan-jaff
3d97004b15 (feat) support timeout on bedrock 2024-02-09 17:42:17 -08:00
ishaan-jaff
5078ed1ace (docs) llamaindex + proxy 2024-02-09 17:06:49 -08:00
ishaan-jaff
e3f5579091 (docs) use llama index with litellm proxy 2024-02-09 16:57:48 -08:00
ishaan-jaff
09c36c6e78 (test) llama index VectorStoreIndex 2024-02-09 16:49:03 -08:00
ishaan-jaff
924b7db540 (feat) support azure deployments for embeddings 2024-02-09 16:47:01 -08:00
Krrish Dholakia
43da22ae13 feat(proxy_server.py): show admin global spend as time series data 2024-02-09 16:31:35 -08:00
Ishaan Jaff
0da1737b59 Merge pull request #1919 from BerriAI/litellm_bedrock_set_timeouts: [FEAT] Bedrock set timeouts on litellm.completion 2024-02-09 16:19:27 -08:00
ishaan-jaff
782e84a421 (test) Proxy llama index request 2024-02-09 16:09:03 -08:00
ishaan-jaff
2c116af596 (test) bedrock timeout 2024-02-09 14:38:17 -08:00
ishaan-jaff
6dc7ded1a6 (bedrock) raise timeout error 2024-02-09 14:37:34 -08:00
ishaan-jaff
896fd393db (feat) support bedrock timeout 2024-02-09 14:36:43 -08:00
Rena Lu
6833f37986 remove prints 2024-02-09 16:25:29 -05:00
Rena Lu
ae0ede4190 Merge branch 'BerriAI:main' into main 2024-02-09 16:20:14 -05:00
Rena Lu
0e8a0aefd5 add vertex ai private endpoint support 2024-02-09 16:19:26 -05:00
Ishaan Jaff
61031fe380 Merge pull request #1915 from BerriAI/litellm_view_total_proxy_budget_spend: [FEAT] ui - view total proxy spend / budget 2024-02-09 11:59:16 -08:00
ishaan-jaff
1d2ee8a487 (feat) ui - view total proxy spend / budget 2024-02-09 10:59:19 -08:00
ishaan-jaff
cf16687509 (docs) litellm 2024-02-09 10:27:46 -08:00
ishaan-jaff
694f0d6af5 (docs) proxy quickstart 2024-02-09 10:21:39 -08:00
ishaan-jaff
854eee3785 (docs) getting started 2024-02-09 09:42:37 -08:00
Ishaan Jaff
f0fc7f1552 Update README.md 2024-02-09 09:30:16 -08:00
ishaan-jaff
63a2b5636b (chore) cleanup 2024-02-09 09:28:13 -08:00
Ishaan Jaff
1209ae0dd1 Update README.md 2024-02-09 09:25:36 -08:00
Krrish Dholakia
e39ce9b119 bump: version 1.23.4 → 1.23.5 2024-02-08 23:03:14 -08:00
Krrish Dholakia
2a7e346144 fix(main.py): trigger new build 2024-02-08 23:03:03 -08:00
Krish Dholakia
51c07e294a Merge pull request #1902 from BerriAI/litellm_mistral_message_list_fix: fix(factory.py): mistral message input fix 2024-02-08 23:01:39 -08:00
Krish Dholakia
6084e0b25a Merge pull request #1901 from BerriAI/litellm_ui_usage_tiers: fix(proxy_server.py): enable aggregate queries via /spend/keys 2024-02-08 22:45:52 -08:00
Krrish Dholakia
b426fa55f4 test(test_completion.py): fix test 2024-02-08 22:04:22 -08:00
Krrish Dholakia
3a4ac8be79 fix: fixes 2024-02-08 21:54:48 -08:00
Krrish Dholakia
2756ba591c test(test_parallel_request_limiter.py): fix test 2024-02-08 21:49:58 -08:00
Krrish Dholakia
b9393fb769 fix(test_parallel_request_limiter.py): use mock responses for streaming 2024-02-08 21:45:38 -08:00
ishaan-jaff
1ef7ad3416 bump: version 1.23.3 → 1.23.4 2024-02-08 21:45:08 -08:00
Ishaan Jaff
c54f21f9ec Merge pull request #1904 from BerriAI/litellm_show_delete_confirmation: Admin UI - show delete confirmation when deleting keys 2024-02-08 21:39:17 -08:00
Ishaan Jaff
3978b9a076 Merge pull request #1903 from BerriAI/litellm_ui_show_models_tpm_limit: Admin UI - View Models, TPM, RPM Limit of a Key 2024-02-08 21:38:29 -08:00
ishaan-jaff
ab3fe95810 (feat) update ui build 2024-02-08 21:38:04 -08:00
ishaan-jaff
5e87932e8e (fea) ui - see delete confirmation before deleting 2024-02-08 21:33:50 -08:00
ishaan-jaff
1f6827f4f8 (feat) ui - view models, tpm limit of key 2024-02-08 21:18:05 -08:00
Krrish Dholakia
841639333b fix(bedrock.py): raise exception for amazon titan null response 2024-02-08 21:12:25 -08:00
Krrish Dholakia
c9e5c796ad fix(factory.py): mistral message input fix 2024-02-08 20:54:26 -08:00
Krrish Dholakia
e98437104d fix(proxy_server.py): enable aggregate queries via /spend/keys 2024-02-08 20:29:08 -08:00
Krish Dholakia
95bf684a8c Merge pull request #1898 from BerriAI/litellm_langfuse_error_logging: Litellm langfuse error logging - log input 2024-02-08 17:38:46 -08:00
Krrish Dholakia
ff93609453 build(schema.prisma): support direct url on prisma schema 2024-02-08 17:37:37 -08:00
Krrish Dholakia
64fd1f7d21 fix(langfuse.py): langfuse success logging fix 2024-02-08 16:46:04 -08:00
Krrish Dholakia
bc23a9266e fix(langfuse.py): support passing input params for langfuse errors 2024-02-08 16:37:33 -08:00
Krish Dholakia
f473d59c23 Merge pull request #1895 from dleen/profile: (feat) Add support for AWS credentials from profile file 2024-02-08 15:15:05 -08:00
David Leen
140d915adf Add support for AWS credentials from profile file (https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html#aws-config-file) 2024-02-08 15:10:50 -08:00
Ishaan Jaff
59465bd612 Merge pull request #1892 from BerriAI/litellm_speed_up_s3_logging: [FEAT] 76 % Faster s3 logging Proxy / litellm.acompletion / router.acompletion 🚀 2024-02-08 11:48:36 -08:00
ishaan-jaff
6eb17cd916 (test) s3 logging 2024-02-08 11:11:19 -08:00