Commit graph

1424 commits

Author | SHA1 | Message | Date
ishaan-jaff | edac4130bb | (fix) s3 + os.environ/ cache test | 2024-01-06 16:33:29 +05:30
ishaan-jaff | 174248fc71 | (test) add back test for counting stream completion tokens | 2024-01-06 16:08:32 +05:30
Krish Dholakia | 8d32f08858 | Merge pull request #1342 from BerriAI/litellm_dockerfile_updates: build(Dockerfile): moves prisma logic to dockerfile | 2024-01-06 16:03:25 +05:30
ishaan-jaff | f999b63d05 | (test) using os.environ/ on cache + proxy | 2024-01-06 15:54:50 +05:30
ishaan-jaff | c2b061acb2 | (feat) cache+proxy - set os.environ/ on proxy config | 2024-01-06 15:54:16 +05:30
ishaan-jaff | 0d152b3748 | (fix) cloudflare tests | 2024-01-06 15:35:49 +05:30
Krrish Dholakia | 9375570547 | test(test_async_fn.py): skip cloudflare test - flaky | 2024-01-06 15:17:42 +05:30
Krrish Dholakia | 04c04d62e3 | test(test_stream_chunk_builder.py): remove completion assert, the test is for prompt tokens | 2024-01-06 14:12:44 +05:30
Krrish Dholakia | 5c45e69a5e | test(test_proxy_server_keys.py): add logic for connecting/disconnecting from http server | 2024-01-06 14:09:10 +05:30
ishaan-jaff | 4a076350cc | (ci/cd) move to old version of test_proxy_server_keys.py | 2024-01-06 13:03:12 +05:30
ishaan-jaff | 41bfd43a48 | (ci/cd) pin anyio / async dependencies | 2024-01-06 12:38:56 +05:30
ishaan-jaff | 3bb49447bc | (ci/cd) fix event loop bug proxy_test | 2024-01-06 12:30:43 +05:30
ishaan-jaff | 79fd2380bb | (ci/cd) run again | 2024-01-06 12:11:31 +05:30
ishaan-jaff | 0ebd0653c5 | (ci/cd) make prisma tests async | 2024-01-06 11:43:23 +05:30
ishaan-jaff | ae54e6d8b0 | (ci/cd) proxy:test_add_new_key | 2024-01-05 22:53:03 +05:30
ishaan-jaff | 40aaac69cc | (ci/cd) add print_verbose for /key/generate | 2024-01-05 22:38:46 +05:30
ishaan-jaff | dfdd329ddf | (ci/cd) pytest event loop fixture | 2024-01-05 22:28:34 +05:30
ishaan-jaff | 050c289ed1 | (ci/cd) test fixture | 2024-01-05 22:15:08 +05:30
ishaan-jaff | 6f9d3fc3bc | (ci/cd) retry hosted ollama + stream test 3 times | 2024-01-05 18:02:20 +05:30
ishaan-jaff | 0eb899c087 | (test) hosted ollama - retry 3 times | 2024-01-05 17:58:59 +05:30
ishaan-jaff | 76b2db4492 | (ci/cd) run test again | 2024-01-05 16:40:56 +05:30
ishaan-jaff | 69bac0dbf6 | (ci/cd) test proxy - init prisma in test | 2024-01-05 16:18:23 +05:30
ishaan-jaff | 4679c7b99a | (fix) caching use same "created" in response_object | 2024-01-05 16:03:56 +05:30
ishaan-jaff | f211009263 | (test) openai embedding cost calculation | 2024-01-05 15:22:17 +05:30
ishaan-jaff | 20256c45ad | (fix) retry cloudflare ai workers 3 times | 2024-01-05 13:55:47 +05:30
ishaan-jaff | 6694975ec3 | (test) azure completion_cost | 2024-01-05 13:53:08 +05:30
ishaan-jaff | 13201edc4b | (test) test reading configs on proxy | 2024-01-05 13:37:31 +05:30
ishaan-jaff | 72e7178c9b | (test) azure/embedding + completion_cost | 2024-01-05 13:19:17 +05:30
ishaan-jaff | f681f0f2b2 | (feat) completion_cost - embeddings + raise Exception | 2024-01-05 13:11:23 +05:30
ishaan-jaff | 113b5e7284 | (ci/cd) retry cloudflare request 3 times | 2024-01-05 12:40:53 +05:30
ishaan-jaff | 83b31141c6 | (ci/cd) raise correct exception proxy | 2024-01-05 12:29:03 +05:30
ishaan-jaff | bcf22725a6 | (ci/cd) run cloudflare test 3 retries | 2024-01-05 11:55:12 +05:30
ishaan-jaff | d1865591aa | (fix) test caching - use azure, instead of bedrock | 2024-01-05 10:51:56 +05:30
Krrish Dholakia | 6506fba3bc | test(test_proxy_exception_mapping.py): fix exception checking | 2024-01-04 22:45:16 +05:30
Krrish Dholakia | 25241de69e | fix(router.py): don't retry malformed / content policy violating errors (400 status code) (https://github.com/BerriAI/litellm/issues/1317, https://github.com/BerriAI/litellm/issues/1316) | 2024-01-04 22:23:51 +05:30
Krrish Dholakia | 74f6f6489a | fix(proxy_server.py): fix prisma client connection error | 2024-01-04 18:28:18 +05:30
Krrish Dholakia | c7644915f9 | fix(test_proxy_server.py): fix import | 2024-01-04 16:11:23 +05:30
ishaan-jaff | c231a6e4d3 | (ci/cd) run proxy test with debug=True | 2024-01-04 13:01:00 +05:30
ishaan-jaff | 234c057e97 | (fix) azure+cf gateway, health check | 2024-01-04 12:34:07 +05:30
Krrish Dholakia | b0827a87b2 | fix(caching.py): support s-maxage param for cache controls | 2024-01-04 11:41:23 +05:30
ishaan-jaff | 54653f9a4a | (test) proxy + s3 caching | 2024-01-04 11:11:08 +05:30
ishaan-jaff | aa757d19f5 | (test) router - init clients - azure cloudflare, openai etc | 2024-01-04 10:55:18 +05:30
ishaan-jaff | 0864713b62 | (test) cf azure | 2024-01-04 10:26:41 +05:30
ishaan-jaff | 6d21ee3a2f | (fix) proxy - cloudflare + Azure bug [non-streaming] | 2024-01-04 10:24:51 +05:30
ishaan-jaff | d14a41863f | (test) s3 cache with setting s3_bucket_name | 2024-01-03 15:42:23 +05:30
ishaan-jaff | fea0a933ae | (test) use s3 buckets cache | 2024-01-03 15:13:43 +05:30
Krrish Dholakia | 8cee267a5b | fix(caching.py): support ttl, s-max-age, and no-cache cache controls (https://github.com/BerriAI/litellm/issues/1306) | 2024-01-03 12:42:43 +05:30
ishaan-jaff | 2bea0c742e | (test) completion tokens counting + azure stream | 2024-01-03 12:06:39 +05:30
ishaan-jaff | 14738ec89d | (test) xinference on litellm router | 2024-01-02 16:51:08 +05:30
ishaan-jaff | bfbed2d93d | (test) xinference embeddings | 2024-01-02 15:41:51 +05:30