Krish Dholakia | 439ee3bafc | Merge pull request #1344 from BerriAI/litellm_speed_improvements (Litellm speed improvements) | 2024-01-06 22:38:10 +05:30
Krrish Dholakia | 5fd2f945f3 | fix(factory.py): support gemini-pro-vision on google ai studio (https://github.com/BerriAI/litellm/issues/1329) | 2024-01-06 22:36:22 +05:30
Krrish Dholakia | 3577857ed1 | fix(sagemaker.py): fix the post-call logging logic | 2024-01-06 21:52:58 +05:30
Krrish Dholakia | f2ad13af65 | fix(openai.py): fix image generation model dump | 2024-01-06 17:55:32 +05:30
Krrish Dholakia | 2d8d7e3569 | perf(router.py): don't use asyncio.wait_for - just pass the timeout to the completion call | 2024-01-06 17:05:55 +05:30
Krrish Dholakia | 712f89b4f1 | fix(utils.py): handle original_response being a json | 2024-01-06 17:02:50 +05:30
Krrish Dholakia | a7245dba07 | build(Dockerfile): fixes the build time setup | 2024-01-06 16:41:37 +05:30
ishaan-jaff | edac4130bb | (fix) s3 + os.environ/ cache test | 2024-01-06 16:33:29 +05:30
ishaan-jaff | c222c0bfb8 | (fix) proxy + cache - os.environ/ vars | 2024-01-06 16:15:53 +05:30
ishaan-jaff | 174248fc71 | (test) add back test for counting stream completion tokens | 2024-01-06 16:08:32 +05:30
Krish Dholakia | 8d32f08858 | Merge pull request #1342 from BerriAI/litellm_dockerfile_updates (build(Dockerfile): moves prisma logic to dockerfile) | 2024-01-06 16:03:25 +05:30
ishaan-jaff | f999b63d05 | (test) using os.environ/ on cache + proxy | 2024-01-06 15:54:50 +05:30
ishaan-jaff | c2b061acb2 | (feat) cache+proxy - set os.environ/ on proxy config | 2024-01-06 15:54:16 +05:30
Krrish Dholakia | 9a4a96f46e | perf(azure+openai-files): use model_dump instead of json.loads + model_dump_json | 2024-01-06 15:50:05 +05:30
ishaan-jaff | 0d152b3748 | (fix) cloudflare tests | 2024-01-06 15:35:49 +05:30
Krrish Dholakia | 13e8535b14 | test(test_async_fn.py): skip cloudflare test - flaky | 2024-01-06 15:21:10 +05:30
Krrish Dholakia | 523d8e5977 | build(Dockerfile): moves prisma logic to dockerfile | 2024-01-06 15:21:10 +05:30
Krrish Dholakia | 9375570547 | test(test_async_fn.py): skip cloudflare test - flaky | 2024-01-06 15:17:42 +05:30
Krrish Dholakia | 7434f1a300 | build(Dockerfile): moves prisma logic to dockerfile | 2024-01-06 14:59:10 +05:30
ishaan-jaff | 6011c5c8c2 | (fix) undo changes that were trying to control prisma connections | 2024-01-06 14:32:40 +05:30
Krrish Dholakia | 04c04d62e3 | test(test_stream_chunk_builder.py): remove completion assert, the test is for prompt tokens | 2024-01-06 14:12:44 +05:30
Krrish Dholakia | 5c45e69a5e | test(test_proxy_server_keys.py): add logic for connecting/disconnecting from http server | 2024-01-06 14:09:10 +05:30
Krrish Dholakia | b51d98c6e3 | docs: fix pip install litellm[proxy] instruction | 2024-01-06 13:49:15 +05:30
Krrish Dholakia | bf56179da8 | fix(proxy/utils.py): increase http connection pool for prisma | 2024-01-06 13:45:30 +05:30
ishaan-jaff | 4a076350cc | (ci/cd) move to old version of test_proxy_server_keys.py | 2024-01-06 13:03:12 +05:30
ishaan-jaff | 41bfd43a48 | (ci/cd) pin anyio / async dependencies | 2024-01-06 12:38:56 +05:30
ishaan-jaff | 3bb49447bc | (ci/cd) fix event loop bug proxy_test | 2024-01-06 12:30:43 +05:30
ishaan-jaff | 250672eddc | (ci/cd) temp fix - check if model_dump_json exists | 2024-01-06 12:23:39 +05:30
ishaan-jaff | 79fd2380bb | (ci/cd) run again | 2024-01-06 12:11:31 +05:30
ishaan-jaff | 65ac4c1acb | (ci/cd) run again | 2024-01-06 11:57:31 +05:30
ishaan-jaff | 0ebd0653c5 | (ci/cd) make prisma tests async | 2024-01-06 11:43:23 +05:30
ishaan-jaff | 357c6c56bd | Revert "build(Dockerfile): move prisma build to dockerfile" (reverts commit 2741835605) | 2024-01-06 09:51:44 +05:30
spdustin@gmail.com | 6201ab2c21 | Update factory (and tests) for Claude 2.1 via Bedrock | 2024-01-05 23:32:32 +00:00
spdustin@gmail.com | 5d074f5b56 | Adds tests and updates docs for Claude "pre-fill" | 2024-01-05 22:58:41 +00:00
Dustin Miller | 53e5e1df07 | Merge branch 'BerriAI:main' into feature_allow_claude_prefill | 2024-01-05 15:15:29 -06:00
ishaan-jaff | ae54e6d8b0 | (ci/cd) proxy:test_add_new_key | 2024-01-05 22:53:03 +05:30
ishaan-jaff | 40aaac69cc | (ci/cd) add print_verbose for /key/generate | 2024-01-05 22:38:46 +05:30
ishaan-jaff | dfdd329ddf | (ci/cd) pytest event loop fixture | 2024-01-05 22:28:34 +05:30
ishaan-jaff | 050c289ed1 | (ci/cd) test fixture | 2024-01-05 22:15:08 +05:30
ishaan-jaff | d9fd38ae16 | (fix) revert 469ae0a | 2024-01-05 22:06:39 +05:30
ishaan-jaff | 41f5cb7f04 | (fix) prisma set DATABASE_URL in env | 2024-01-05 20:57:27 +05:30
ishaan-jaff | 898c072103 | (fix) proxy - self.connect() for get_data() | 2024-01-05 20:48:16 +05:30
Krrish Dholakia | 2741835605 | build(Dockerfile): move prisma build to dockerfile (seems to solve https://github.com/BerriAI/litellm/issues/1321) | 2024-01-05 19:03:41 +05:30
ishaan-jaff | 6f9d3fc3bc | (ci/cd) retry hosted ollama + stream test 3 times | 2024-01-05 18:02:20 +05:30
ishaan-jaff | 0eb899c087 | (test) hosted ollama - retry 3 times | 2024-01-05 17:58:59 +05:30
ishaan-jaff | 90973d92bf | (fix) re-connect prisma if not connected | 2024-01-05 17:58:23 +05:30
ishaan-jaff | d2578f0cd2 | (ci/cd) proxy print_verbose on failing insert_data | 2024-01-05 17:28:27 +05:30
ishaan-jaff | 76b2db4492 | (ci/cd) run test again | 2024-01-05 16:40:56 +05:30
ishaan-jaff | 69bac0dbf6 | (ci/cd) test proxy - init prisma in test | 2024-01-05 16:18:23 +05:30
ishaan-jaff | 4679c7b99a | (fix) caching use same "created" in response_object | 2024-01-05 16:03:56 +05:30