Commit graph

326 commits

Author SHA1 Message Date
WilliamEspegren
d2bdfecd96 remove unused parameter 2023-09-17 13:12:49 +02:00
WilliamEspegren
5544a9251f rebuild chunks to openAI response 2023-09-17 13:07:54 +02:00
    Rebuild the chunks, but does not include the "usage"
WilliamEspegren
43a18b9528 add stream_chunk_builder function 2023-09-17 10:26:34 +02:00
Krrish Dholakia
c829798e04 bump version and clean print statements 2023-09-16 20:59:24 -07:00
Krrish Dholakia
e4fbc8d908 fix cohere streaming 2023-09-16 19:36:41 -07:00
Krrish Dholakia
e44c218c1b fix streaming formatting for non-openai models 2023-09-16 19:20:07 -07:00
Krrish Dholakia
122c993e6f except custom openai proxy 2023-09-16 16:15:44 -07:00
Krrish Dholakia
ce827faa93 fixes to testing 2023-09-16 11:40:01 -07:00
Krrish Dholakia
21cd55ab26 ensure streaming format is exactly the same as openai 2023-09-16 10:35:20 -07:00
ishaan-jaff
c714372b9d streaming for amazon titan bedrock 2023-09-16 09:57:16 -07:00
Toni Engelhardt
73bea9345f fix setting mock response 2023-09-16 17:03:32 +01:00
ishaan-jaff
5ec1fc5048 bump version 2023-09-15 14:15:09 -07:00
ishaan-jaff
abb3793e50 tests 2023-09-15 13:38:26 -07:00
ishaan-jaff
b2c4c3576c add new tg ai llm check 2023-09-15 09:29:46 -07:00
ishaan-jaff
38bdb9335c add bedrock to main and init 2023-09-14 13:51:09 -07:00
Krrish Dholakia
3b4064a58f move cohere to http endpoint 2023-09-14 11:17:38 -07:00
Krrish Dholakia
e2ea4adb84 fixes to mock completion 2023-09-14 10:03:57 -07:00
Krish Dholakia
7c9779f0ac Merge pull request #371 from promptmetheus/simplify-mock-logic 2023-09-14 09:23:45 -07:00
    Simplify mock logic
Krrish Dholakia
f98da9f13c adding support for nlp cloud 2023-09-14 09:19:34 -07:00
ishaan-jaff
61737e67a1 pass api_type as an arg in ChatCompl 2023-09-14 08:48:59 -07:00
Toni Engelhardt
900cf2c2ee simplify mock logic 2023-09-14 16:13:31 +01:00
    Adds shortcut for the mock_completion method.
Toni Engelhardt
630b5d2c1f add kwargs to mock_completion 2023-09-14 10:27:19 +01:00
    Allows replacing `completion` with `mock_completion` without encountering errors.
Krrish Dholakia
aaa57abddd map finish reason 2023-09-13 19:22:38 -07:00
Krrish Dholakia
5b6b9a9fab huggingface conversational task support 2023-09-13 13:45:23 -07:00
ishaan-jaff
e3fa1d686e remove verify_access_key from main 2023-09-12 11:50:30 -07:00
Ishaan Jaff
60e3e42fba Merge branch 'main' into main 2023-09-12 11:43:03 -07:00
Krrish Dholakia
baa69734b0 raise better exception if llm provider isn't passed in or inferred 2023-09-12 11:28:50 -07:00
ishaan-jaff
0ffcf51445 fix azure logging 2023-09-12 10:35:47 -07:00
ishaan-jaff
0f89636b47 use api_version for azure completion call 2023-09-12 10:35:24 -07:00
William Espegren
261db15bcf util: verify_access_key 2023-09-12 18:48:23 +02:00
    Verify that the user provided a valid OpenAI token by creating a request to the OpenAI endpoint.
Ishaan Jaff
fdddd00bfd Merge pull request #330 from Taik/main 2023-09-11 17:04:25 -07:00
    Add support for overriding API type for Azure calls
Krrish Dholakia
cc1313492b code cleanup 2023-09-11 16:32:17 -07:00
Krrish Dholakia
cbc7e6dbc2 mock responses for streaming 2023-09-11 16:30:29 -07:00
Thinh Nguyen
8801f71a52 Add support for overriding API type for Azure calls 2023-09-11 16:28:00 -07:00
Krrish Dholakia
6da500c6e0 add mock request to docs 2023-09-11 12:19:13 -07:00
ishaan-jaff
1ef80c9a2a begin using litellm.api_key 2023-09-11 07:32:20 -07:00
Krrish Dholakia
beecb60f51 update testing 2023-09-09 16:35:38 -07:00
Krrish Dholakia
a39756bfda add api manager 2023-09-09 15:55:38 -07:00
ishaan-jaff
bc1ad908c0 add support for calling openai proxy using lite 2023-09-09 15:47:29 -07:00
ishaan-jaff
56bd8c1c52 olla upgrades, fix streaming, add non streaming resp 2023-09-09 14:07:13 -07:00
ishaan-jaff
9f7d397014 add cache params 2023-09-08 20:18:12 -07:00
ishaan-jaff
8180ba273b updating caching tests 2023-09-08 20:15:15 -07:00
ishaan-jaff
0ab62f13e8 caching updates 2023-09-08 18:06:47 -07:00
ishaan-jaff
1c9169e563 raise cohere, vertex, tenacity import errors 2023-09-08 15:26:09 -07:00
ishaan-jaff
632d928bfb comments about using hosted vllm 2023-09-08 14:06:52 -07:00
ishaan-jaff
c45e2ed48c hosted vllm usage 2023-09-08 13:58:06 -07:00
ishaan-jaff
c05606b587 custom base with openai completion 2023-09-08 13:21:43 -07:00
ishaan-jaff
a611409e0f async streaming generator 2023-09-07 13:53:40 -07:00
Krrish Dholakia
6b3cb18983 fix linting issues 2023-09-06 20:43:59 -07:00
Krrish Dholakia
35cf6ef0a1 batch completions for vllm now works too 2023-09-06 19:26:19 -07:00