Commit graph

725 commits

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| ishaan-jaff | f95049bbe9 | test or | 2023-09-20 20:36:44 -07:00 |
| ishaan-jaff | 20e86e6a05 | fix test completion | 2023-09-20 20:14:18 -07:00 |
| ishaan-jaff | c844d26ff1 | fix azure | 2023-09-20 20:12:26 -07:00 |
| ishaan-jaff | 7b386ee4d8 | fix test_completion | 2023-09-20 20:11:32 -07:00 |
| ishaan-jaff | 2ba8e9bc02 | only show error log dashboard on fail | 2023-09-20 20:09:02 -07:00 |
| ishaan-jaff | fc10cf5eeb | exception_type work | 2023-09-19 21:29:51 -07:00 |
| ishaan-jaff | a230d08795 | test batch_completion_models | 2023-09-19 13:29:41 -07:00 |
| ishaan-jaff | 2c30855389 | new batch completion function | 2023-09-19 13:29:41 -07:00 |
| ishaan-jaff | 85862c1066 | petals remove print statement | 2023-09-19 10:56:30 -07:00 |
| ishaan-jaff | 946c81626d | fix petals imports | 2023-09-19 09:27:40 -07:00 |
| ishaan-jaff | 5f808f9f87 | petals on main | 2023-09-19 09:27:28 -07:00 |
| ishaan-jaff | 9c37619e5c | test petals | 2023-09-19 09:24:34 -07:00 |
| ishaan-jaff | bee1224aa2 | dev4 | 2023-09-19 09:24:34 -07:00 |
| ishaan-jaff | 385640b743 | petals fixes | 2023-09-19 09:24:34 -07:00 |
| ishaan-jaff | f6ccadabc8 | fix petals import | 2023-09-19 09:24:34 -07:00 |
| ishaan-jaff | a107b7b3ec | add petals file for completion | 2023-09-19 09:24:34 -07:00 |
| ishaan-jaff | 934deab1f7 | add petals to init and main | 2023-09-19 09:24:34 -07:00 |
| ishaan-jaff | e546aa1c76 | bump v | 2023-09-18 21:38:37 -07:00 |
| Krrish Dholakia | 45293613ba | fix meta llama prompt template mapping bug | 2023-09-18 21:24:41 -07:00 |
| Krrish Dholakia | 5acd1c9d47 | update budget manager | 2023-09-18 15:48:29 -07:00 |
| Krrish Dholakia | 633e36de42 | handle llama 2 eos tokens in streaming | 2023-09-18 13:44:19 -07:00 |
| Krrish Dholakia | f134de1287 | bug fix for litellm proxy implementation | 2023-09-18 12:54:56 -07:00 |
| Krrish Dholakia | 8c809db567 | bump version | 2023-09-18 12:27:18 -07:00 |
| Krrish Dholakia | 9067ec3b43 | add support for litellm proxy calls | 2023-09-18 12:15:21 -07:00 |
| ishaan-jaff | 0f88b82c4f | add gpt-3.5-instruct | 2023-09-18 12:14:52 -07:00 |
| Krrish Dholakia | 5b294c704e | fix hf conversational task bug | 2023-09-18 11:46:36 -07:00 |
| ishaan-jaff | 6b548eeb28 | fix linting errors | 2023-09-18 10:57:55 -07:00 |
| ishaan-jaff | 0bee6e0d38 | cleanup | 2023-09-18 10:43:44 -07:00 |
| ishaan-jaff | 071e77ccad | hf non conv + tgi llms | 2023-09-18 10:43:40 -07:00 |
| ishaan-jaff | e7f4e8b4a4 | allow non tgi llms | 2023-09-18 10:26:57 -07:00 |
| Krish Dholakia | e83d89d12f | Merge pull request #387 from WilliamEspegren/main (Rebuild stream chunks to openAI object) | 2023-09-18 09:30:42 -07:00 |
| Krrish Dholakia | 78da8e7e65 | update readme with contributing.md information | 2023-09-18 09:30:30 -07:00 |
| ishaan-jaff | a5425df91f | doc string for completion | 2023-09-18 09:03:26 -07:00 |
| ishaan-jaff | d684f75683 | add doc string for a completion | 2023-09-18 08:49:27 -07:00 |
| ishaan-jaff | 4a17a6c2b3 | add doc string for completion_cost | 2023-09-18 08:44:40 -07:00 |
| WilliamEspegren | 518deae394 | resolve merge conflicts | 2023-09-18 09:15:03 +02:00 |
| WilliamEspegren | 63a8614a14 | Added function_call support to stream_chunk_builder | 2023-09-17 19:42:55 +02:00 |
| WilliamEspegren | 404af1be0f | if "function_call" find name | 2023-09-17 19:02:50 +02:00 |
| Phodaie | 35b5d773c8 | code typo in falcon related prompt factory | 2023-09-17 15:40:36 +00:00 |
| Krrish Dholakia | abe3286ec6 | bump version | 2023-09-17 08:07:19 -07:00 |
| Krrish Dholakia | 4daaa1cf91 | clean out print statements | 2023-09-17 05:52:19 -07:00 |
| WilliamEspegren | d2bdfecd96 | remove unused parameter | 2023-09-17 13:12:49 +02:00 |
| WilliamEspegren | 5544a9251f | rebuild chunks to openAI response (rebuilds the chunks, but does not include the "usage") | 2023-09-17 13:07:54 +02:00 |
| WilliamEspegren | e07b3f5bfe | append chunk to chunks | 2023-09-17 10:29:09 +02:00 |
| WilliamEspegren | fe37f97423 | add empty chunks list | 2023-09-17 10:28:37 +02:00 |
| WilliamEspegren | a39a204457 | import stream_chunk_builder | 2023-09-17 10:28:01 +02:00 |
| WilliamEspegren | 71afdbc0e5 | add max_tokens | 2023-09-17 10:27:32 +02:00 |
| WilliamEspegren | 43a18b9528 | add stream_chunk_builder function | 2023-09-17 10:26:34 +02:00 |
| WilliamEspegren | 14fd6770be | Merge branch 'main' of https://github.com/WilliamEspegren/litellm | 2023-09-17 09:29:10 +02:00 |
| Krrish Dholakia | c829798e04 | bump version and clean print statements | 2023-09-16 20:59:24 -07:00 |