Krrish Dholakia | 41483d2660 | feat(factory.py): add support for anthropic system prompts for claude 2.1 | 2023-11-21 09:57:26 -08:00
Krrish Dholakia | 855964ed45 | fix(utils.py): adding support for rules + mythomax/alpaca prompt template | 2023-11-20 18:58:15 -08:00
Krrish Dholakia | 7ef1014e59 | fix(factory.py): for ollama models check if it's instruct or not before applying prompt template | 2023-11-16 15:45:08 -08:00
Krrish Dholakia | ce27e08e7d | (fix): llama-2 non-chat models prompt template | 2023-11-07 21:33:54 -08:00
Krrish Dholakia | e3a1c58dd9 | build(litellm_server/utils.py): add support for general settings + num retries as a module variable | 2023-11-02 20:56:41 -07:00
Krrish Dholakia | 512a1637eb | feat(completion()): enable setting prompt templates via completion() | 2023-11-02 16:24:01 -07:00
Krrish Dholakia | 4b48af7c3c | fix(anthropic.py-+-bedrock.py): anthropic prompt format | 2023-10-20 10:56:15 -07:00
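The anthropic-related commits above (41483d2660, 4b48af7c3c) concern the legacy Anthropic text-completion prompt format. As a minimal sketch, assuming the well-known `\n\nHuman:` / `\n\nAssistant:` turn markers and Claude 2.1's convention of accepting a system prompt as bare text before the first Human turn (`anthropic_prompt` is a hypothetical helper, not the repo's actual factory.py code):

```python
# Hypothetical sketch of the legacy Anthropic text-completion prompt format.
# Claude 2.1 accepts a system prompt as plain text placed before the first
# "\n\nHuman:" turn; the prompt must end with an open "\n\nAssistant:" turn.
HUMAN = "\n\nHuman: "
ASSISTANT = "\n\nAssistant: "

def anthropic_prompt(messages):
    """Flatten OpenAI-style chat messages into a single Anthropic text prompt."""
    prompt = ""
    for m in messages:
        if m["role"] == "system":
            prompt += m["content"]            # bare leading system text (claude-2.1)
        elif m["role"] == "user":
            prompt += HUMAN + m["content"]
        elif m["role"] == "assistant":
            prompt += ASSISTANT + m["content"]
    return prompt + "\n\nAssistant:"          # leave the final Assistant turn open

print(anthropic_prompt([
    {"role": "system", "content": "You are terse."},
    {"role": "user", "content": "Hi"},
]))
```

This only covers the simple case of a trailing user message; handling an assistant-final message would require skipping the appended open turn.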
Krrish Dholakia | 7358d2e4ea | bump: version 0.8.4 → 0.8.5 | 2023-10-14 16:43:06 -07:00
Krrish Dholakia | b28c055896 | feat(proxy_server): adds create-proxy feature | 2023-10-12 18:27:07 -07:00
Krrish Dholakia | 704be9dcd1 | feat(factory.py): option to add function details to prompt, if model doesn't support functions param | 2023-10-09 09:53:53 -07:00
Krrish Dholakia | 306a38880d | feat(ollama.py): exposing ollama config | 2023-10-06 15:52:58 -07:00
Krrish Dholakia | 7e34736a38 | fix(add-custom-success-callback-for-streaming): add custom success callback for streaming | 2023-10-06 15:02:02 -07:00
Krrish Dholakia | 3ca79a88bb | improvements to proxy cli and finish reason mapping for anthropic | 2023-09-30 18:09:16 -07:00
Krrish Dholakia | 16c755257b | add support for custom hf prompt templates | 2023-09-30 15:37:30 -07:00
Krrish Dholakia | e8ec3e8795 | add mistral prompt templating | 2023-09-29 21:41:28 -07:00
Krrish Dholakia | 45293613ba | fix meta llama prompt template mapping bug | 2023-09-18 21:24:41 -07:00
Krrish Dholakia | 633e36de42 | handle llama 2 eos tokens in streaming | 2023-09-18 13:44:19 -07:00
Phodaie | 35b5d773c8 | code typo in falcon related prompt factory | 2023-09-17 15:40:36 +00:00
Krrish Dholakia | 0ace48d719 | update custom prompt template function | 2023-09-06 13:14:36 -07:00
Krrish Dholakia | 3d6836417e | adding prompt template for falcon 180b | 2023-09-06 08:44:13 -07:00
Krrish Dholakia | af33a85043 | only use tgai's prompt template for llama2 instruct models | 2023-09-05 12:25:52 -07:00
Krrish Dholakia | 64f3d3c56e | prompt formatting for together ai llama2 models | 2023-09-05 11:57:13 -07:00
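Several of the commits above (64f3d3c56e, af33a85043, 45293613ba, ce27e08e7d) deal with the Llama-2 instruct prompt template, which is only applied to instruct/chat variants; non-chat base models take the raw prompt. As a minimal sketch, assuming Meta's published `[INST]` / `<<SYS>>` markers (`llama2_prompt` is a hypothetical helper, not the repo's actual factory.py code):

```python
# Hypothetical sketch of the Llama-2 instruct prompt template: the system
# message is wrapped in <<SYS>> markers inside the first [INST] block.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def llama2_prompt(system: str, user: str) -> str:
    """Wrap a single system + user turn in Llama-2's instruct markers."""
    return f"<s>{B_INST} {B_SYS}{system}{E_SYS}{user} {E_INST}"

print(llama2_prompt("Be brief.", "What is litellm?"))
```

Applying this template to a non-instruct base model is exactly the bug class commits like ce27e08e7d fix, which is why the ollama commit (7ef1014e59) checks for an instruct variant first.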