hash | author | date | subject
4ea3e778f7 | Krish Dholakia | 2024-01-08 10:48:15 +05:30 | Merge pull request #1315 from spdustin/feature_allow_claude_prefill: Adds "pre-fill" support for Claude
1507217725 | Krrish Dholakia | 2024-01-06 22:50:44 +05:30 | fix(factory.py): more logging around the image loading for gemini
5fd2f945f3 | Krrish Dholakia | 2024-01-06 22:36:22 +05:30 | fix(factory.py): support gemini-pro-vision on google ai studio (https://github.com/BerriAI/litellm/issues/1329)
6201ab2c21 | spdustin@gmail.com | 2024-01-05 23:32:32 +00:00 | Update factory (and tests) for Claude 2.1 via Bedrock
b10f64face | Dustin Miller | 2024-01-03 18:45:36 -06:00 | Adds "pre-fill" support for Claude
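The "pre-fill" commits above concern how the prompt factory renders a trailing assistant message for Claude: instead of opening a fresh `Assistant:` turn, the model is asked to continue from caller-supplied text. A minimal sketch of the idea, assuming the classic `Human:`/`Assistant:` text format (the helper name and logic here are illustrative, not litellm's actual code):

```python
# Sketch of Anthropic-style prompt rendering with assistant "pre-fill".
# Hypothetical helper; the real factory.py handles more roles and edge cases.

def anthropic_pt(messages: list[dict]) -> str:
    """Render chat messages into the "\n\nHuman:/\n\nAssistant:" text format.

    If the final message is from the assistant, its content is left as a
    trailing "pre-fill" so the model continues from it rather than
    starting a fresh turn.
    """
    prompt = ""
    for msg in messages:
        if msg["role"] in ("user", "system"):
            prompt += f"\n\nHuman: {msg['content']}"
        else:
            prompt += f"\n\nAssistant: {msg['content']}"
    # Only open an empty assistant turn if the caller didn't supply a pre-fill.
    if messages[-1]["role"] != "assistant":
        prompt += "\n\nAssistant:"
    return prompt
```

A trailing assistant message such as `{"role": "assistant", "content": "{"}` then nudges the model to continue a JSON object rather than preface it with prose.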
4905929de3 | Krrish Dholakia | 2023-12-25 14:11:20 +05:30 | refactor: add black formatting
1262d89ab3 | Krrish Dholakia | 2023-12-24 09:42:58 +05:30 | feat(gemini.py): add support for completion calls for gemini-pro (google ai studio)
f0df28362a | Krrish Dholakia | 2023-12-20 14:59:55 +05:30 | feat(ollama.py): add support for ollama function calling
287633887e | ishaan-jaff | 2023-12-16 10:35:27 +05:30 | (feat) add ollama/llava
75bcb37cb2 | Krrish Dholakia | 2023-12-13 12:27:31 -08:00 | fix(factory.py): fix tgai rendering template
82d28a8825 | Krrish Dholakia | 2023-12-12 17:28:22 -08:00 | fix(factory.py): safely fail prompt template get requests for together ai
69fc2694bb | James4Ever0 | 2023-12-08 02:58:28 +08:00 | Update factory.py: fixing issue when calling from write-the -> langchain -> litellm served ollama
ac7d0a1632 | Krrish Dholakia | 2023-12-06 19:34:49 -08:00 | fix(together_ai.py): improve together ai custom prompt templating
fff0228c20 | Krrish Dholakia | 2023-12-06 19:02:58 -08:00 | fix(factory.py): support togethercomputer codellama pt
0295509b3b | Krrish Dholakia | 2023-12-06 18:02:06 -08:00 | fix(factory.py): fix claude 2.1 prompt template to handle system, assistant, user prompt
41483d2660 | Krrish Dholakia | 2023-11-21 09:57:26 -08:00 | feat(factory.py): add support for anthropic system prompts for claude 2.1
855964ed45 | Krrish Dholakia | 2023-11-20 18:58:15 -08:00 | fix(utils.py): adding support for rules + mythomax/alpaca prompt template
7ef1014e59 | Krrish Dholakia | 2023-11-16 15:45:08 -08:00 | fix(factory.py): for ollama models check if it's instruct or not before applying prompt template
ce27e08e7d | Krrish Dholakia | 2023-11-07 21:33:54 -08:00 | (fix): llama-2 non-chat models prompt template
e3a1c58dd9 | Krrish Dholakia | 2023-11-02 20:56:41 -07:00 | build(litellm_server/utils.py): add support for general settings + num retries as a module variable
512a1637eb | Krrish Dholakia | 2023-11-02 16:24:01 -07:00 | feat(completion()): enable setting prompt templates via completion()
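The "prompt templates via completion()" feature amounts to letting callers supply per-role delimiter strings that the factory wraps around each message. A minimal sketch of such a renderer, assuming a role-to-pre/post-delimiter mapping (the `role_dict` layout and function name here are illustrative, not litellm's exact schema):

```python
# Sketch of a caller-registered prompt template: each role gets "pre" and
# "post" delimiter strings wrapped around its message content.
# Hypothetical layout; litellm's actual custom-template schema may differ.

def render_custom_prompt(role_dict: dict, messages: list[dict],
                         initial_prompt: str = "",
                         final_prompt: str = "") -> str:
    """Wrap each message's content in its role's pre/post delimiters."""
    prompt = initial_prompt
    for msg in messages:
        delims = role_dict.get(msg["role"], {})
        prompt += delims.get("pre", "") + msg["content"] + delims.get("post", "")
    return prompt + final_prompt
```

Registered once per model, such a template lets the same `completion()` call produce, say, Mistral-style `[INST] ... [/INST]` prompts without the caller formatting strings by hand.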
4b48af7c3c | Krrish Dholakia | 2023-10-20 10:56:15 -07:00 | fix(anthropic.py + bedrock.py): anthropic prompt format
7358d2e4ea | Krrish Dholakia | 2023-10-14 16:43:06 -07:00 | bump: version 0.8.4 → 0.8.5
b28c055896 | Krrish Dholakia | 2023-10-12 18:27:07 -07:00 | feat(proxy_server): adds create-proxy feature
704be9dcd1 | Krrish Dholakia | 2023-10-09 09:53:53 -07:00 | feat(factory.py): option to add function details to prompt, if model doesn't support functions param
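The "add function details to prompt" commit targets models with no native functions parameter: the OpenAI-style function definitions get serialized into the prompt text itself. A sketch of the idea; the instruction wording and JSON layout are assumptions, not factory.py's actual formatting:

```python
# Sketch: fold OpenAI-style function definitions into the system prompt
# for models that lack a native "functions" parameter.
# The instruction text below is an assumption, not litellm's actual wording.
import json

def add_function_details_to_prompt(system_prompt: str,
                                   functions: list[dict]) -> str:
    specs = "\n".join(json.dumps(fn, indent=2) for fn in functions)
    return (
        f"{system_prompt}\n\n"
        "You have access to the following functions. To call one, respond "
        'with a JSON object containing "name" and "arguments":\n'
        f"{specs}"
    )
```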
306a38880d | Krrish Dholakia | 2023-10-06 15:52:58 -07:00 | feat(ollama.py): exposing ollama config
7e34736a38 | Krrish Dholakia | 2023-10-06 15:02:02 -07:00 | fix(add-custom-success-callback-for-streaming): add custom success callback for streaming
3ca79a88bb | Krrish Dholakia | 2023-09-30 18:09:16 -07:00 | improvements to proxy cli and finish reason mapping for anthropic
16c755257b | Krrish Dholakia | 2023-09-30 15:37:30 -07:00 | add support for custom hf prompt templates
e8ec3e8795 | Krrish Dholakia | 2023-09-29 21:41:28 -07:00 | add mistral prompt templating
45293613ba | Krrish Dholakia | 2023-09-18 21:24:41 -07:00 | fix meta llama prompt template mapping bug
633e36de42 | Krrish Dholakia | 2023-09-18 13:44:19 -07:00 | handle llama 2 eos tokens in streaming
35b5d773c8 | Phodaie | 2023-09-17 15:40:36 +00:00 | code typo in falcon related prompt factory
0ace48d719 | Krrish Dholakia | 2023-09-06 13:14:36 -07:00 | update custom prompt template function
3d6836417e | Krrish Dholakia | 2023-09-06 08:44:13 -07:00 | adding prompt template for falcon 180b
af33a85043 | Krrish Dholakia | 2023-09-05 12:25:52 -07:00 | only use tgai's prompt template for llama2 instruct models
64f3d3c56e | Krrish Dholakia | 2023-09-05 11:57:13 -07:00 | prompt formatting for together ai llama2 models
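Several of the oldest commits revolve around the Llama-2 instruct format used for together ai models. A simplified single-turn sketch of that template (the real one also handles multi-turn history and BOS/EOS tokens):

```python
# Sketch of the Llama-2 instruct prompt format referenced by the
# together ai / llama2 commits. Single-turn simplification.

def llama_2_chat_pt(system: str, user: str) -> str:
    """Wrap a system + user message in Llama-2's [INST]/<<SYS>> delimiters."""
    if system:
        return f"[INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]"
    return f"[INST] {user} [/INST]"
```

The "only use tgai's prompt template for llama2 instruct models" commit reflects the other half of this logic: base (non-chat) llama-2 models must not get the `[INST]` wrapping at all.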