Krrish Dholakia
|
82d28a8825
|
fix(factory.py): safely fail prompt template get requests for together ai
|
2023-12-12 17:28:22 -08:00 |
|
Krrish Dholakia
|
8e7116635f
|
fix(ollama.py): add support for async streaming
|
2023-12-12 16:44:20 -08:00 |
|
Krrish Dholakia
|
8b07a6c046
|
fix(main.py): pass user_id + encoding_format for logging + to openai/azure
|
2023-12-12 15:46:44 -08:00 |
|
ishaan-jaff
|
a251a52717
|
(chore) remove junk tkinter import
|
2023-12-12 13:54:50 -08:00 |
|
ishaan-jaff
|
99b48eff17
|
(fix) tkinter import
|
2023-12-12 12:18:25 -08:00 |
|
Krrish Dholakia
|
9cf5ab468f
|
fix(router.py): deepcopy initial model list, don't mutate it
|
2023-12-12 09:54:06 -08:00 |
|
Krrish Dholakia
|
2c1c75fdf0
|
fix(ollama.py): enable parallel ollama completion calls
|
2023-12-11 23:18:37 -08:00 |
|
Krrish Dholakia
|
ad39afc0ad
|
test(test_custom_callback_input.py): embedding callback tests for azure, openai, bedrock
|
2023-12-11 15:32:46 -08:00 |
|
Krrish Dholakia
|
b09ecb986e
|
test(test_custom_callback_input.py): add bedrock testing
|
2023-12-11 13:00:01 -08:00 |
|
Krrish Dholakia
|
ea89a8a938
|
test(test_custom_callback_unit.py): adding unit tests for custom callbacks + fixing related bugs
|
2023-12-11 11:44:09 -08:00 |
|
Krish Dholakia
|
4ffe6a4296
|
Merge pull request #1054 from James4Ever0/main
Update factory.py to fix issue when calling from write-the -> langchain -> litellm served ollama
|
2023-12-11 07:18:02 -08:00 |
|
Krish Dholakia
|
bbbc5db104
|
Merge pull request #1080 from nbaldwin98/fixing-replicate-sys-prompt
fix replicate system prompt: forgot to add **optional_params to input data
|
2023-12-11 07:11:52 -08:00 |
|
chabala98
|
c5ce11541b
|
fix: added **optional_params in input data when system prompt is available (allows to pass other optional params apart from sys prompt)
|
2023-12-11 14:42:05 +01:00 |
|
James4Ever0
|
69fc2694bb
|
Update factory.py
Fixing issue when calling from write-the -> langchain -> litellm served ollama
|
2023-12-08 02:58:28 +08:00 |
|
ishaan-jaff
|
d2eee342fb
|
(feat) vertex ai - better debugging
|
2023-12-07 09:38:37 -08:00 |
|
Krrish Dholakia
|
c1e95740b0
|
fix(bedrock.py): fix output format for cohere embeddings
|
2023-12-06 22:47:01 -08:00 |
|
Krrish Dholakia
|
ac7d0a1632
|
fix(together_ai.py): improve together ai custom prompt templating
|
2023-12-06 19:34:49 -08:00 |
|
Krrish Dholakia
|
fff0228c20
|
fix(factory.py): support togethercomputer codellama pt
|
2023-12-06 19:02:58 -08:00 |
|
Krrish Dholakia
|
0295509b3b
|
fix(factory.py): fix claude 2.1 prompt template to handle system, assistant, user prompt
|
2023-12-06 18:02:06 -08:00 |
|
Krrish Dholakia
|
f1c1ec8523
|
fix(bedrock.py): fix embeddings call
|
2023-12-06 14:16:00 -08:00 |
|
Krrish Dholakia
|
b24c9b4cbf
|
refactor: fix linting
|
2023-12-06 13:27:40 -08:00 |
|
Krrish Dholakia
|
d962d5d4c0
|
fix(bedrock.py): adding support for cohere embeddings
|
2023-12-06 13:25:18 -08:00 |
|
Krrish Dholakia
|
102de97960
|
refactor: fix linting errors
|
2023-12-06 11:46:15 -08:00 |
|
Krrish Dholakia
|
94f065f83c
|
feat(sagemaker.py): support huggingface embedding models
|
2023-12-06 11:41:38 -08:00 |
|
Krrish Dholakia
|
648d41c96f
|
fix(sagemaker.py): prompt templating fixes
|
2023-12-05 17:47:44 -08:00 |
|
Krrish Dholakia
|
ff949490de
|
docs(input.md): add hf_model_name to docs
|
2023-12-05 16:56:18 -08:00 |
|
Krrish Dholakia
|
88845dddb1
|
fix(sagemaker.py): bring back llama2 templating for sagemaker
|
2023-12-05 16:42:19 -08:00 |
|
Krrish Dholakia
|
54d8a9df3f
|
fix(sagemaker.py): enable passing hf model name for prompt template
|
2023-12-05 16:31:59 -08:00 |
|
Krrish Dholakia
|
a38504ff1b
|
fix(sagemaker.py): fix meta llama model name for sagemaker custom deployment
|
2023-12-05 16:23:03 -08:00 |
|
Krrish Dholakia
|
3c60682eb4
|
fix(sagemaker.py): accept all amazon neuron llama2 models
|
2023-12-05 16:19:28 -08:00 |
|
Krrish Dholakia
|
01fc7f1931
|
fix(sagemaker.py): add support for amazon neuron llama models
|
2023-12-05 16:18:20 -08:00 |
|
Krrish Dholakia
|
b4c78c7b9e
|
fix(utils.py): support sagemaker llama2 custom endpoints
|
2023-12-05 16:05:15 -08:00 |
|
Krrish Dholakia
|
71e64c34cb
|
fix(huggingface_restapi.py): raise better exceptions for unprocessable hf responses
|
2023-12-05 07:28:21 -08:00 |
|
Krish Dholakia
|
b90fcbdac4
|
Merge pull request #970 from nbaldwin98/fixing-replicate-sys-prompt
fix system prompts for replicate
|
2023-12-04 16:39:44 -08:00 |
|
ishaan-jaff
|
32ecc1a677
|
(feat) replicate/deployments: add POST Req view
|
2023-12-04 13:43:03 -08:00 |
|
chabala98
|
c2e2e927fb
|
fix system prompts for replicate
|
2023-12-01 13:16:35 +01:00 |
|
ishaan-jaff
|
1081d4c766
|
(feat) aembedding: return raw openai response
|
2023-11-30 20:02:47 -08:00 |
|
Krrish Dholakia
|
c473abde49
|
fix(azure.py): logging fix
|
2023-11-30 14:13:40 -08:00 |
|
Krrish Dholakia
|
032f71adb2
|
fix(router.py): support cloudflare ai gateway for azure models on router
|
2023-11-30 14:09:06 -08:00 |
|
Krrish Dholakia
|
82553e8aac
|
fix(azure.py): fix linting errors
|
2023-11-30 13:32:29 -08:00 |
|
Krrish Dholakia
|
4f07c8565a
|
feat(main.py): add support for azure-openai via cloudflare ai gateway
|
2023-11-30 13:19:49 -08:00 |
|
ishaan-jaff
|
4ed5b3b46d
|
(chore) linting fix
|
2023-11-29 19:58:12 -08:00 |
|
Krrish Dholakia
|
1f5a1122fc
|
fix(replicate.py): fix custom prompt formatting
|
2023-11-29 19:44:09 -08:00 |
|
ishaan-jaff
|
c05da0797b
|
(feat) Embedding: Async Azure
|
2023-11-29 19:43:47 -08:00 |
|
ishaan-jaff
|
09caab549a
|
(feat) async embeddings: OpenAI
|
2023-11-29 19:35:08 -08:00 |
|
Krrish Dholakia
|
ab76daa90b
|
fix(bedrock.py): support ai21 / bedrock streaming
|
2023-11-29 16:35:06 -08:00 |
|
ishaan-jaff
|
9bf603889f
|
(fix) azure: remove max retries before completion
|
2023-11-29 16:09:31 -08:00 |
|
Krrish Dholakia
|
a9ed768991
|
fix(azure.py): fix error handling for openai/azure streaming
|
2023-11-29 11:52:24 -08:00 |
|
ishaan-jaff
|
0f0ddcc0fb
|
(fix) using AzureOpenAI client
|
2023-11-28 17:17:40 -08:00 |
|
ishaan-jaff
|
8609694b49
|
(fix) completion:openai-pop out max_retries from completion kwargs
|
2023-11-28 17:09:58 -08:00 |
|