Krrish Dholakia
|
a09a6f24a4
|
fix(together_ai.py): additional logging for together ai encoding prompt
|
2023-12-15 10:39:23 -08:00 |
|
Krrish Dholakia
|
cab870f73a
|
fix(ollama.py): fix ollama async streaming for /completions calls
|
2023-12-15 09:28:32 -08:00 |
|
ishaan-jaff
|
85a3c67574
|
(feat) - acompletion, correct exception mapping
|
2023-12-15 08:28:12 +05:30 |
|
Krrish Dholakia
|
804d58eb20
|
bump: version 1.14.4 → 1.14.5.dev1
|
2023-12-14 15:23:52 -08:00 |
|
Krrish Dholakia
|
1608dd7e0b
|
fix(main.py): support async streaming for text completions endpoint
|
2023-12-14 13:56:32 -08:00 |
|
Krish Dholakia
|
a6e78497b5
|
Merge pull request #1122 from emsi/main
Fix #1119, no content when streaming.
|
2023-12-14 10:01:00 -08:00 |
|
Krrish Dholakia
|
e678009695
|
fix(vertex_ai.py): add exception mapping for acompletion calls
|
2023-12-13 16:35:50 -08:00 |
|
Krrish Dholakia
|
7b8851cce5
|
fix(ollama.py): fix async completion calls for ollama
|
2023-12-13 13:10:25 -08:00 |
|
Mariusz Woloszyn
|
1feb6317f6
|
Fix #1119, no content when streaming.
|
2023-12-13 21:42:35 +01:00 |
|
Krrish Dholakia
|
75bcb37cb2
|
fix(factory.py): fix tgai rendering template
|
2023-12-13 12:27:31 -08:00 |
|
Krrish Dholakia
|
69c29f8f86
|
fix(vertex_ai.py): add support for real async streaming + completion calls
|
2023-12-13 11:53:55 -08:00 |
|
Krrish Dholakia
|
07015843ac
|
fix(vertex_ai.py): support optional params + enable async calls for gemini
|
2023-12-13 11:01:23 -08:00 |
|
Krrish Dholakia
|
ef7a6e3ae1
|
feat(vertex_ai.py): adds support for gemini-pro on vertex ai
|
2023-12-13 10:26:30 -08:00 |
|
ishaan-jaff
|
86e626edab
|
(feat) pass vertex_ai/ as custom_llm_provider
|
2023-12-13 19:02:24 +03:00 |
|
Krrish Dholakia
|
a64bd2ca1e
|
fix(sagemaker.py): filter out templated prompt if in model response
|
2023-12-13 07:43:33 -08:00 |
|
Krrish Dholakia
|
82d28a8825
|
fix(factory.py): safely fail prompt template get requests for together ai
|
2023-12-12 17:28:22 -08:00 |
|
Krrish Dholakia
|
8e7116635f
|
fix(ollama.py): add support for async streaming
|
2023-12-12 16:44:20 -08:00 |
|
Krrish Dholakia
|
8b07a6c046
|
fix(main.py): pass user_id + encoding_format for logging + to openai/azure
|
2023-12-12 15:46:44 -08:00 |
|
ishaan-jaff
|
a251a52717
|
(chore) remove junk tkinter import
|
2023-12-12 13:54:50 -08:00 |
|
ishaan-jaff
|
99b48eff17
|
(fix) tkinter import
|
2023-12-12 12:18:25 -08:00 |
|
Krrish Dholakia
|
9cf5ab468f
|
fix(router.py): deepcopy initial model list, don't mutate it
|
2023-12-12 09:54:06 -08:00 |
|
Krrish Dholakia
|
2c1c75fdf0
|
fix(ollama.py): enable parallel ollama completion calls
|
2023-12-11 23:18:37 -08:00 |
|
Krrish Dholakia
|
ad39afc0ad
|
test(test_custom_callback_input.py): embedding callback tests for azure, openai, bedrock
|
2023-12-11 15:32:46 -08:00 |
|
Krrish Dholakia
|
b09ecb986e
|
test(test_custom_callback_input.py): add bedrock testing
|
2023-12-11 13:00:01 -08:00 |
|
Krrish Dholakia
|
ea89a8a938
|
test(test_custom_callback_unit.py): adding unit tests for custom callbacks + fixing related bugs
|
2023-12-11 11:44:09 -08:00 |
|
Krish Dholakia
|
4ffe6a4296
|
Merge pull request #1054 from James4Ever0/main
Update factory.py to fix issue when calling from write-the -> langchain -> litellm served ollama
|
2023-12-11 07:18:02 -08:00 |
|
Krish Dholakia
|
bbbc5db104
|
Merge pull request #1080 from nbaldwin98/fixing-replicate-sys-prompt
fix replicate system prompt: forgot to add **optional_params to input data
|
2023-12-11 07:11:52 -08:00 |
|
chabala98
|
c5ce11541b
|
fix: added **optional_params in input data when system prompt is available (allows to pass other optional params apart from sys prompt)
|
2023-12-11 14:42:05 +01:00 |
|
James4Ever0
|
69fc2694bb
|
Update factory.py
Fixing issue when calling from write-the -> langchain -> litellm served ollama
|
2023-12-08 02:58:28 +08:00 |
|
ishaan-jaff
|
d2eee342fb
|
(feat) vertex ai - better debugging
|
2023-12-07 09:38:37 -08:00 |
|
Krrish Dholakia
|
c1e95740b0
|
fix(bedrock.py): fix output format for cohere embeddings
|
2023-12-06 22:47:01 -08:00 |
|
Krrish Dholakia
|
ac7d0a1632
|
fix(together_ai.py): improve together ai custom prompt templating
|
2023-12-06 19:34:49 -08:00 |
|
Krrish Dholakia
|
fff0228c20
|
fix(factory.py): support togethercomputer codellama pt
|
2023-12-06 19:02:58 -08:00 |
|
Krrish Dholakia
|
0295509b3b
|
fix(factory.py): fix claude 2.1 prompt template to handle system, assistant, user prompt
|
2023-12-06 18:02:06 -08:00 |
|
Krrish Dholakia
|
f1c1ec8523
|
fix(bedrock.py): fix embeddings call
|
2023-12-06 14:16:00 -08:00 |
|
Krrish Dholakia
|
b24c9b4cbf
|
refactor: fix linting
|
2023-12-06 13:27:40 -08:00 |
|
Krrish Dholakia
|
d962d5d4c0
|
fix(bedrock.py): adding support for cohere embeddings
|
2023-12-06 13:25:18 -08:00 |
|
Krrish Dholakia
|
102de97960
|
refactor: fix linting errors
|
2023-12-06 11:46:15 -08:00 |
|
Krrish Dholakia
|
94f065f83c
|
feat(sagemaker.py): support huggingface embedding models
|
2023-12-06 11:41:38 -08:00 |
|
Krrish Dholakia
|
648d41c96f
|
fix(sagemaker.py): prompt templating fixes
|
2023-12-05 17:47:44 -08:00 |
|
Krrish Dholakia
|
ff949490de
|
docs(input.md): add hf_model_name to docs
|
2023-12-05 16:56:18 -08:00 |
|
Krrish Dholakia
|
88845dddb1
|
fix(sagemaker.py): bring back llama2 templating for sagemaker
|
2023-12-05 16:42:19 -08:00 |
|
Krrish Dholakia
|
54d8a9df3f
|
fix(sagemaker.py): enable passing hf model name for prompt template
|
2023-12-05 16:31:59 -08:00 |
|
Krrish Dholakia
|
a38504ff1b
|
fix(sagemaker.py): fix meta llama model name for sagemaker custom deployment
|
2023-12-05 16:23:03 -08:00 |
|
Krrish Dholakia
|
3c60682eb4
|
fix(sagemaker.py): accept all amazon neuron llama2 models
|
2023-12-05 16:19:28 -08:00 |
|
Krrish Dholakia
|
01fc7f1931
|
fix(sagemaker.py): add support for amazon neuron llama models
|
2023-12-05 16:18:20 -08:00 |
|
Krrish Dholakia
|
b4c78c7b9e
|
fix(utils.py): support sagemaker llama2 custom endpoints
|
2023-12-05 16:05:15 -08:00 |
|
Krrish Dholakia
|
71e64c34cb
|
fix(huggingface_restapi.py): raise better exceptions for unprocessable hf responses
|
2023-12-05 07:28:21 -08:00 |
|
Krish Dholakia
|
b90fcbdac4
|
Merge pull request #970 from nbaldwin98/fixing-replicate-sys-prompt
fix system prompts for replicate
|
2023-12-04 16:39:44 -08:00 |
|
ishaan-jaff
|
32ecc1a677
|
(feat) replicate/deployments: add POST Req view
|
2023-12-04 13:43:03 -08:00 |
|