Krrish Dholakia | 330708e7ef | fix(tests): fixing response objects for testing | 2023-11-13 14:39:30 -08:00
Krrish Dholakia | bdf801d987 | fix(together_ai.py): exception mapping for tgai | 2023-11-13 13:17:15 -08:00
Krrish Dholakia | d8121737d6 | test(test_completion.py): cleanup tests | 2023-11-13 11:23:38 -08:00
ishaan-jaff | a1f1262d18 | (fix) text completion response | 2023-11-13 10:29:23 -08:00
ishaan-jaff | f388000566 | (fix) deepinfra with openai v1.0.0 | 2023-11-13 09:51:22 -08:00
ishaan-jaff | 8656a6aff7 | (fix) token_counter - use openai token counter only for chat completion | 2023-11-13 08:00:27 -08:00
Krrish Dholakia | 8db19d0af4 | fix(utils.py): replacing openai.error import statements | 2023-11-11 19:25:21 -08:00
Krrish Dholakia | 4b74ddcb17 | refactor: fixing linting issues | 2023-11-11 18:52:28 -08:00
Krrish Dholakia | c0a757a25f | refactor(azure.py): working azure completion calls with openai v1 sdk | 2023-11-11 16:44:39 -08:00
Krrish Dholakia | 1ec07c0aba | refactor(openai.py): working openai chat + text completion for openai v1 sdk | 2023-11-11 16:25:10 -08:00
Krrish Dholakia | a5ec85b1f2 | refactor(openai.py): making it compatible for openai v1 | 2023-11-11 15:33:02 -08:00
    BREAKING CHANGE:
ishaan-jaff | 4408a1f806 | (fix) completion gpt-4 vision check finish_details or finish_reason | 2023-11-11 10:28:20 -08:00
Ishaan Jaff | 292b12b191 | Merge pull request #787 from duc-phamh/improve_message_trimming | 2023-11-11 09:39:43 -08:00
    Improve message trimming
Krrish Dholakia | 55b8432145 | fix(utils.py): caching for embedding | 2023-11-10 14:33:17 -08:00
ishaan-jaff | 5413696771 | (fix) streaming replicate | 2023-11-10 12:46:33 -08:00
Krrish Dholakia | c10b0646c1 | fix(utils.py): fix exception raised | 2023-11-10 11:28:17 -08:00
Krrish Dholakia | 548605def8 | fix(utils.py): return function call as part of response object | 2023-11-10 11:02:10 -08:00
Krrish Dholakia | 67e8b12a09 | fix(utils.py): fix cached responses - translate dict to objects | 2023-11-10 10:38:20 -08:00
Krrish Dholakia | d392f6efb5 | fix(utils.py): fix sync streaming | 2023-11-09 18:47:20 -08:00
Krrish Dholakia | af7468e9bc | fix(main.py): accepting azure deployment_id | 2023-11-09 18:16:02 -08:00
Krrish Dholakia | cfd2ccf429 | fix(utils.py): fix logging integrations | 2023-11-09 17:09:49 -08:00
Krrish Dholakia | 272a6dc9b0 | refactor(azure.py): enabling async streaming with aiohttp | 2023-11-09 16:41:06 -08:00
Krrish Dholakia | 9b278f567b | refactor(openai.py): support aiohttp streaming | 2023-11-09 16:15:30 -08:00
Duc Pham | c7ca8f75a2 | Another small refactoring | 2023-11-10 01:47:06 +07:00
Krrish Dholakia | 1d46891ceb | fix(azure.py): adding support for aiohttp calls on azure + openai | 2023-11-09 10:40:33 -08:00
Duc Pham | 61f2e37349 | Reverted error while refactoring | 2023-11-10 01:35:41 +07:00
Duc Pham | c74e6f8cdd | Improved trimming logic and OpenAI token counter | 2023-11-10 01:26:13 +07:00
Krrish Dholakia | 8ee4b1f603 | feat(utils.py): enable returning complete response when stream=true | 2023-11-09 09:17:51 -08:00
Krrish Dholakia | e66373bd47 | refactor(openai.py): moving openai text completion calls to http | 2023-11-08 18:40:03 -08:00
Krrish Dholakia | decf86b145 | refactor(openai.py): moving openai chat completion calls to http | 2023-11-08 17:40:41 -08:00
Krrish Dholakia | 17f5e46080 | refactor(azure.py): moving azure openai calls to http calls | 2023-11-08 16:52:18 -08:00
ishaan-jaff | 11ee52207e | (feat) add streaming for text_completion | 2023-11-08 11:58:07 -08:00
ishaan-jaff | 106ccc2b94 | (fix) text_completion don't pass echo to HF after translating | 2023-11-08 11:45:05 -08:00
Krrish Dholakia | 97c8b52bba | fix(utils.py): llmmonitor integration | 2023-11-07 15:49:32 -08:00
ishaan-jaff | 4d8d50d97e | (fix) HF round up temperature 0 -> 0.01 | 2023-11-06 14:35:06 -08:00
ishaan-jaff | b75a113e39 | (fix) hf fix this error: Failed: Error occurred: HuggingfaceException - Input validation error: temperature must be strictly positive | 2023-11-06 14:22:33 -08:00
ishaan-jaff | fdded281a9 | (fix) bug fix: completion, text_completion, check if optional params are not None and pass to LLM | 2023-11-06 13:17:19 -08:00
Krrish Dholakia | 713c659d09 | fix(utils.py): remove special characters from streaming output | 2023-11-06 12:21:50 -08:00
ishaan-jaff | 441ef48a54 | (fix) improve litellm.set_verbose prints | 2023-11-06 08:00:03 -08:00
Krrish Dholakia | 10987304ba | bump: version 0.13.3.dev1 → 0.13.3.dev2 | 2023-11-06 06:44:15 -08:00
Krrish Dholakia | b8cc981db5 | fix(utils.py): better exception raising if logging object is not able to get set | 2023-11-06 06:34:27 -08:00
Krrish Dholakia | e633566253 | feat(utils.py): adding additional states for custom logging | 2023-11-04 17:07:20 -07:00
Krrish Dholakia | f7c5595a0d | fix(main.py): fixing print_verbose | 2023-11-04 14:41:34 -07:00
ishaan-jaff | 3477604c90 | (fix) linting | 2023-11-04 13:28:09 -07:00
ishaan-jaff | e53f5316d0 | (fix) anyscale streaming detect [DONE] special char | 2023-11-04 13:23:02 -07:00
Krrish Dholakia | d0b23a2722 | refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions | 2023-11-04 12:50:15 -07:00
ishaan-jaff | 07f8fa65eb | (feat) add TextCompletionResponse | 2023-11-03 22:14:07 -07:00
Krrish Dholakia | 64b6b0155d | fix(bedrock.py): add exception mapping coverage for authentication scenarios | 2023-11-03 18:25:34 -07:00
Krrish Dholakia | 8bf8464fc2 | fix(bedrock.py): fix bedrock exception mapping | 2023-11-03 18:14:12 -07:00
Krrish Dholakia | fa24a61976 | refactor(proxy_server.py): print statement showing how to add debug for logs | 2023-11-03 17:41:14 -07:00