Commit graph

503 commits

Author SHA1 Message Date
Krrish Dholakia c489f41964 test(utils.py): additional logging 2023-11-13 17:13:41 -08:00
Krrish Dholakia 681da80e55 test(utils.py): additional logging 2023-11-13 17:06:24 -08:00
Krrish Dholakia c1a0411186 test(utils.py): additional logging 2023-11-13 16:59:04 -08:00
Krrish Dholakia 05d720075b test(utils.py): adding more logging for streaming test 2023-11-13 16:54:16 -08:00
Krrish Dholakia 38ff412b9a fix(utils.py): fix response object mapping 2023-11-13 15:58:25 -08:00
Krrish Dholakia 9d8f872f38 fix(promptlayer.py): fixing promptlayer logging integration 2023-11-13 15:04:15 -08:00
Krrish Dholakia f20820fd00 fix(main.py): fix linting errors 2023-11-13 14:52:37 -08:00
Krrish Dholakia 8a3b771e50 fix(tests): fixing response objects for testing 2023-11-13 14:39:30 -08:00
Krrish Dholakia d4de55b053 fix(together_ai.py): exception mapping for tgai 2023-11-13 13:17:15 -08:00
Krrish Dholakia aa8ca781ba test(test_completion.py): cleanup tests 2023-11-13 11:23:38 -08:00
ishaan-jaff 4439109658 (fix) text completion response 2023-11-13 10:29:23 -08:00
ishaan-jaff 27cbd7d895 (fix) deepinfra with openai v1.0.0 2023-11-13 09:51:22 -08:00
ishaan-jaff c91abc8ad1 (fix) token_counter - use openai token counter only for chat completion 2023-11-13 08:00:27 -08:00
Krrish Dholakia 62013520aa fix(utils.py): replacing openai.error import statements 2023-11-11 19:25:21 -08:00
Krrish Dholakia 45b6f8b853 refactor: fixing linting issues 2023-11-11 18:52:28 -08:00
Krrish Dholakia 39c2597c33 refactor(azure.py): working azure completion calls with openai v1 sdk 2023-11-11 16:44:39 -08:00
Krrish Dholakia d0bd932b3c refactor(openai.py): working openai chat + text completion for openai v1 sdk 2023-11-11 16:25:10 -08:00
Krrish Dholakia d3323ba637 refactor(openai.py): making it compatible for openai v1 (BREAKING CHANGE) 2023-11-11 15:33:02 -08:00
ishaan-jaff 4d67cee135 (fix) completion gpt-4 vision check finish_details or finish_reason 2023-11-11 10:28:20 -08:00
Ishaan Jaff fd6064b571 Merge pull request #787 from duc-phamh/improve_message_trimming (Improve message trimming) 2023-11-11 09:39:43 -08:00
Krrish Dholakia abee5a0e05 fix(utils.py): caching for embedding 2023-11-10 14:33:17 -08:00
ishaan-jaff 49c8c8a74f (fix) streaming replicate 2023-11-10 12:46:33 -08:00
Krrish Dholakia 62290ec5d9 fix(utils.py): fix exception raised 2023-11-10 11:28:17 -08:00
Krrish Dholakia 18a8bd5543 fix(utils.py): return function call as part of response object 2023-11-10 11:02:10 -08:00
Krrish Dholakia a4c9e6bd46 fix(utils.py): fix cached responses - translate dict to objects 2023-11-10 10:38:20 -08:00
Krrish Dholakia 3d4c5e10a7 fix(utils.py): fix sync streaming 2023-11-09 18:47:20 -08:00
Krrish Dholakia 249cde3d40 fix(main.py): accepting azure deployment_id 2023-11-09 18:16:02 -08:00
Krrish Dholakia b19e7dcc5a fix(utils.py): fix logging integrations 2023-11-09 17:09:49 -08:00
Krrish Dholakia e12bff6d7f refactor(azure.py): enabling async streaming with aiohttp 2023-11-09 16:41:06 -08:00
Krrish Dholakia c053782d96 refactor(openai.py): support aiohttp streaming 2023-11-09 16:15:30 -08:00
Duc Pham 8e13da198c Another small refactoring 2023-11-10 01:47:06 +07:00
Krrish Dholakia 86ef2a02f7 fix(azure.py): adding support for aiohttp calls on azure + openai 2023-11-09 10:40:33 -08:00
Duc Pham eeac3954d5 Reverted error while refactoring 2023-11-10 01:35:41 +07:00
Duc Pham 07e8cf1d9a Improved trimming logic and OpenAI token counter 2023-11-10 01:26:13 +07:00
Krrish Dholakia 9bfbdc18fb feat(utils.py): enable returning complete response when stream=true 2023-11-09 09:17:51 -08:00
Krrish Dholakia c2cbdb23fd refactor(openai.py): moving openai text completion calls to http 2023-11-08 18:40:03 -08:00
Krrish Dholakia c57ed0a9d7 refactor(openai.py): moving openai chat completion calls to http 2023-11-08 17:40:41 -08:00
Krrish Dholakia 53abc31c27 refactor(azure.py): moving azure openai calls to http calls 2023-11-08 16:52:18 -08:00
ishaan-jaff 2a751c277f (feat) add streaming for text_completion 2023-11-08 11:58:07 -08:00
ishaan-jaff 6ee599545a (fix) text_completion don't pass echo to HF after translating 2023-11-08 11:45:05 -08:00
Krrish Dholakia 193cbe632f fix(utils.py): llmmonitor integration 2023-11-07 15:49:32 -08:00
ishaan-jaff 8481e21317 (fix) HF round up temperature 0 -> 0.01 2023-11-06 14:35:06 -08:00
ishaan-jaff 68b6e07aa7 (fix) hf fix this error: Failed: Error occurred: HuggingfaceException - Input validation error: temperature must be strictly positive 2023-11-06 14:22:33 -08:00
ishaan-jaff b4797bec3b (fix) bug fix: completion, text_completion, check if optional params are not None and pass to LLM 2023-11-06 13:17:19 -08:00
Krrish Dholakia 6e7e409615 fix(utils.py): remove special characters from streaming output 2023-11-06 12:21:50 -08:00
ishaan-jaff 5cf2239aaa (fix) improve litellm.set_verbose prints 2023-11-06 08:00:03 -08:00
Krrish Dholakia 4dd1913da1 bump: version 0.13.3.dev1 → 0.13.3.dev2 2023-11-06 06:44:15 -08:00
Krrish Dholakia c55db28b6f fix(utils.py): better exception raising if logging object is not able to get set 2023-11-06 06:34:27 -08:00
Krrish Dholakia c3916a7754 feat(utils.py): adding additional states for custom logging 2023-11-04 17:07:20 -07:00
Krrish Dholakia 5b3978eff4 fix(main.py): fixing print_verbose 2023-11-04 14:41:34 -07:00