Commit graph

476 commits

Author SHA1 Message Date
Krrish Dholakia
67e8b12a09 fix(utils.py): fix cached responses - translate dict to objects 2023-11-10 10:38:20 -08:00
Krrish Dholakia
d392f6efb5 fix(utils.py): fix sync streaming 2023-11-09 18:47:20 -08:00
Krrish Dholakia
af7468e9bc fix(main.py): accepting azure deployment_id 2023-11-09 18:16:02 -08:00
Krrish Dholakia
cfd2ccf429 fix(utils.py): fix logging integrations 2023-11-09 17:09:49 -08:00
Krrish Dholakia
272a6dc9b0 refactor(azure.py): enabling async streaming with aiohttp 2023-11-09 16:41:06 -08:00
Krrish Dholakia
9b278f567b refactor(openai.py): support aiohttp streaming 2023-11-09 16:15:30 -08:00
Krrish Dholakia
1d46891ceb fix(azure.py): adding support for aiohttp calls on azure + openai 2023-11-09 10:40:33 -08:00
Krrish Dholakia
8ee4b1f603 feat(utils.py): enable returning complete response when stream=true 2023-11-09 09:17:51 -08:00
Krrish Dholakia
e66373bd47 refactor(openai.py): moving openai text completion calls to http 2023-11-08 18:40:03 -08:00
Krrish Dholakia
decf86b145 refactor(openai.py): moving openai chat completion calls to http 2023-11-08 17:40:41 -08:00
Krrish Dholakia
17f5e46080 refactor(azure.py): moving azure openai calls to http calls 2023-11-08 16:52:18 -08:00
ishaan-jaff
11ee52207e (feat) add streaming for text_completion 2023-11-08 11:58:07 -08:00
ishaan-jaff
106ccc2b94 (fix) text_completion don't pass echo to HF after translating 2023-11-08 11:45:05 -08:00
Krrish Dholakia
97c8b52bba fix(utils.py): llmmonitor integration 2023-11-07 15:49:32 -08:00
ishaan-jaff
4d8d50d97e (fix) HF round up temperature 0 -> 0.01 2023-11-06 14:35:06 -08:00
ishaan-jaff
b75a113e39 (fix) hf fix this error: Failed: Error occurred: HuggingfaceException - Input validation error: temperature must be strictly positive 2023-11-06 14:22:33 -08:00
ishaan-jaff
fdded281a9 (fix) bug fix: completion, text_completion, check if optional params are not None and pass to LLM 2023-11-06 13:17:19 -08:00
Krrish Dholakia
713c659d09 fix(utils.py): remove special characters from streaming output 2023-11-06 12:21:50 -08:00
ishaan-jaff
441ef48a54 (fix) improve litellm.set_verbose prints 2023-11-06 08:00:03 -08:00
Krrish Dholakia
10987304ba bump: version 0.13.3.dev1 → 0.13.3.dev2 2023-11-06 06:44:15 -08:00
Krrish Dholakia
b8cc981db5 fix(utils.py): better exception raising if logging object is not able to get set 2023-11-06 06:34:27 -08:00
Krrish Dholakia
e633566253 feat(utils.py): adding additional states for custom logging 2023-11-04 17:07:20 -07:00
Krrish Dholakia
f7c5595a0d fix(main.py): fixing print_verbose 2023-11-04 14:41:34 -07:00
ishaan-jaff
3477604c90 (fix) linting 2023-11-04 13:28:09 -07:00
ishaan-jaff
e53f5316d0 (fix) anyscale streaming detect [DONE] special char 2023-11-04 13:23:02 -07:00
Krrish Dholakia
d0b23a2722 refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions 2023-11-04 12:50:15 -07:00
ishaan-jaff
07f8fa65eb (feat) add TextCompletionResponse 2023-11-03 22:14:07 -07:00
Krrish Dholakia
64b6b0155d fix(bedrock.py): add exception mapping coverage for authentication scenarios 2023-11-03 18:25:34 -07:00
Krrish Dholakia
8bf8464fc2 fix(bedrock.py): fix bedrock exception mapping 2023-11-03 18:14:12 -07:00
Krrish Dholakia
fa24a61976 refactor(proxy_server.py): print statement showing how to add debug for logs 2023-11-03 17:41:14 -07:00
ishaan-jaff
5b76e12976 (fix) temp_top_logprobs 2023-11-03 16:45:10 -07:00
ishaan-jaff
5a5e6e0fac (fix) remove errant print statements 2023-11-03 08:20:14 -07:00
Krrish Dholakia
127972a80b build(litellm_server/utils.py): add support for general settings + num retries as a module variable 2023-11-02 20:56:41 -07:00
ishaan-jaff
fb94c7e00d (fix) litellm utils 2023-11-02 17:03:46 -07:00
ishaan-jaff
3c22fbf637 (fix) vertexai detect code_chat and code_text llms as vertex 2023-11-02 16:31:13 -07:00
Krrish Dholakia
33c1118080 feat(completion()): enable setting prompt templates via completion() 2023-11-02 16:24:01 -07:00
ishaan-jaff
5262a3a2f7 (feat) add transform_logprobs for text_completion 2023-11-01 18:25:13 -07:00
ishaan-jaff
51060d1eea (feat) text_completion add transform_logprobs 2023-11-01 18:25:13 -07:00
ishaan-jaff
a0ed669f25 (fix) improve litellm.set_verbose=True logging 2023-11-01 18:25:13 -07:00
ishaan-jaff
f66e9c6bce (feat) detect amazon.titan-embed-text-v1 as bedrock embedding model 2023-11-01 14:46:33 -07:00
Krrish Dholakia
a951f7ff85 fix(utils.py): mapping stop sequences for palm 2023-11-01 14:00:45 -07:00
Krrish Dholakia
2cf06a3235 feat(utils.py): accept context window fallback dictionary 2023-10-31 22:32:36 -07:00
ishaan-jaff
c6229b7113 (feat) add bedrock.cohere streaming 2023-10-31 22:26:43 -07:00
ishaan-jaff
19177ae041 (feat) add support for echo for HF logprobs 2023-10-31 18:20:59 -07:00
ishaan-jaff
d57dc616b8 (feat) text_completion return raw response for davinci003 2023-10-31 15:32:04 -07:00
Krrish Dholakia
147d69f230 feat(main.py): add support for maritalk api 2023-10-30 17:36:51 -07:00
ishaan-jaff
32b6714a8b (docs) completion_with_config 2023-10-30 14:29:40 -07:00
ishaan-jaff
ae376a9835 (docs) encode docstring 2023-10-30 14:10:29 -07:00
ishaan-jaff
fdef63439d (docs) add docstring for validate_environment 2023-10-30 14:06:55 -07:00
ishaan-jaff
94542ae6be (docs) add docstring for def get_valid_models(): 2023-10-30 14:05:09 -07:00