Commit graph

465 commits

Author SHA1 Message Date
ishaan-jaff  2a751c277f (feat) add streaming for text_completion 2023-11-08 11:58:07 -08:00
ishaan-jaff  6ee599545a (fix) text_completion don't pass echo to HF after translating 2023-11-08 11:45:05 -08:00
Krrish Dholakia  193cbe632f fix(utils.py): llmmonitor integration 2023-11-07 15:49:32 -08:00
ishaan-jaff  8481e21317 (fix) HF round up temperature 0 -> 0.01 2023-11-06 14:35:06 -08:00
ishaan-jaff  68b6e07aa7 (fix) hf fix this error: Failed: Error occurred: HuggingfaceException - Input validation error: temperature must be strictly positive 2023-11-06 14:22:33 -08:00
ishaan-jaff  b4797bec3b (fix) bug fix: completion, text_completion, check if optional params are not None and pass to LLM 2023-11-06 13:17:19 -08:00
Krrish Dholakia  6e7e409615 fix(utils.py): remove special characters from streaming output 2023-11-06 12:21:50 -08:00
ishaan-jaff  5cf2239aaa (fix) improve litellm.set_verbose prints 2023-11-06 08:00:03 -08:00
Krrish Dholakia  4dd1913da1 bump: version 0.13.3.dev1 → 0.13.3.dev2 2023-11-06 06:44:15 -08:00
Krrish Dholakia  c55db28b6f fix(utils.py): better exception raising if logging object is not able to get set 2023-11-06 06:34:27 -08:00
Krrish Dholakia  c3916a7754 feat(utils.py): adding additional states for custom logging 2023-11-04 17:07:20 -07:00
Krrish Dholakia  5b3978eff4 fix(main.py): fixing print_verbose 2023-11-04 14:41:34 -07:00
ishaan-jaff  cef7ae7896 (fix) linting 2023-11-04 13:28:09 -07:00
ishaan-jaff  56ba58b35b (fix) anyscale streaming detect [DONE] special char 2023-11-04 13:23:02 -07:00
Krrish Dholakia  6b40546e59 refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions 2023-11-04 12:50:15 -07:00
ishaan-jaff  cc9f17a1a8 (feat) add TextCompletionResponse 2023-11-03 22:14:07 -07:00
Krrish Dholakia  1c4dd0671b fix(bedrock.py): add exception mapping coverage for authentication scenarios 2023-11-03 18:25:34 -07:00
Krrish Dholakia  142750adff fix(bedrock.py): fix bedrock exception mapping 2023-11-03 18:14:12 -07:00
Krrish Dholakia  4e1885734a refactor(proxy_server.py): print statement showing how to add debug for logs 2023-11-03 17:41:14 -07:00
ishaan-jaff  f3dc06da04 (fix) temp_top_logprobs 2023-11-03 16:45:10 -07:00
ishaan-jaff  539bdae364 (fix) remove errant print statements 2023-11-03 08:20:14 -07:00
Krrish Dholakia  e3a1c58dd9 build(litellm_server/utils.py): add support for general settings + num retries as a module variable 2023-11-02 20:56:41 -07:00
ishaan-jaff  bb832e38b9 (fix) litellm utils 2023-11-02 17:03:46 -07:00
ishaan-jaff  81f608dd34 (fix) vertexai detect code_chat and code_text llms as vertex 2023-11-02 16:31:13 -07:00
Krrish Dholakia  512a1637eb feat(completion()): enable setting prompt templates via completion() 2023-11-02 16:24:01 -07:00
ishaan-jaff  8b5ee89d82 (feat) add transform_logprobs for text_completion 2023-11-01 18:25:13 -07:00
ishaan-jaff  700fcfa5fb (feat) text_completion add transform_logprobs 2023-11-01 18:25:13 -07:00
ishaan-jaff  1d0ce77baf (fix) improve litellm.set_verbose=True logging 2023-11-01 18:25:13 -07:00
ishaan-jaff  9cfd218101 (feat) detect amazon.titan-embed-text-v1 as bedrock embedding model 2023-11-01 14:46:33 -07:00
Krrish Dholakia  f9ff03d5af fix(utils.py): mapping stop sequences for palm 2023-11-01 14:00:45 -07:00
Krrish Dholakia  7762ae7762 feat(utils.py): accept context window fallback dictionary 2023-10-31 22:32:36 -07:00
ishaan-jaff  fa7e063198 (feat) add bedrock.cohere streaming 2023-10-31 22:26:43 -07:00
ishaan-jaff  ce462824be (feat) add support for echo for HF logprobs 2023-10-31 18:20:59 -07:00
ishaan-jaff  1ada3a13d4 (feat) text_completion return raw response for davinci003 2023-10-31 15:32:04 -07:00
Krrish Dholakia  0ed3917b09 feat(main.py): add support for maritalk api 2023-10-30 17:36:51 -07:00
ishaan-jaff  494c5e8345 (docs) completion_with_config 2023-10-30 14:29:40 -07:00
ishaan-jaff  c61fa70ba0 (docs) encode docstring 2023-10-30 14:10:29 -07:00
ishaan-jaff  c7752be7d3 (docs) add docstring for validate_environment 2023-10-30 14:06:55 -07:00
ishaan-jaff  362e8519ef (docs) add docstring for get_valid_models() 2023-10-30 14:05:09 -07:00
ishaan-jaff  4a32272bfd (docs) add doc string for check_valid_api_key 2023-10-30 14:02:31 -07:00
ishaan-jaff  d2ff7a17df (feat) track cost for responses easily 2023-10-28 15:08:35 -07:00
ishaan-jaff  bf5e8bd0ea (feat) set litellm_call_id in fallbacks 2023-10-27 18:09:00 -07:00
ishaan-jaff  b0970827d3 (fix) utils - remove bloat - deprecated completion_with_split_tests 2023-10-27 18:04:15 -07:00
ishaan-jaff  fe4ef2bd57 (fix) only set litellm call id if it's not set in completion() 2023-10-27 18:02:14 -07:00
ishaan-jaff  e260a9cf2f (fix) remove print statements from completion fallbacks, make them print verbose 2023-10-27 17:54:48 -07:00
ishaan-jaff  ad2afae31d (fix) remove bloat - rate limite manager 2023-10-27 17:47:45 -07:00
ishaan-jaff  b3776fc0d8 (fix) use sentry dsn instead of sentry API URL 2023-10-27 16:25:02 -07:00
Krrish Dholakia  afe14c8a96 fix(utils.py/completion_with_fallbacks): accept azure deployment name in rotations 2023-10-27 16:00:42 -07:00
ishaan-jaff  dcdbd02a67 (fix) remove errant print statement 2023-10-27 09:38:53 -07:00
ishaan-jaff  962e75eb70 (feat) create a usage class in ModelResponse, use it for anthropic 2023-10-27 09:32:10 -07:00