Krrish Dholakia | 3d4c5e10a7 | fix(utils.py): fix sync streaming | 2023-11-09 18:47:20 -08:00
Krrish Dholakia | 249cde3d40 | fix(main.py): accepting azure deployment_id | 2023-11-09 18:16:02 -08:00
Krrish Dholakia | b19e7dcc5a | fix(utils.py): fix logging integrations | 2023-11-09 17:09:49 -08:00
Krrish Dholakia | e12bff6d7f | refactor(azure.py): enabling async streaming with aiohttp | 2023-11-09 16:41:06 -08:00
Krrish Dholakia | c053782d96 | refactor(openai.py): support aiohttp streaming | 2023-11-09 16:15:30 -08:00
Krrish Dholakia | 86ef2a02f7 | fix(azure.py): adding support for aiohttp calls on azure + openai | 2023-11-09 10:40:33 -08:00
Krrish Dholakia | 9bfbdc18fb | feat(utils.py): enable returning complete response when stream=true | 2023-11-09 09:17:51 -08:00
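Commit 9bfbdc18fb above lets callers get the complete response back even when streaming. A minimal sketch of the idea (not LiteLLM's actual code; the chunk shape is assumed to follow the OpenAI-style `choices[0].delta.content` layout) is to accumulate the text deltas from each streamed chunk:

```python
def build_complete_response(chunks):
    """Concatenate the incremental text deltas from a list of streaming chunks."""
    pieces = []
    for chunk in chunks:
        # Each chunk is assumed to carry an OpenAI-style incremental delta.
        delta = chunk.get("choices", [{}])[0].get("delta", {}).get("content")
        if delta:
            pieces.append(delta)
    return "".join(pieces)
```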
Krrish Dholakia | c2cbdb23fd | refactor(openai.py): moving openai text completion calls to http | 2023-11-08 18:40:03 -08:00
Krrish Dholakia | c57ed0a9d7 | refactor(openai.py): moving openai chat completion calls to http | 2023-11-08 17:40:41 -08:00
Krrish Dholakia | 53abc31c27 | refactor(azure.py): moving azure openai calls to http calls | 2023-11-08 16:52:18 -08:00
ishaan-jaff | 2a751c277f | (feat) add streaming for text_completion | 2023-11-08 11:58:07 -08:00
ishaan-jaff | 6ee599545a | (fix) text_completion don't pass echo to HF after translating | 2023-11-08 11:45:05 -08:00
Krrish Dholakia | 193cbe632f | fix(utils.py): llmmonitor integration | 2023-11-07 15:49:32 -08:00
ishaan-jaff | 8481e21317 | (fix) HF round up temperature 0 -> 0.01 | 2023-11-06 14:35:06 -08:00
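Commit 8481e21317 above works around Hugging Face's Inference API rejecting `temperature=0` ("temperature must be strictly positive" — see the next commit's error text). A hypothetical sketch of that rounding (the function name is illustrative, not LiteLLM's):

```python
def sanitize_hf_temperature(temperature):
    """Bump a non-positive temperature to the small positive floor HF accepts."""
    MIN_TEMPERATURE = 0.01  # the 0 -> 0.01 value named in commit 8481e21317
    if temperature is not None and temperature <= 0:
        return MIN_TEMPERATURE
    return temperature
```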
ishaan-jaff | 68b6e07aa7 | (fix) hf fix this error: Failed: Error occurred: HuggingfaceException - Input validation error: temperature must be strictly positive | 2023-11-06 14:22:33 -08:00
ishaan-jaff | b4797bec3b | (fix) bug fix: completion, text_completion, check if optional params are not None and pass to LLM | 2023-11-06 13:17:19 -08:00
Krrish Dholakia | 6e7e409615 | fix(utils.py): remove special characters from streaming output | 2023-11-06 12:21:50 -08:00
ishaan-jaff | 5cf2239aaa | (fix) improve litellm.set_verbose prints | 2023-11-06 08:00:03 -08:00
Krrish Dholakia | 4dd1913da1 | bump: version 0.13.3.dev1 → 0.13.3.dev2 | 2023-11-06 06:44:15 -08:00
Krrish Dholakia | c55db28b6f | fix(utils.py): better exception raising if logging object is not able to get set | 2023-11-06 06:34:27 -08:00
Krrish Dholakia | c3916a7754 | feat(utils.py): adding additional states for custom logging | 2023-11-04 17:07:20 -07:00
Krrish Dholakia | 5b3978eff4 | fix(main.py): fixing print_verbose | 2023-11-04 14:41:34 -07:00
ishaan-jaff | cef7ae7896 | (fix) linting | 2023-11-04 13:28:09 -07:00
ishaan-jaff | 56ba58b35b | (fix) anyscale streaming detect [DONE] special char | 2023-11-04 13:23:02 -07:00
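Commit 56ba58b35b above concerns the `[DONE]` sentinel that OpenAI-style server-sent-event streams use to signal end of stream: it is a literal string, not JSON, and must be skipped rather than parsed. A minimal sketch of that handling, assuming the standard `data: ...` SSE line format (not the repository's actual parser):

```python
import json

def parse_sse_chunks(lines):
    """Yield parsed JSON payloads from 'data: ...' SSE lines, stopping at [DONE]."""
    for line in lines:
        if not line.startswith("data:"):
            continue  # skip comments, blank keep-alives, etc.
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # end-of-stream sentinel, not valid JSON
            break
        yield json.loads(payload)
```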
Krrish Dholakia | 6b40546e59 | refactor(all-files): removing all print statements; adding pre-commit + flake8 to prevent future regressions | 2023-11-04 12:50:15 -07:00
ishaan-jaff | cc9f17a1a8 | (feat) add TextCompletionResponse | 2023-11-03 22:14:07 -07:00
Krrish Dholakia | 1c4dd0671b | fix(bedrock.py): add exception mapping coverage for authentication scenarios | 2023-11-03 18:25:34 -07:00
Krrish Dholakia | 142750adff | fix(bedrock.py): fix bedrock exception mapping | 2023-11-03 18:14:12 -07:00
Krrish Dholakia | 4e1885734a | refactor(proxy_server.py): print statement showing how to add debug for logs | 2023-11-03 17:41:14 -07:00
ishaan-jaff | f3dc06da04 | (fix) temp_top_logprobs | 2023-11-03 16:45:10 -07:00
ishaan-jaff | 539bdae364 | (fix) remove errant print statements | 2023-11-03 08:20:14 -07:00
Krrish Dholakia | e3a1c58dd9 | build(litellm_server/utils.py): add support for general settings + num retries as a module variable | 2023-11-02 20:56:41 -07:00
ishaan-jaff | bb832e38b9 | (fix) litellm utils | 2023-11-02 17:03:46 -07:00
ishaan-jaff | 81f608dd34 | (fix) vertexai detect code_chat and code_text llms as vertex | 2023-11-02 16:31:13 -07:00
Krrish Dholakia | 512a1637eb | feat(completion()): enable setting prompt templates via completion() | 2023-11-02 16:24:01 -07:00
ishaan-jaff | 8b5ee89d82 | (feat) add transform_logprobs for text_completion | 2023-11-01 18:25:13 -07:00
ishaan-jaff | 700fcfa5fb | (feat) text_completion add transform_logprobs | 2023-11-01 18:25:13 -07:00
ishaan-jaff | 1d0ce77baf | (fix) improve litellm.set_verbose=True logging | 2023-11-01 18:25:13 -07:00
ishaan-jaff | 9cfd218101 | (feat) detect amazon.titan-embed-text-v1 as bedrock embedding model | 2023-11-01 14:46:33 -07:00
Krrish Dholakia | f9ff03d5af | fix(utils.py): mapping stop sequences for palm | 2023-11-01 14:00:45 -07:00
Krrish Dholakia | 7762ae7762 | feat(utils.py): accept context window fallback dictionary | 2023-10-31 22:32:36 -07:00
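Commit 7762ae7762 above adds a context window fallback dictionary: a mapping from a model to a larger-context substitute to retry with when a prompt overflows the first model's context window. A hypothetical sketch of the pattern (the exception class, model names, and function are illustrative assumptions, not LiteLLM's API):

```python
class ContextWindowExceededError(Exception):
    """Raised when a prompt exceeds a model's context window."""

def completion_with_fallback(model, prompt, fallbacks, call_fn):
    """Call call_fn(model, prompt); on context-window errors, walk the fallback map."""
    try:
        return call_fn(model, prompt)
    except ContextWindowExceededError:
        next_model = fallbacks.get(model)
        if next_model is None:
            raise  # no larger-context fallback configured for this model
        return completion_with_fallback(next_model, prompt, fallbacks, call_fn)
```

For example, a caller might map `{"gpt-3.5-turbo": "gpt-3.5-turbo-16k"}` so oversized prompts transparently retry on the 16k variant.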
ishaan-jaff | fa7e063198 | (feat) add bedrock.cohere streaming | 2023-10-31 22:26:43 -07:00
ishaan-jaff | ce462824be | (feat) add support for echo for HF logprobs | 2023-10-31 18:20:59 -07:00
ishaan-jaff | 1ada3a13d4 | (feat) text_completion return raw response for davinci003 | 2023-10-31 15:32:04 -07:00
Krrish Dholakia | 0ed3917b09 | feat(main.py): add support for maritalk api | 2023-10-30 17:36:51 -07:00
ishaan-jaff | 494c5e8345 | (docs) completion_with_config | 2023-10-30 14:29:40 -07:00
ishaan-jaff | c61fa70ba0 | (docs) encode docstring | 2023-10-30 14:10:29 -07:00
ishaan-jaff | c7752be7d3 | (docs) add docstring for validate_environment | 2023-10-30 14:06:55 -07:00
ishaan-jaff | 362e8519ef | (docs) add docstring for get_valid_models() | 2023-10-30 14:05:09 -07:00
ishaan-jaff | 4a32272bfd | (docs) add doc string for check_valid_api_key | 2023-10-30 14:02:31 -07:00