Commit graph

318 commits

Author SHA1 Message Date
Krrish Dholakia
7358d2e4ea bump: version 0.8.4 → 0.8.5 2023-10-14 16:43:06 -07:00
ishaan-jaff
882ac46727 (feat) add doc string for embedding 2023-10-14 16:08:13 -07:00
Krrish Dholakia
9513d6b862 fix(utils.py): read env variables for known openai-compatible APIs (e.g. perplexity), dynamically from the environment 2023-10-13 22:43:32 -07:00
Krrish Dholakia
91c8e92e71 fix(openai.py): adding support for exception mapping for openai-compatible APIs via http calls 2023-10-13 21:56:51 -07:00
ishaan-jaff
b72dbe61c0 (feat) set api_base, api_key, api_version for embedding() 2023-10-13 21:09:44 -07:00
Krrish Dholakia
4d4f8bfa5d feat(proxy_server): adding model fallbacks and default model to toml 2023-10-13 15:31:17 -07:00
ishaan-jaff
fabad3dc42 (fix) Ollama use new streaming format 2023-10-11 17:00:39 -07:00
ishaan-jaff
689acb8a08 (feat) add CustomStreamWrapper for Ollama - match OpenAI streaming 2023-10-11 17:00:39 -07:00
Krrish Dholakia
d280a8c434 fix(proxy_cli-and-utils.py): fixing how the config file is read + inferring llm_provider for known openai endpoints 2023-10-10 20:53:02 -07:00
Krrish Dholakia
b50013386f fix(openai.py): enable custom proxy to pass in ca_bundle_path 2023-10-10 13:23:27 -07:00
Krrish Dholakia
af2fd0e0de fix: fix value error if model returns empty completion 2023-10-10 10:11:40 -07:00
Krrish Dholakia
db20cb84d4 fix(main.py): return n>1 response for openai text completion 2023-10-09 20:44:07 -07:00
Krrish Dholakia
689371949c fix(main.py): read openai org from env 2023-10-09 16:49:22 -07:00
ishaan-jaff
4e64f123ef (fix) api_base, api_version and api_key 2023-10-09 14:11:05 -07:00
ishaan-jaff
bf4ce08640 (fix) acompletion for ollama non-streaming 2023-10-09 13:47:08 -07:00
Krrish Dholakia
704be9dcd1 feat(factory.py): option to add function details to prompt, if model doesn't support functions param 2023-10-09 09:53:53 -07:00
ishaan-jaff
f6f7c0b891 (feat) add api_key, api_base, api_version to completion 2023-10-09 08:13:12 -07:00
Krrish Dholakia
bcf0b0ac7b style(main.py): clean up print statement 2023-10-07 15:43:40 -07:00
Krrish Dholakia
9cda24e1b2 fix(utils): adds complete streaming response to success handler 2023-10-07 15:42:00 -07:00
Krrish Dholakia
d69038883c docs(completion-docs): adds more details on provider-specific params 2023-10-07 13:49:30 -07:00
Krrish Dholakia
306a38880d feat(ollama.py): exposing ollama config 2023-10-06 15:52:58 -07:00
ishaan-jaff
47521c5a97 test commitizen bump 2023-10-06 15:41:38 -07:00
ishaan-jaff
4ae8a71aa3 bump: version 0.2.5 → 0.3.0 2023-10-06 15:40:01 -07:00
Krrish Dholakia
7e34736a38 fix(add-custom-success-callback-for-streaming): add custom success callback for streaming 2023-10-06 15:02:02 -07:00
Krrish Dholakia
a977e94a5d style(main.py): adding spacing 2023-10-06 06:16:17 -07:00
Krrish Dholakia
e0f1cffa87 fix: azure flag check 2023-10-05 22:44:41 -07:00
Krrish Dholakia
060a2e40b2 fix: fixing mypy linting errors and staying backwards compatible with the azure=true flag 2023-10-05 22:36:32 -07:00
Krrish Dholakia
dd7e397650 style(test_completion.py): fix merge conflict 2023-10-05 22:09:38 -07:00
ishaan-jaff
8120477be4 fix(completion()): add request_timeout as a param, fix claude error when request_timeout set 2023-10-05 19:05:28 -07:00
ishaan-jaff
29509a48f8 ollama default api_base to http://localhost:11434 2023-10-05 11:04:51 -07:00
Krrish Dholakia
ed31860206 adding custom prompt templates to ollama 2023-10-05 10:48:16 -07:00
ishaan-jaff
2d4671a7ef add ratelimitmanager 2023-10-04 16:03:58 -07:00
ishaan-jaff
68006ff584 make RateLimitHandler a class 2023-10-04 16:03:58 -07:00
ishaan-jaff
defc830e95 add batch_completion_rate_limits 2023-10-04 14:46:11 -07:00
Krrish Dholakia
430e2698a2 add param mapping to docs 2023-10-03 09:20:58 -07:00
Krrish Dholakia
345a14483e Fixes to bedrock 2023-10-02 16:09:06 -07:00
Krrish Dholakia
0daf2e3880 fixes to get optional params 2023-10-02 14:44:11 -07:00
Krrish Dholakia
8b60d797e1 fixing optional param mapping 2023-10-02 14:14:30 -07:00
Krrish Dholakia
7cec308a2c fixes to get_optional_params 2023-10-02 12:34:22 -07:00
Krrish Dholakia
8acad78fb3 fix linting issues 2023-10-02 12:12:34 -07:00
Krrish Dholakia
5a19ee1a71 fix get optional params 2023-10-02 12:02:53 -07:00
Krrish Dholakia
1cae080eb2 raise exception if optional param is not mapped to model 2023-10-02 11:17:51 -07:00
Krrish Dholakia
1c802c26fc updates 2023-09-30 18:26:23 -07:00
ishaan-jaff
8d1f5ba69d add fake streaming for petals 2023-09-30 10:22:04 -07:00
ishaan-jaff
1a8144360e fix logging error 2023-09-29 16:02:19 -07:00
ishaan-jaff
3fbad7dfa7 add hf embedding models 2023-09-29 11:57:38 -07:00
ishaan-jaff
17be1d89b3 remove junk code in getting custom_llm_provider 2023-09-29 11:18:08 -07:00
ishaan-jaff
af914d4be1 add cohere embedding models 2023-09-29 09:59:31 -07:00
ishaan-jaff
13ff65a8fe remove junk from completion 2023-09-28 17:57:48 -07:00
ishaan-jaff
0f3be26778 remove junk from completion input 2023-09-28 17:54:46 -07:00
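
Several commits above (e.g. b72dbe61c0, 4e64f123ef, f6f7c0b891) thread api_key, api_base, and api_version through completion() and embedding(). Below is a minimal sketch of how those parameters are typically passed; the model names, endpoint URL, and key are placeholders, not values taken from the commits.

```python
# Minimal sketch (assumed usage, not taken from the commits): passing
# api_key / api_base / api_version directly into litellm.completion() and
# litellm.embedding(). Model names, URL, and key are placeholders.
import litellm

response = litellm.completion(
    model="azure/my-deployment",                      # placeholder deployment
    messages=[{"role": "user", "content": "Hello"}],
    api_key="sk-placeholder",
    api_base="https://example.openai.azure.com",
    api_version="2023-07-01-preview",
)

embedding = litellm.embedding(
    model="text-embedding-ada-002",
    input=["Hello world"],
    api_key="sk-placeholder",
)
```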