Commit graph

441 commits

Author SHA1 Message Date
ishaan-jaff
9f72ce9fc6 (fix) improve batch_completion_models + multiple deployments, if 1 model fails, return result from 2nd 2023-10-30 17:20:07 -07:00
ishaan-jaff
43b450319e (docs) add docstring for batch_completion 2023-10-30 14:31:34 -07:00
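The two batch helpers touched in these commits are part of litellm's public Python API. A minimal sketch of how they are typically called (model names and prompts are illustrative, not taken from the commits):

```python
import litellm

# batch_completion: one model, a list of message lists -> one response per prompt.
responses = litellm.batch_completion(
    model="gpt-3.5-turbo",
    messages=[
        [{"role": "user", "content": "What is the capital of France?"}],
        [{"role": "user", "content": "Summarize litellm in one sentence."}],
    ],
)

# batch_completion_models: several models/deployments, one prompt -> the first
# successful response wins, so a failing deployment falls back to the next one.
response = litellm.batch_completion_models(
    models=["gpt-3.5-turbo", "claude-instant-1"],
    messages=[{"role": "user", "content": "Hello!"}],
)
```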
ishaan-jaff
f52b36a338 (docs) docstring for completion_with_retries 2023-10-30 14:20:09 -07:00
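completion_with_retries wraps a plain completion() call with automatic retries on transient failures. A hedged sketch of typical usage (retry count and strategy keywords vary by version; see the docstring added in this commit):

```python
from litellm import completion_with_retries

# Retries the underlying completion() call if it raises a transient error.
response = completion_with_retries(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print(response["choices"][0]["message"]["content"])
```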
ishaan-jaff
43b8387334 (docs) add doc string for aembedding 2023-10-30 13:59:56 -07:00
ishaan-jaff
7037913f9d (fix) update acompletion docstring 2023-10-30 13:53:32 -07:00
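acompletion and aembedding are the async counterparts of completion() and embedding(). A minimal sketch (model names are illustrative):

```python
import asyncio
import litellm

async def main():
    chat = await litellm.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello!"}],
    )
    emb = await litellm.aembedding(
        model="text-embedding-ada-002",
        input=["hello world"],
    )
    print(chat["choices"][0]["message"]["content"], len(emb["data"][0]["embedding"]))

asyncio.run(main())
```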
ishaan-jaff
cac320f74e (fix) remove bloat - ratelimitmanager 2023-10-27 18:11:39 -07:00
Krrish Dholakia
afe14c8a96 fix(utils.py/completion_with_fallbacks): accept azure deployment name in rotations 2023-10-27 16:00:42 -07:00
Krrish Dholakia
c1b2553827 fix(utils.py): adding support for anyscale models 2023-10-25 09:08:10 -07:00
Krrish Dholakia
f12dc5df21 fix(vertex_ai.py): fix output parsing 2023-10-24 12:08:22 -07:00
ishaan-jaff
6373f6bddd (feat) add async embeddings 2023-10-23 13:59:37 -07:00
Krrish Dholakia
cd0e699bcf fix(main.py): multiple deployments fix - run in parallel 2023-10-21 14:28:50 -07:00
ishaan-jaff
0b0564167c (fix) embedding() using get_llm_provider 2023-10-20 15:00:08 -07:00
ishaan-jaff
114d8fda65 (feat) native perplexity support 2023-10-20 14:29:07 -07:00
Krrish Dholakia
1f1cf7a11c feat(main.py): support multiple deployments in 1 completion call 2023-10-20 13:01:53 -07:00
Krrish Dholakia
4b48af7c3c fix(anthropic.py-+-bedrock.py): anthropic prompt format 2023-10-20 10:56:15 -07:00
Krrish Dholakia
00993f3575 fix: allow api base to be set for all providers - enables proxy use cases 2023-10-19 19:07:42 -07:00
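Being able to override api_base per call is what enables the proxy use case mentioned in the commit body. A sketch, assuming a locally running OpenAI-compatible proxy (the URL is a placeholder):

```python
import litellm

# Route the call through a local proxy instead of the provider's default endpoint.
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
    api_base="http://localhost:8000",  # placeholder proxy URL
)
```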
Krrish Dholakia
76bf8c4be3 fix(anthropic.py): enable api base to be customized 2023-10-19 18:45:29 -07:00
Krrish Dholakia
dcb866b353 docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial 2023-10-17 09:22:34 -07:00
Krrish Dholakia
036c50e2bf refactor(main.py): clean up print statement 2023-10-16 17:53:09 -07:00
Krrish Dholakia
4424aaf69f refactor(main.py): remove print statement 2023-10-16 07:33:15 -07:00
Krrish Dholakia
1a09bbd4a9 docs(main.py): adding docstring for text_completion 2023-10-16 07:31:48 -07:00
Zeeland
9f6138ef0e fix: llm_provider adds openai finetune compatibility 2023-10-16 18:44:45 +08:00
ishaan-jaff
6413285551 (fix) deepinfra/llama should go to deepinfra not to openrouter 2023-10-14 16:47:25 -07:00
Krrish Dholakia
7358d2e4ea bump: version 0.8.4 → 0.8.5 2023-10-14 16:43:06 -07:00
ishaan-jaff
882ac46727 (feat) add doc string for embedding 2023-10-14 16:08:13 -07:00
Krrish Dholakia
9513d6b862 fix(utils.py): read env variables for known openai-compatible APIs (e.g. perplexity), dynamically from the environment 2023-10-13 22:43:32 -07:00
Krrish Dholakia
91c8e92e71 fix(openai.py): adding support for exception mapping for openai-compatible APIs via http calls 2023-10-13 21:56:51 -07:00
ishaan-jaff
b72dbe61c0 (feat) set api_base, api_key, api_version for embedding() 2023-10-13 21:09:44 -07:00
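Together with the embedding docstring commit above, this means embedding() accepts the same per-call credential overrides as completion(). A sketch with placeholder Azure values:

```python
import litellm

# All values below are placeholders, not real deployments, keys, or endpoints.
response = litellm.embedding(
    model="azure/<your-embedding-deployment>",
    input=["good morning from litellm"],
    api_key="<azure-api-key>",
    api_base="https://<your-resource>.openai.azure.com",
    api_version="2023-07-01-preview",
)
```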
Krrish Dholakia
4d4f8bfa5d feat(proxy_server): adding model fallbacks and default model to toml 2023-10-13 15:31:17 -07:00
ishaan-jaff
fabad3dc42 (fix) Ollama use new streaming format 2023-10-11 17:00:39 -07:00
ishaan-jaff
689acb8a08 (feat) add CustomStreamWrapper for Ollama - match OpenAI streaming 2023-10-11 17:00:39 -07:00
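With the CustomStreamWrapper in place, Ollama streams are consumed with the same chunk shape as OpenAI streams. A minimal sketch, assuming a local Ollama server with the model already pulled:

```python
import litellm

response = litellm.completion(
    model="ollama/llama2",  # illustrative; any locally pulled Ollama model
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    stream=True,
)
for chunk in response:
    # Each chunk mirrors OpenAI's streaming delta format; the final chunk may be empty.
    print(chunk["choices"][0]["delta"]["content"] or "", end="")
```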
Krrish Dholakia
d280a8c434 fix(proxy_cli-and-utils.py): fixing how config file is read + inferring llm_provider for known openai endpoints 2023-10-10 20:53:02 -07:00
Krrish Dholakia
b50013386f fix(openai.py): enable custom proxy to pass in ca_bundle_path 2023-10-10 13:23:27 -07:00
Krrish Dholakia
af2fd0e0de fix: fix value error if model returns empty completion 2023-10-10 10:11:40 -07:00
Krrish Dholakia
db20cb84d4 fix(main.py): return n>1 response for openai text completion 2023-10-09 20:44:07 -07:00
Krrish Dholakia
689371949c fix(main.py): read openai org from env 2023-10-09 16:49:22 -07:00
ishaan-jaff
4e64f123ef (fix) api_base, api_version and api_key 2023-10-09 14:11:05 -07:00
ishaan-jaff
bf4ce08640 (fix) acompletion for ollama non-streaming 2023-10-09 13:47:08 -07:00
Krrish Dholakia
704be9dcd1 feat(factory.py): option to add function details to prompt, if model doesn't support functions param 2023-10-09 09:53:53 -07:00
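For providers without native function calling, litellm can serialize the function schema into the prompt instead. A hedged sketch using the add_function_to_prompt flag (flag name per litellm's docs; verify against your version):

```python
import litellm
from litellm import completion

# Hedged: this flag tells litellm to inject the function definition into the
# prompt when the target model does not accept a `functions` parameter.
litellm.add_function_to_prompt = True

functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"],
    },
}]

response = completion(
    model="claude-instant-1",  # illustrative model without native function calling
    messages=[{"role": "user", "content": "What's the weather in Boston?"}],
    functions=functions,
)
```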
ishaan-jaff
f6f7c0b891 (feat) add api_key, api_base, api_version to completion 2023-10-09 08:13:12 -07:00
Krrish Dholakia
bcf0b0ac7b style(main.py): clean up print statement 2023-10-07 15:43:40 -07:00
Krrish Dholakia
9cda24e1b2 fix(utils): adds complete streaming response to success handler 2023-10-07 15:42:00 -07:00
Krrish Dholakia
d69038883c docs(completion-docs): adds more details on provider-specific params 2023-10-07 13:49:30 -07:00
Krrish Dholakia
306a38880d feat(ollama.py): exposing ollama config 2023-10-06 15:52:58 -07:00
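Exposing the provider config lets Ollama-specific generation options be set once, globally. A hedged sketch, assuming OllamaConfig mirrors Ollama's own options such as num_predict and temperature (check the class for the exact field names):

```python
import litellm

# Assumed field names; they follow Ollama's generation options but may differ
# from the exact attributes on litellm.OllamaConfig in your version.
litellm.OllamaConfig(num_predict=100, temperature=0.2)

# Subsequent ollama/* calls pick up these defaults.
response = litellm.completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "Tell me a short joke."}],
)
```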
ishaan-jaff
47521c5a97 test commitizen bump 2023-10-06 15:41:38 -07:00
ishaan-jaff
4ae8a71aa3 bump: version 0.2.5 → 0.3.0 2023-10-06 15:40:01 -07:00
Krrish Dholakia
7e34736a38 fix(add-custom-success-callback-for-streaming): add custom success callback for streaming 2023-10-06 15:02:02 -07:00
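Custom success callbacks receive the stitched-together response once a stream finishes (see the "adds complete streaming response to success handler" commit above). A sketch, assuming litellm's four-argument custom callback signature:

```python
import litellm
from litellm import completion

def log_success(kwargs, completion_response, start_time, end_time):
    # Fires after the stream completes, with the re-assembled response.
    print("model:", kwargs.get("model"), "latency:", end_time - start_time)

litellm.success_callback = [log_success]

response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for _ in response:
    pass  # consume the stream; the callback runs at the end
```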
Krrish Dholakia
a977e94a5d style(main.py): adding spacing 2023-10-06 06:16:17 -07:00
Krrish Dholakia
e0f1cffa87 fix: azure flag check 2023-10-05 22:44:41 -07:00
Krrish Dholakia
060a2e40b2 fix: fixing mypy linting errors and being backwards compatible for azure=true flag 2023-10-05 22:36:32 -07:00