Commit graph

1298 commits

Author SHA1 Message Date
ishaan-jaff
80c6920709 (feat) text_completion return raw openai response for text_completion requests 2023-10-31 15:31:24 -07:00
ishaan-jaff
b99b137f10 (fix) linting errors 2023-10-31 14:43:10 -07:00
ishaan-jaff
daeb4a7f9e (feat) text_completion add support for passing prompt as array 2023-10-31 14:29:43 -07:00
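The commit above adds support for passing the prompt as an array. As a minimal sketch (hypothetical helper name, not litellm's actual implementation), normalizing a string-or-list prompt mirrors the OpenAI `/completions` API, which accepts both shapes:

```python
# Hypothetical sketch: normalize a text-completion prompt that may be a
# single string or a list of strings, so downstream code handles one shape.
from typing import List, Union

def normalize_prompt(prompt: Union[str, List[str]]) -> List[str]:
    """Return the prompt as a list of strings."""
    if isinstance(prompt, str):
        return [prompt]
    if isinstance(prompt, list) and all(isinstance(p, str) for p in prompt):
        return prompt
    raise TypeError("prompt must be a string or a list of strings")
```

Downstream completion code can then always iterate over the list, whether the caller passed one prompt or many.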
Krrish Dholakia
83fd829b49 fix(main.py): removing print_verbose 2023-10-30 20:37:12 -07:00
Krrish Dholakia
655789ee10 test(test_async_fn.py): more logging 2023-10-30 19:29:37 -07:00
Krrish Dholakia
2c05513cd4 test(test_async_fn.py): adding more logging 2023-10-30 19:11:07 -07:00
Krrish Dholakia
147d69f230 feat(main.py): add support for maritalk api 2023-10-30 17:36:51 -07:00
ishaan-jaff
85cd1faddd (fix) improve batch_completion_models + multiple deployments, if 1 model fails, return result from 2nd 2023-10-30 17:20:07 -07:00
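The fallback behavior this commit describes (if one model fails, return the result from the next) can be sketched generically. The names below are illustrative, not litellm internals:

```python
# Illustrative fallback loop: try each model in order and return the first
# successful result instead of failing on the first error.
from typing import Callable, List, Optional

def complete_with_fallbacks(models: List[str], call: Callable[[str], str]) -> str:
    last_error: Optional[Exception] = None
    for model in models:
        try:
            return call(model)       # first model that succeeds wins
        except Exception as err:     # remember the failure, try the next model
            last_error = err
    raise RuntimeError(f"all models failed, last error: {last_error}")
```

In litellm's case, `call` would wrap the actual provider request; the loop shape is what the commit's fix preserves across multiple deployments.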
ishaan-jaff
d9f5989a7f (docs) add docstring for batch_completion 2023-10-30 14:31:34 -07:00
ishaan-jaff
e0e468a56b (docs) docstring for completion_with_retries 2023-10-30 14:20:09 -07:00
ishaan-jaff
fed0fec658 (docs) add doc string for aembedding 2023-10-30 13:59:56 -07:00
ishaan-jaff
263c5055ac (fix) update acompletion docstring 2023-10-30 13:53:32 -07:00
ishaan-jaff
19497d0c9a (fix) remove bloat - ratelimitmanager 2023-10-27 18:11:39 -07:00
Krrish Dholakia
daa7aed7a4 fix(utils.py/completion_with_fallbacks): accept azure deployment name in rotations 2023-10-27 16:00:42 -07:00
Krrish Dholakia
715ea54544 fix(utils.py): adding support for anyscale models 2023-10-25 09:08:10 -07:00
Krrish Dholakia
98c25b08cd fix(vertex_ai.py): fix output parsing 2023-10-24 12:08:22 -07:00
ishaan-jaff
a0651533f6 (feat) add async embeddings 2023-10-23 13:59:37 -07:00
Krrish Dholakia
8072366a5e fix(main.py): multiple deployments fix - run in parallel 2023-10-21 14:28:50 -07:00
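A hedged sketch of the "run in parallel" idea in this fix: dispatch the same request to several deployments with a thread pool and take whichever answers first. The function and deployment names are illustrative only:

```python
# Illustrative parallel dispatch: submit one call per deployment and return
# the result of the first future to complete.
from concurrent.futures import FIRST_COMPLETED, ThreadPoolExecutor, wait
from typing import Callable, List

def first_response(deployments: List[str], call: Callable[[str], str]) -> str:
    with ThreadPoolExecutor(max_workers=len(deployments)) as pool:
        futures = [pool.submit(call, d) for d in deployments]
        done, _pending = wait(futures, return_when=FIRST_COMPLETED)
        return next(iter(done)).result()
```

A production version would also cancel or drain the losing futures and handle the case where the first completed call raised.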
ishaan-jaff
c8f89f3484 (fix) embedding() using get_llm_provider 2023-10-20 15:00:08 -07:00
ishaan-jaff
d4c81814f1 (feat) native perplexity support 2023-10-20 14:29:07 -07:00
Krrish Dholakia
dcf431dbbe feat(main.py): support multiple deployments in 1 completion call 2023-10-20 13:01:53 -07:00
Krrish Dholakia
2f9e112c14 fix(anthropic.py-+-bedrock.py): anthropic prompt format 2023-10-20 10:56:15 -07:00
Krrish Dholakia
18a6facdb3 fix: allow api base to be set for all providers
enables proxy use cases
2023-10-19 19:07:42 -07:00
Krrish Dholakia
a415c79b8b fix(anthropic.py): enable api base to be customized 2023-10-19 18:45:29 -07:00
Krrish Dholakia
44cafb5bac docs(proxy_server.md): update proxy server docs to include multi-agent autogen tutorial 2023-10-17 09:22:34 -07:00
Krrish Dholakia
5a893a685c refactor(main.py): clean up print statement 2023-10-16 17:53:09 -07:00
Krrish Dholakia
96cdee7d18 refactor(main.py): remove print statement 2023-10-16 07:33:15 -07:00
Krrish Dholakia
38fa9827d5 docs(main.py): adding docstring for text_completion 2023-10-16 07:31:48 -07:00
Zeeland
7b1d55d110 fix: llm_provider add openai finetune compatibility 2023-10-16 18:44:45 +08:00
ishaan-jaff
1e90f4ee9c (fix) deepinfra/llama should go to deepinfra not to openrouter 2023-10-14 16:47:25 -07:00
Krrish Dholakia
5f9dd0b21f bump: version 0.8.4 → 0.8.5 2023-10-14 16:43:06 -07:00
ishaan-jaff
1772478655 (feat) add doc string for embedding 2023-10-14 16:08:13 -07:00
Krrish Dholakia
fa0ff12570 fix(utils.py): read env variables for known openai-compatible apis (e.g. perplexity), dynamically from the environment 2023-10-13 22:43:32 -07:00
Krrish Dholakia
ec5e7aa4a9 fix(openai.py): adding support for exception mapping for openai-compatible apis via http calls 2023-10-13 21:56:51 -07:00
ishaan-jaff
30634492d2 (feat) set api_base, api_key, api_version for embedding() 2023-10-13 21:09:44 -07:00
Krrish Dholakia
74c0d5b7a0 feat(proxy_server): adding model fallbacks and default model to toml 2023-10-13 15:31:17 -07:00
ishaan-jaff
1a7ffbe7b8 (fix) Ollama use new streaming format 2023-10-11 17:00:39 -07:00
ishaan-jaff
64949ff5cf (feat) add CustomStreamWrapper for Ollama - match OpenAI streaming 2023-10-11 17:00:39 -07:00
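The CustomStreamWrapper commit adapts a provider's raw streamed text into OpenAI-style streaming chunks. A simplified stand-in for that shape (not litellm's exact class) looks like:

```python
# Sketch: wrap raw streamed text pieces into OpenAI-style chat completion
# chunks, ending with an empty delta and finish_reason "stop".
from typing import Dict, Iterable, Iterator

def to_openai_chunks(raw_pieces: Iterable[str], model: str) -> Iterator[Dict]:
    for piece in raw_pieces:
        yield {
            "object": "chat.completion.chunk",
            "model": model,
            "choices": [{"delta": {"content": piece}, "finish_reason": None}],
        }
    # final chunk signals the end of the stream, as OpenAI's API does
    yield {
        "object": "chat.completion.chunk",
        "model": model,
        "choices": [{"delta": {}, "finish_reason": "stop"}],
    }
```

Matching the OpenAI chunk shape lets callers iterate over Ollama streams with the same code they already use for OpenAI models.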
Krrish Dholakia
87e5f79924 fix(proxy_cli-and-utils.py): fixing how config file is read + inferring llm_provider for known openai endpoints 2023-10-10 20:53:02 -07:00
Krrish Dholakia
67b3e792ff fix(openai.py): enable custom proxy to pass in ca_bundle_path 2023-10-10 13:23:27 -07:00
Krrish Dholakia
cc0e4f4f9f fix: fix value error if model returns empty completion 2023-10-10 10:11:40 -07:00
Krrish Dholakia
4b0ba2ba47 fix(main.py): return n>1 response for openai text completion 2023-10-09 20:44:07 -07:00
Krrish Dholakia
6c7fa8a4aa fix(main.py): read openai org from env 2023-10-09 16:49:22 -07:00
ishaan-jaff
344fa14980 (fix) api_base, api_version and api_key 2023-10-09 14:11:05 -07:00
ishaan-jaff
9d6088f65c (fix) acompletion for ollama non-streaming 2023-10-09 13:47:08 -07:00
Krrish Dholakia
936548db40 feat(factory.py): option to add function details to prompt, if model doesn't support functions param 2023-10-09 09:53:53 -07:00
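The commit above adds an option to render function definitions into the prompt when a model lacks a native `functions` parameter. A hypothetical rendering (the exact template litellm uses may differ) could look like:

```python
# Hypothetical sketch: flatten OpenAI-style function definitions into plain
# prompt text for models that do not support a `functions` parameter.
import json
from typing import Dict, List

def functions_to_prompt(functions: List[Dict]) -> str:
    lines = ["You can call the following functions by replying with JSON:"]
    for fn in functions:
        lines.append(f"- {fn['name']}: {fn.get('description', '')}")
        lines.append(f"  parameters: {json.dumps(fn.get('parameters', {}))}")
    return "\n".join(lines)
```

The rendered text would then be prepended to the user prompt, so the model can describe a function call in its completion even without first-class tool support.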
ishaan-jaff
7ee9294c07 (feat) add api_key, api_base, api_version to completion 2023-10-09 08:13:12 -07:00
Krrish Dholakia
55bd413585 style(main.py): clean up print statement 2023-10-07 15:43:40 -07:00
Krrish Dholakia
cf7e2595b8 fix(utils): adds complete streaming response to success handler 2023-10-07 15:42:00 -07:00
Krrish Dholakia
c6d36fb59d docs(completion-docs): adds more details on provider-specific params 2023-10-07 13:49:30 -07:00