Author | Commit | Message | Date
ishaan-jaff | 93b1df1c79 | (feat) embedding - pass model_info, proxy_server request | 2023-12-08 14:26:18 -08:00
ishaan-jaff | be94a8c478 | (feat) pass model_info, proxy_server_request to callback | 2023-12-08 14:26:18 -08:00
ishaan-jaff | 762f28e4d7 | (fix) make print_verbose non blocking | 2023-12-07 17:31:32 -08:00
Krrish Dholakia | c1e95740b0 | fix(bedrock.py): fix output format for cohere embeddings | 2023-12-06 22:47:01 -08:00
ishaan-jaff | e3b24ec797 | (feat) aembedding - add custom logging support | 2023-12-06 19:09:06 -08:00
Krrish Dholakia | 94f065f83c | feat(sagemaker.py): support huggingface embedding models | 2023-12-06 11:41:38 -08:00
Krrish Dholakia | 54d8a9df3f | fix(sagemaker.py): enable passing hf model name for prompt template | 2023-12-05 16:31:59 -08:00
Krrish Dholakia | b4c78c7b9e | fix(utils.py): support sagemaker llama2 custom endpoints | 2023-12-05 16:05:15 -08:00
Krrish Dholakia | 69a4497550 | fix(main.py): accept user in embedding() | 2023-12-02 21:49:23 -08:00
estill01 | 737abbb0c1 | fix | 2023-12-03 05:37:57 +00:00
estill01 | 82fbbf67ca | Fix; persistent 'model' default value | 2023-12-03 05:34:24 +00:00
Krrish Dholakia | bb4f82066a | fix(main.py): only send user if set | 2023-12-02 20:36:30 -08:00
Krrish Dholakia | f72dd24ab9 | fix(main.py): set user to none if not passed in | 2023-12-02 20:08:25 -08:00
Krrish Dholakia | 6c0eec4ff4 | fix(main.py): fix pydantic warning for usage dict | 2023-12-02 20:02:55 -08:00
estill01 | 56e95197c6 | Enable setting default model value for Completions (add `model` arg to `Completions` class; if you provide a value, it will be used when you create new completions from an instance of the class) | 2023-12-02 19:50:18 -08:00
Krrish Dholakia | 82553e8aac | fix(azure.py): fix linting errors | 2023-11-30 13:32:29 -08:00
Krrish Dholakia | 4f07c8565a | feat(main.py): add support for azure-openai via cloudflare ai gateway | 2023-11-30 13:19:49 -08:00
Krrish Dholakia | 01c7e18f31 | fix(utils.py): include system fingerprint in streaming response object | 2023-11-30 08:45:52 -08:00
Krrish Dholakia | 0d200cd8dc | feat(main.py): allow updating model cost via completion() | 2023-11-29 20:14:39 -08:00
Krrish Dholakia | c312ac4ca8 | fix(main.py): don't pass stream to petals | 2023-11-29 19:58:04 -08:00
Krrish Dholakia | 1f5a1122fc | fix(replicate.py): fix custom prompt formatting | 2023-11-29 19:44:09 -08:00
ishaan-jaff | c05da0797b | (feat) Embedding: Async Azure | 2023-11-29 19:43:47 -08:00
ishaan-jaff | 09caab549a | (feat) async embeddings: OpenAI | 2023-11-29 19:35:08 -08:00
Krrish Dholakia | 61185aa12c | fix(main.py): fix null finish reason issue for ollama | 2023-11-29 16:50:11 -08:00
Krrish Dholakia | ab76daa90b | fix(bedrock.py): support ai21 / bedrock streaming | 2023-11-29 16:35:06 -08:00
ishaan-jaff | da75b15176 | (feat) completion: add rpm, tpm as litellm params | 2023-11-29 16:19:05 -08:00
Krrish Dholakia | 6c98715b94 | bump: version 1.7.13 → 1.7.14 | 2023-11-29 15:19:18 -08:00
Krrish Dholakia | 451851e6a4 | fix(main.py): have stream_chunk_builder return successful response even if token_counter fails | 2023-11-29 15:19:11 -08:00
Krrish Dholakia | b6bc75e27a | fix(utils.py): fix parallel tool calling when streaming | 2023-11-29 10:56:21 -08:00
ishaan-jaff | 6f71299bb0 | (fix) embedding pop out client from params | 2023-11-28 21:22:01 -08:00
Krrish Dholakia | 383dd53e86 | fix(main.py): passing client as a litellm-specific kwarg | 2023-11-28 21:20:05 -08:00
ishaan-jaff | 282b9a37e5 | (fix) router: passing client | 2023-11-28 16:34:16 -08:00
ishaan-jaff | 8ac7801283 | (feat) completion:openai pass OpenAI client | 2023-11-28 16:05:01 -08:00
ishaan-jaff | 400a268934 | (feat) completion: Azure allow users to pass client to router | 2023-11-28 15:56:52 -08:00
ishaan-jaff | 7914623fbc | (feat) allow users to pass azure client for acmompletion | 2023-11-28 15:44:56 -08:00
Krrish Dholakia | 150b91d476 | fix(utils.py): fix streaming on-success logging | 2023-11-28 09:11:47 -08:00
ishaan-jaff | 224a028ab6 | (fix) completion: AZURE_OPENAI_API_KEY | 2023-11-28 08:06:06 -08:00
Krrish Dholakia | be9fa06da6 | fix(main.py): fix linting errors | 2023-11-27 19:11:38 -08:00
Krrish Dholakia | 4cdd930fa2 | fix(stream_chunk_builder): adding support for tool calling in completion counting | 2023-11-27 18:39:47 -08:00
Krrish Dholakia | 04f745e314 | fix(router.py): speed improvements to the router | 2023-11-27 17:35:26 -08:00
Krrish Dholakia | 56bb39e52c | fix(acompletion): fix acompletion raise exception issue when custom llm provider is none | 2023-11-27 11:34:48 -08:00
ishaan-jaff | 6baaf629e5 | (fix) embedding: filter out metadata from optional_params | 2023-11-25 11:10:06 -08:00
Krrish Dholakia | e732fb8b97 | fix(main.py): logit bias mapping for batch_completions | 2023-11-24 16:05:51 -08:00
Krrish Dholakia | 1a6ea20a0b | fix(main.py): fixing linting issues | 2023-11-24 15:25:51 -08:00
ishaan-jaff | a6bea946b3 | (fix) completion: when logit bias is None | 2023-11-24 14:01:21 -08:00
Krrish Dholakia | 2e8d582a34 | fix(proxy_server.py): fix linting issues | 2023-11-24 11:39:01 -08:00
Krrish Dholakia | 4a5dae3941 | fix(main.py): fix streaming_chunk_builder to return usage | 2023-11-24 11:27:04 -08:00
Krrish Dholakia | bfaed56ffb | fix(main.py): only set api key to dynamic api key if it's not none | 2023-11-23 16:45:44 -08:00
Krrish Dholakia | c074023e14 | fix: fix linting issues | 2023-11-23 13:47:43 -08:00
Krrish Dholakia | 4f183dc6a0 | fix(utils.py): support reading api keys dynamically from the os environment | 2023-11-23 13:41:56 -08:00