Author | Commit | Message | Date
ishaan-jaff | 76c007d542 | (test) promptlayer logger | 2023-11-02 08:18:00 -07:00
ishaan-jaff | 2c18f7d374 | (fix) logging callbacks - promptlayer | 2023-11-02 08:18:00 -07:00
Krrish Dholakia | 1bef2c62a6 | fix(proxy_server.py): fix linting issues | 2023-11-02 08:07:20 -07:00
Krrish Dholakia | 740460f390 | fix(main.py): expose custom llm provider for text completions | 2023-11-02 07:55:54 -07:00
Krrish Dholakia | cb20554c79 | fix(proxy_server.py): accept single quote json body | 2023-11-02 07:07:38 -07:00
Krrish Dholakia | a12d0508c4 | fix(proxy_server.py): fix v1/models get request | 2023-11-02 06:51:47 -07:00
ishaan-jaff | 0a41443c3e | (test) hf text_completion | 2023-11-01 18:25:13 -07:00
ishaan-jaff | 8b5ee89d82 | (feat) add transform_logprobs for text_completion | 2023-11-01 18:25:13 -07:00
ishaan-jaff | 8ca7af3a63 | (feat) text completion set top_n_tokens for tgi | 2023-11-01 18:25:13 -07:00
ishaan-jaff | 19737f95c5 | (feat) proxy add testing for openai.Completion.create | 2023-11-01 18:25:13 -07:00
ishaan-jaff | bc61c81cc6 | (feat) proxy server add route engines/{model:path}/completions | 2023-11-01 18:25:13 -07:00
ishaan-jaff | 700fcfa5fb | (feat) text_completion add transform_logprobs | 2023-11-01 18:25:13 -07:00
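The text_completion commits above wire Hugging Face (TGI) log probabilities into the OpenAI text-completion response shape. A minimal usage sketch, assuming a TGI-hosted model name and that the installed litellm version passes logprobs/echo through for Hugging Face models; the model name and parameter values are illustrative, not taken from the commits:

```python
import litellm

# Hypothetical sketch: "huggingface/bigcode/starcoder" is an assumed model name.
response = litellm.text_completion(
    model="huggingface/bigcode/starcoder",
    prompt="def fibonacci(n):",
    max_tokens=50,
    logprobs=3,   # request per-token logprobs (mapped to TGI's top_n_tokens per the commits above)
    echo=True,    # include prompt tokens in the returned logprobs
)

# The response follows the OpenAI text-completion format.
print(response["choices"][0]["text"])
```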
ishaan-jaff | 1d0ce77baf | (fix) improve litellm.set_verbose=True logging | 2023-11-01 18:25:13 -07:00
Krrish Dholakia | b305492a0b | fix(huggingface_restapi.py): fix linting issue | 2023-11-01 16:43:35 -07:00
Krrish Dholakia | 2c4cb76ce5 | fix(huggingface_restapi.py): fix embeddings for sentence-transformer models | 2023-11-01 16:36:46 -07:00
Krrish Dholakia | ab0a29e160 | fix(proxy_server.py): return all locally available ollama models | 2023-11-01 16:20:26 -07:00
ishaan-jaff | d492bca05e | (test) test_stream_chunk_builder | 2023-11-01 14:54:00 -07:00
ishaan-jaff | 863867fe00 | (fix) stream_chunk_builder | 2023-11-01 14:53:09 -07:00
ishaan-jaff | 3fecab30d8 | (test) amazon.titan-embed-text-v1 | 2023-11-01 14:46:46 -07:00
ishaan-jaff | 9cfd218101 | (feat) detect amazon.titan-embed-text-v1 as bedrock embedding model | 2023-11-01 14:46:33 -07:00
Krrish Dholakia | f9ff03d5af | fix(utils.py): mapping stop sequences for palm | 2023-11-01 14:00:45 -07:00
ishaan-jaff | a46d6a2dc9 | (test) add bedrock/amazon.titan-embed-text-v1 | 2023-11-01 13:55:28 -07:00
ishaan-jaff | 2ad81bdd7b | (feat) embedding() add bedrock/amazon.titan-embed-text-v1 | 2023-11-01 13:55:28 -07:00
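Commits 9cfd218101 and 2ad81bdd7b above add amazon.titan-embed-text-v1 as a Bedrock embedding model. A minimal sketch of how that would be called through litellm.embedding(), assuming AWS credentials are supplied via environment variables; the credential placeholders and region are assumptions, not values from the commits:

```python
import os
import litellm

# Assumed AWS setup for Bedrock access (placeholders, not real values).
os.environ["AWS_ACCESS_KEY_ID"] = "..."
os.environ["AWS_SECRET_ACCESS_KEY"] = "..."
os.environ["AWS_REGION_NAME"] = "us-west-2"

response = litellm.embedding(
    model="bedrock/amazon.titan-embed-text-v1",
    input=["good morning from litellm"],
)
print(response["data"][0]["embedding"][:5])  # first few dimensions of the embedding vector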
Krrish Dholakia | fb4be198ee | docs(routing.md): adding context window fallback dict and num retries | 2023-11-01 13:52:18 -07:00
ishaan-jaff | d72cd8576e | (test) use litellm.api_version | 2023-11-01 11:52:01 -07:00
ishaan-jaff | 317c290b69 | (test) add num retries to caching testing | 2023-11-01 11:42:40 -07:00
ishaan-jaff | 01d90691f9 | (docs) add num_retries to docstring | 2023-11-01 10:55:56 -07:00
ishaan-jaff | 70885bdba6 | (test) stream chunk builder | 2023-11-01 08:38:19 -07:00
stefan | bbc82f3afa | Use supplied headers | 2023-11-01 20:31:16 +07:00
ishaan-jaff | f6983223f9 | (test) track usage in custom callback streaming | 2023-10-31 23:05:42 -07:00
ishaan-jaff | d1f2593dc0 | (fix) add usage tracking in callback | 2023-10-31 23:02:54 -07:00
Krrish Dholakia | 7762ae7762 | feat(utils.py): accept context window fallback dictionary | 2023-10-31 22:32:36 -07:00
ishaan-jaff | 2f6fed30ad | (test) bedrock.cohere testing | 2023-10-31 22:26:43 -07:00
ishaan-jaff | fa7e063198 | (feat) add bedrock.cohere streaming | 2023-10-31 22:26:43 -07:00
Krrish Dholakia | 0ffb1dcc38 | test(test_completion_with_retries.py): cleanup tests | 2023-10-31 21:49:33 -07:00
Krrish Dholakia | f3efd566c9 | style(main.py): fix linting issues | 2023-10-31 19:23:14 -07:00
Krrish Dholakia | 125642563c | feat(completion()): adding num_retries (https://github.com/BerriAI/litellm/issues/728) | 2023-10-31 19:14:55 -07:00
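Commit 125642563c adds num_retries to completion(), and 7762ae7762 adds the context window fallback dictionary documented in fb4be198ee. A minimal sketch combining the two, assuming OpenAI-style model names; the model names and retry count are illustrative assumptions, not values from the commits:

```python
import litellm

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize this very long document ..."}],
    num_retries=3,  # retry the call on transient failures
    context_window_fallback_dict={
        # if the request exceeds gpt-3.5-turbo's context window, retry on the larger model
        "gpt-3.5-turbo": "gpt-3.5-turbo-16k",
    },
)
print(response["choices"][0]["message"]["content"])
```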
Krrish Dholakia | 6ead8d8c18 | fix(caching.py): fixing pr issues | 2023-10-31 18:32:40 -07:00
ishaan-jaff | f016234cbf | (test) text_completion add echo for hf | 2023-10-31 18:20:59 -07:00
ishaan-jaff | ce462824be | (feat) add support for echo for HF logprobs | 2023-10-31 18:20:59 -07:00
ishaan-jaff | e180bba541 | (test) text completion add logprobs to test | 2023-10-31 17:15:35 -07:00
ishaan-jaff | 9223f7cc7a | (feat) textcompletion - transform hf log probs to openai text completion | 2023-10-31 17:15:35 -07:00
Krrish Dholakia | 4d95756432 | test(test_completion.py): re-add bedrock + sagemaker testing | 2023-10-31 16:49:13 -07:00
Krish Dholakia | 523c08a646 | Merge pull request #717 from canada4663/main (support for custom bedrock runtime endpoint) | 2023-10-31 16:47:33 -07:00
Krish Dholakia | 9bef396d04 | Merge pull request #722 from karvetskiy/fix-router-caching (Fix caching for Router) | 2023-10-31 16:39:18 -07:00
ishaan-jaff | 85692398c1 | (test) add support for returning logprobs | 2023-10-31 15:33:45 -07:00
ishaan-jaff | 1ada3a13d4 | (feat) text_completion return raw response for davinci003 | 2023-10-31 15:32:04 -07:00
ishaan-jaff | de47058e32 | (feat) text_completion return raw openai response for text_completion requests | 2023-10-31 15:31:24 -07:00
ishaan-jaff | 4875af17a1 | (fix) linting errors | 2023-10-31 14:43:10 -07:00
ishaan-jaff | b301e4ead7 | (test) text_completion pass prompt as array | 2023-10-31 14:29:43 -07:00