Author | Commit | Message | Date
ishaan-jaff | 07ea2e26f4 | (fix) proxy - remove bloat - deprecated log_input_output | 2023-12-27 17:36:53 +05:30
ishaan-jaff | 74ea0832ca | (feat) proxy remove appdirs as a dep | 2023-12-27 17:33:47 +05:30
ishaan-jaff | af15e49948 | (feat) proxy - remove subprocess install | 2023-12-27 17:28:03 +05:30
Krrish Dholakia | e516cfe9f5 | fix(utils.py): allow text completion input to be either model or engine | 2023-12-27 17:24:16 +05:30
ishaan-jaff | 7864d9a027 | (fix) openai + stream - logprobs check | 2023-12-27 16:59:56 +05:30
Krrish Dholakia | 10a74d02c1 | test(test_router_fallbacks.py): fix test to check previous models in pre_api_call not on success | 2023-12-27 16:34:40 +05:30
ishaan-jaff | f6f8d1a9df | (test) langfuse beta test - text_completion | 2023-12-27 15:45:40 +05:30
ishaan-jaff | 8528d9f809 | (test) gpt-3.5-turbo-instruct finish reason | 2023-12-27 15:45:40 +05:30
ishaan-jaff | 646c106983 | (feat) text-completion-openai, send 1 finish_reason | 2023-12-27 15:45:40 +05:30
Krrish Dholakia | 31148922b3 | fix(azure.py): raise streaming exceptions | 2023-12-27 15:43:13 +05:30
ishaan-jaff | f4fe2575cc | (fix) use client for text_completion() | 2023-12-27 15:20:26 +05:30
ishaan-jaff | e70f588b87 | (fix) text_completion use correct finish reason | 2023-12-27 15:20:26 +05:30
ishaan-jaff | db135aea4c | (test) fix langfuse test | 2023-12-27 15:20:26 +05:30
Krrish Dholakia | 6d63c0015b | test(test_router_fallbacks.py): add testing for sync streaming fallbacks | 2023-12-27 15:10:43 +05:30
Krrish Dholakia | c9fdbaf898 | fix(azure.py,-openai.py): correctly raise errors if streaming calls fail | 2023-12-27 15:08:37 +05:30
Krrish Dholakia | 9ba520cc8b | fix(google_kms.py): support enums for key management system | 2023-12-27 13:19:33 +05:30
ishaan-jaff | 4cc59d21d0 | (feat) add text_completion, atext_completion CallTypes | 2023-12-27 12:24:16 +05:30
Krrish Dholakia | c88a8d71f0 | fix: fix linting issues | 2023-12-27 12:21:31 +05:30
Krish Dholakia | 5c3a61d62f | Merge pull request #1248 from danikhan632/main: updated oobabooga to new api and support for embeddings | 2023-12-27 11:33:56 +05:30
Ishaan Jaff | 22d0c21829 | Merge pull request #1249 from evantancy/main: fix: helicone logging | 2023-12-27 11:24:19 +05:30
evantancy | 668c786099 | fix: helicone logging | 2023-12-27 12:16:29 +08:00
dan | c4dfd9be7c | updated oobabooga to new api and support for embeddings | 2023-12-26 19:45:28 -05:00
ishaan-jaff | db90ccd19f | (test) local ollama_chat | 2023-12-26 20:11:14 +05:30
ishaan-jaff | 3f6e6e7f55 | (fix) ollama_chat - support function calling + fix for comp | 2023-12-26 20:07:55 +05:30
ishaan-jaff | 0b8d9d177b | (test) ollama_chat acompletion without stream | 2023-12-26 20:01:51 +05:30
ishaan-jaff | 3839213d28 | (feat) ollama_chat acompletion without streaming | 2023-12-26 20:01:51 +05:30
ishaan-jaff | 751d57379d | (fix) support ollama_chat for acompletion | 2023-12-26 20:01:51 +05:30
Krrish Dholakia | 235526625d | feat(proxy_server.py): support maxage cache control | 2023-12-26 17:50:27 +05:30
Krrish Dholakia | f0b6b9dce2 | fix(main.py): support ttl being set for completion, embedding, image generation calls | 2023-12-26 17:22:40 +05:30
Krrish Dholakia | 167a6ba319 | refactor(google_kms.py): fix linting issue | 2023-12-26 16:21:35 +05:30
ishaan-jaff | ef074c707a | (fix) streaming logprobs=None | 2023-12-26 15:42:51 +05:30
Krrish Dholakia | 2070a785a4 | feat(utils.py): support google kms for secret management (https://github.com/BerriAI/litellm/issues/1235) | 2023-12-26 15:39:40 +05:30
ishaan-jaff | e29dcf595e | (test) azure gpt-vision | 2023-12-26 15:24:20 +05:30
ishaan-jaff | 5643658dac | (test) stream + logprobs openai | 2023-12-26 15:15:37 +05:30
ishaan-jaff | 6406046d38 | (feat) logprobs for streaming openai | 2023-12-26 15:15:05 +05:30
ishaan-jaff | a463625452 | (chore) completion - move functions lower | 2023-12-26 14:35:59 +05:30
ishaan-jaff | 9c855a9478 | (fix) optional params - openai/azure. don't overwrite it | 2023-12-26 14:32:59 +05:30
ishaan-jaff | b9f2262d5f | (test) openai logprobs | 2023-12-26 14:00:42 +05:30
ishaan-jaff | 7b097305c1 | (feat) support logprobs, top_logprobs openai | 2023-12-26 14:00:42 +05:30
Krrish Dholakia | 871f207124 | docs(user_keys.md): docs on passing user keys to litellm proxy | 2023-12-26 13:55:28 +05:30
ishaan-jaff | 0b0d22d58c | (feat) add logprobs, top_logprobs to litellm.completion | 2023-12-26 13:39:48 +05:30
ishaan-jaff | ae074814c9 | (test) azure gpt-4 vision test | 2023-12-26 13:18:38 +05:30
Krrish Dholakia | f5ed4992db | fix(router.py): accept dynamic api key | 2023-12-26 13:16:22 +05:30
ishaan-jaff | d273d19bd9 | (feat) proxy, use --model with --test | 2023-12-26 09:40:58 +05:30
ishaan-jaff | 2d5801b69e | (feat) add langfuse logging tests to ci/cd | 2023-12-26 09:16:13 +05:30
ishaan-jaff | 18676bb560 | (fix) langfuse - asycn logger | 2023-12-26 08:49:49 +05:30
ishaan-jaff | 3f15d7230f | (test) ollama cht | 2023-12-26 00:08:35 +05:30
ishaan-jaff | b9123a63fa | bump: version 1.15.7 → 1.15.8 | 2023-12-26 00:07:57 +05:30
ishaan-jaff | 2690fe6dca | (test) ollama-chat | 2023-12-25 23:59:24 +05:30
ishaan-jaff | 837ce269ae | (feat) ollama_chat add async stream | 2023-12-25 23:45:27 +05:30
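Several entries above land logprobs support (`(feat) support logprobs, top_logprobs openai`, `(feat) add logprobs, top_logprobs to litellm.completion`, `(feat) logprobs for streaming openai`). Below is a minimal sketch of how that surface is typically exercised, assuming the parameters mirror the OpenAI chat-completions options of the same names; the model name and `top_logprobs` value are illustrative and not taken from these commits.

```python
# Sketch only: exercises the logprobs/top_logprobs parameters referenced in the
# commits above. Model name and parameter values are illustrative assumptions.
import litellm

response = litellm.completion(
    model="gpt-3.5-turbo",                              # any OpenAI chat model
    messages=[{"role": "user", "content": "Hello!"}],
    logprobs=True,                                      # request token log-probabilities
    top_logprobs=2,                                     # alternatives per token (example value)
)

# When the provider returns them, log-probabilities sit on the first choice;
# they may be None for providers or models that do not support the feature.
print(response.choices[0].logprobs)
```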