evantancy | 668c786099 | fix: helicone logging | 2023-12-27 12:16:29 +08:00
dan | c4dfd9be7c | updated oobabooga to new api and support for embeddings | 2023-12-26 19:45:28 -05:00
Prateek Sachan | 940c33680e | updating anyscale link to latest json present in main branch | 2023-12-27 01:41:17 +05:30
Prateek Sachan | 2acafb7d31 | updating anyscale link to latest json present in main branch | 2023-12-27 01:37:21 +05:30
Krrish Dholakia | 31fbb095c2 | docs(user_keys.md): update code example | 2023-12-26 22:16:18 +05:30
ishaan-jaff | db90ccd19f | (test) local ollama_chat | 2023-12-26 20:11:14 +05:30
ishaan-jaff | 3f6e6e7f55 | (fix) ollama_chat - support function calling + fix for comp | 2023-12-26 20:07:55 +05:30
ishaan-jaff | 0b8d9d177b | (test) ollama_chat acompletion without stream | 2023-12-26 20:01:51 +05:30
ishaan-jaff | 3839213d28 | (feat) ollama_chat acompletion without streaming | 2023-12-26 20:01:51 +05:30
ishaan-jaff | 751d57379d | (fix) support ollama_chat for acompletion | 2023-12-26 20:01:51 +05:30
Krrish Dholakia | 43a345ca13 | bump: version 1.15.9 → 1.15.10 | 2023-12-26 17:54:21 +05:30
Krrish Dholakia | 1b837e9601 | docs(caching.md): add cache controls to docs | 2023-12-26 17:54:14 +05:30
Krrish Dholakia | 235526625d | feat(proxy_server.py): support maxage cache control | 2023-12-26 17:50:27 +05:30
ishaan-jaff | a5f998375c | (docs) use latest ghcr image | 2023-12-26 17:28:39 +05:30
ishaan-jaff | e9b9323b75 | bump: version 1.15.8 → 1.15.9 | 2023-12-26 17:24:12 +05:30
Krrish Dholakia | f0b6b9dce2 | fix(main.py): support ttl being set for completion, embedding, image generation calls | 2023-12-26 17:22:40 +05:30
Krish Dholakia | dfd2f68c07 | Update README.md | 2023-12-26 16:26:22 +05:30
Krrish Dholakia | 167a6ba319 | refactor(google_kms.py): fix linting issue | 2023-12-26 16:21:35 +05:30
ishaan-jaff | ef074c707a | (fix) streaming logprobs=None | 2023-12-26 15:42:51 +05:30
Krrish Dholakia | 2070a785a4 | feat(utils.py): support google kms for secret management (https://github.com/BerriAI/litellm/issues/1235) | 2023-12-26 15:39:40 +05:30
ishaan-jaff | e29dcf595e | (test) azure gpt-vision | 2023-12-26 15:24:20 +05:30
ishaan-jaff | 5643658dac | (test) stream + logprobs openai | 2023-12-26 15:15:37 +05:30
ishaan-jaff | 6406046d38 | (feat) logprobs for streaming openai | 2023-12-26 15:15:05 +05:30
ishaan-jaff | 112ffa3596 | (docs) add logprobs, top_logprobs | 2023-12-26 14:53:30 +05:30
ishaan-jaff | f3844b309f | (docs) input params - add logprobs, top_logprobs | 2023-12-26 14:41:33 +05:30
ishaan-jaff | b2e69cbc24 | (docs) input params | 2023-12-26 14:36:19 +05:30
ishaan-jaff | a463625452 | (chore) completion - move functions lower | 2023-12-26 14:35:59 +05:30
ishaan-jaff | 9c855a9478 | (fix) optional params - openai/azure. don't overwrite it | 2023-12-26 14:32:59 +05:30
ishaan-jaff | b9f2262d5f | (test) openai logprobs | 2023-12-26 14:00:42 +05:30
ishaan-jaff | 7b097305c1 | (feat) support logprobs, top_logprobs openai | 2023-12-26 14:00:42 +05:30
Krrish Dholakia | 871f207124 | docs(user_keys.md): docs on passing user keys to litellm proxy | 2023-12-26 13:55:28 +05:30
ishaan-jaff | 0b0d22d58c | (feat) add logprobs, top_logprobs to litellm.completion | 2023-12-26 13:39:48 +05:30
ishaan-jaff | ae074814c9 | (test) azure gpt-4 vision test | 2023-12-26 13:18:38 +05:30
ishaan-jaff | 08ed509b76 | (docs) azure gpt-4 vision | 2023-12-26 13:18:38 +05:30
Krrish Dholakia | f5ed4992db | fix(router.py): accept dynamic api key | 2023-12-26 13:16:22 +05:30
Krish Dholakia | 3029e8a197 | Update README.md | 2023-12-26 12:29:44 +05:30
Krish Dholakia | fde7c0ec97 | Update README.md | 2023-12-26 12:27:37 +05:30
Ishaan Jaff | d915fb8729 | Update README.md | 2023-12-26 11:57:00 +05:30
Ishaan Jaff | 87cc96b590 | Update ghcr_deploy.yml | 2023-12-26 11:15:31 +05:30
ishaan-jaff | a55e7fbc44 | (ci/cd) use -latest tag if none provided | 2023-12-26 11:11:15 +05:30
ishaan-jaff | 623716fa89 | (feat) add ollama async_generator to req.txt | 2023-12-26 10:52:12 +05:30
ishaan-jaff | 35765e02cf | (fix) proxy req.txt, langfuse | 2023-12-26 10:43:29 +05:30
ishaan-jaff | 094f6355e4 | (chore) fix ci/cd yaml format | 2023-12-26 10:39:29 +05:30
ishaan-jaff | 157b21c591 | (ci/cd) deploy new litellm image, after pass ci/cd | 2023-12-26 10:33:05 +05:30
ishaan-jaff | 01deb08570 | (ci/cd) rename ghcr workflow | 2023-12-26 10:09:15 +05:30
ishaan-jaff | d273d19bd9 | (feat) proxy, use --model with --test | 2023-12-26 09:40:58 +05:30
ishaan-jaff | 05b9b3aacd | (docs) bump langfuse to >= 2.0.0 | 2023-12-26 09:23:45 +05:30
ishaan-jaff | 2d5801b69e | (feat) add langfuse logging tests to ci/cd | 2023-12-26 09:16:13 +05:30
ishaan-jaff | 7346b1638c | (chore) gitignore langfuse.log | 2023-12-26 09:09:32 +05:30
ishaan-jaff | b004cc05d3 | (chore) gitignore langfuse.log | 2023-12-26 09:08:16 +05:30