Author | Hash | Subject | Date
ishaan-jaff | fa4a533e91 | (fix) timeout optional param | 2023-12-30 11:07:52 +05:30
ishaan-jaff | d5cbef4e36 | (feat) proxy - support dynamic timeout per request | 2023-12-30 10:55:42 +05:30
Marmik Pandya | 1faad4b0c1 | add support for mistral json mode via anyscale | 2023-12-29 22:26:22 +05:30
ishaan-jaff | 7afc022ad3 | (fix) counting streaming prompt tokens - azure | 2023-12-29 16:13:52 +05:30
ishaan-jaff | 4f832bce52 | (fix) token_counter for tool calling | 2023-12-29 15:54:03 +05:30
ishaan-jaff | 806551ff99 | (fix) use openai token counter for azure llms | 2023-12-29 15:37:46 +05:30
ishaan-jaff | 70376d3a4f | (feat) azure stream - count correct prompt tokens | 2023-12-29 15:15:39 +05:30
ishaan-jaff | 8475fddc78 | (feat) cloudflare - add exception mapping | 2023-12-29 12:31:10 +05:30
ishaan-jaff | 27f8598867 | (feat) add cloudflare streaming | 2023-12-29 12:01:26 +05:30
ishaan-jaff | c69f4f17a5 | (feat) cloudflare - add optional params | 2023-12-29 11:50:09 +05:30
ishaan-jaff | 5d31bea9e0 | (fix) tg AI cost tracking - zero-one-ai/Yi-34B-Chat | 2023-12-29 09:14:07 +05:30
ishaan-jaff | 362bed6ca3 | (fix) together_ai cost tracking | 2023-12-28 22:11:08 +05:30
Krrish Dholakia | 5a48dac83f | fix(vertex_ai.py): support function calling for gemini | 2023-12-28 19:07:04 +05:30
ishaan-jaff | 2a147579ec | (feat) add voyage ai embeddings | 2023-12-28 17:10:15 +05:30
Krrish Dholakia | 507b6bf96e | fix(utils.py): use local tiktoken copy | 2023-12-28 11:22:33 +05:30
Krrish Dholakia | 606de01ac0 | fix(utils.py): allow text completion input to be either model or engine | 2023-12-27 17:24:16 +05:30
ishaan-jaff | 5f9e18c4c0 | (fix) openai + stream - logprobs check | 2023-12-27 16:59:56 +05:30
ishaan-jaff | c65d9a8b54 | (feat) text-completion-openai, send 1 finish_reason | 2023-12-27 15:45:40 +05:30
ishaan-jaff | 592bcd5eea | (fix) text_completion use correct finish reason | 2023-12-27 15:20:26 +05:30
Krrish Dholakia | fd5e6efb1d | fix(azure.py,-openai.py): correctly raise errors if streaming calls fail | 2023-12-27 15:08:37 +05:30
Krrish Dholakia | 85549c3d66 | fix(google_kms.py): support enums for key management system | 2023-12-27 13:19:33 +05:30
ishaan-jaff | 021d7fab65 | (feat) add text_completion, atext_completion CallTypes | 2023-12-27 12:24:16 +05:30
ishaan-jaff | 99c86bf890 | (fix) streaming logprobs=None | 2023-12-26 15:42:51 +05:30
Krrish Dholakia | 6f695838e5 | feat(utils.py): support google kms for secret management https://github.com/BerriAI/litellm/issues/1235 | 2023-12-26 15:39:40 +05:30
ishaan-jaff | 9c6525e4e2 | (feat) logprobs for streaming openai | 2023-12-26 15:15:05 +05:30
ishaan-jaff | 0428a5cc04 | (fix) optional params - openai/azure. don't overwrite it | 2023-12-26 14:32:59 +05:30
ishaan-jaff | c1b1d0d15d | (feat) support logprobs, top_logprobs openai | 2023-12-26 14:00:42 +05:30
ishaan-jaff | 109f82efee | (fix) langfuse - asycn logger | 2023-12-26 08:49:49 +05:30
ishaan-jaff | c199d4c1fc | (feat) ollama_chat - add async streaming | 2023-12-25 23:45:01 +05:30
ishaan-jaff | 0f4b5a1446 | (feat) add ollama_chat exception mapping | 2023-12-25 23:43:14 +05:30
ishaan-jaff | 35a68665d1 | (feat) ollama_chat - streaming | 2023-12-25 23:38:47 +05:30
ishaan-jaff | 763ba913ec | utils - convert ollama_chat params | 2023-12-25 23:04:17 +05:30
Krrish Dholakia | 79978c44ba | refactor: add black formatting | 2023-12-25 14:11:20 +05:30
Krrish Dholakia | 6d73a77b01 | fix(proxy_server.py): raise streaming exceptions | 2023-12-25 07:18:09 +05:30
Krrish Dholakia | d1dea7c87d | fix(utils.py): log user_id to langfuse | 2023-12-23 12:14:09 +05:30
Krrish Dholakia | 1878392f64 | bump: version 1.15.6 → 1.15.7 | 2023-12-23 10:03:49 +05:30
Krrish Dholakia | c8d3a609e1 | fix(langsmith.py): fix langsmith streaming logging | 2023-12-23 10:02:35 +05:30
Krrish Dholakia | f5ffea471d | fix(utils.py): handle ollama yielding a dict | 2023-12-22 12:23:42 +05:30
Krrish Dholakia | e3d486efe2 | fix(utils.py): handle 'os.environ/' being passed in as kwargs | 2023-12-22 11:08:44 +05:30
Krrish Dholakia | c084f04a35 | fix(router.py): add support for async image generation endpoints | 2023-12-21 14:38:44 +05:30
Krrish Dholakia | 8c7d62e62d | fix(utils.py): fix non_default_param pop error for ollama | 2023-12-21 06:59:13 +05:30
Krrish Dholakia | 77b11daf28 | fix(utils.py): add support for anyscale function calling | 2023-12-20 17:48:33 +05:30
Krrish Dholakia | 23d0278739 | feat(azure.py): add support for azure image generations endpoint | 2023-12-20 16:37:21 +05:30
Krrish Dholakia | 636ac9b605 | feat(ollama.py): add support for ollama function calling | 2023-12-20 14:59:55 +05:30
Krrish Dholakia | b0300392b9 | fix(utils.py): vertex ai exception mapping | 2023-12-19 15:25:29 +00:00
Krrish Dholakia | 40a9d62de9 | fix(ollama.py): raise async errors | 2023-12-19 15:01:12 +00:00
ishaan-jaff | 97df44396b | (feat) add open router transforms, models, route | 2023-12-18 09:55:35 +05:30
ishaan-jaff | d3c1c4bf28 | (feat) set default openrouter configs | 2023-12-18 08:55:51 +05:30
Krrish Dholakia | 51cb16a015 | feat(main.py): add support for image generation endpoint | 2023-12-16 21:07:29 -08:00
Krrish Dholakia | e62327dd92 | fix(traceloop.py): add additional openllmetry traces | 2023-12-16 19:21:39 -08:00