Commit graph

763 commits

Author SHA1 Message Date
ishaan-jaff  0e8809abf2  (feat) add xinference as an embedding provider  2024-01-02 15:32:26 +05:30
Krrish Dholakia  d4da63800e  fix(utils.py): support token counting for gpt-4-vision models  2024-01-02 14:41:42 +05:30
Krrish Dholakia  4eae0c9a0d  fix(router.py): correctly raise no model available error (https://github.com/BerriAI/litellm/issues/1289)  2024-01-01 21:22:42 +05:30
ishaan-jaff  31bdcb48af  (fix) use cloudflare optional params  2023-12-30 12:22:31 +05:30
Krrish Dholakia  7d55a563ee  fix(main.py): don't set timeout as an optional api param  2023-12-30 11:47:07 +05:30
Krrish Dholakia  e1925d0e29  fix(router.py): support retry and fallbacks for atext_completion  2023-12-30 11:19:32 +05:30
ishaan-jaff  fa4a533e91  (fix) timeout optional param  2023-12-30 11:07:52 +05:30
ishaan-jaff  d5cbef4e36  (feat) proxy - support dynamic timeout per request  2023-12-30 10:55:42 +05:30
Marmik Pandya  1faad4b0c1  add support for mistral json mode via anyscale  2023-12-29 22:26:22 +05:30
ishaan-jaff  7afc022ad3  (fix) counting streaming prompt tokens - azure  2023-12-29 16:13:52 +05:30
ishaan-jaff  4f832bce52  (fix) token_counter for tool calling  2023-12-29 15:54:03 +05:30
ishaan-jaff  806551ff99  (fix) use openai token counter for azure llms  2023-12-29 15:37:46 +05:30
ishaan-jaff  70376d3a4f  (feat) azure stream - count correct prompt tokens  2023-12-29 15:15:39 +05:30
ishaan-jaff  8475fddc78  (feat) cloudflare - add exception mapping  2023-12-29 12:31:10 +05:30
ishaan-jaff  27f8598867  (feat) add cloudflare streaming  2023-12-29 12:01:26 +05:30
ishaan-jaff  c69f4f17a5  (feat) cloudflare - add optional params  2023-12-29 11:50:09 +05:30
ishaan-jaff  5d31bea9e0  (fix) tg AI cost tracking - zero-one-ai/Yi-34B-Chat  2023-12-29 09:14:07 +05:30
ishaan-jaff  362bed6ca3  (fix) together_ai cost tracking  2023-12-28 22:11:08 +05:30
Krrish Dholakia  5a48dac83f  fix(vertex_ai.py): support function calling for gemini  2023-12-28 19:07:04 +05:30
ishaan-jaff  2a147579ec  (feat) add voyage ai embeddings  2023-12-28 17:10:15 +05:30
Krrish Dholakia  507b6bf96e  fix(utils.py): use local tiktoken copy  2023-12-28 11:22:33 +05:30
Krrish Dholakia  606de01ac0  fix(utils.py): allow text completion input to be either model or engine  2023-12-27 17:24:16 +05:30
ishaan-jaff  5f9e18c4c0  (fix) openai + stream - logprobs check  2023-12-27 16:59:56 +05:30
ishaan-jaff  c65d9a8b54  (feat) text-completion-openai, send 1 finish_reason  2023-12-27 15:45:40 +05:30
ishaan-jaff  592bcd5eea  (fix) text_completion use correct finish reason  2023-12-27 15:20:26 +05:30
Krrish Dholakia  fd5e6efb1d  fix(azure.py,-openai.py): correctly raise errors if streaming calls fail  2023-12-27 15:08:37 +05:30
Krrish Dholakia  85549c3d66  fix(google_kms.py): support enums for key management system  2023-12-27 13:19:33 +05:30
ishaan-jaff  021d7fab65  (feat) add text_completion, atext_completion CallTypes  2023-12-27 12:24:16 +05:30
ishaan-jaff  99c86bf890  (fix) streaming logprobs=None  2023-12-26 15:42:51 +05:30
Krrish Dholakia  6f695838e5  feat(utils.py): support google kms for secret management (https://github.com/BerriAI/litellm/issues/1235)  2023-12-26 15:39:40 +05:30
ishaan-jaff  9c6525e4e2  (feat) logprobs for streaming openai  2023-12-26 15:15:05 +05:30
ishaan-jaff  0428a5cc04  (fix) optional params - openai/azure. don't overwrite it  2023-12-26 14:32:59 +05:30
ishaan-jaff  c1b1d0d15d  (feat) support logprobs, top_logprobs openai  2023-12-26 14:00:42 +05:30
ishaan-jaff  109f82efee  (fix) langfuse - async logger  2023-12-26 08:49:49 +05:30
ishaan-jaff  c199d4c1fc  (feat) ollama_chat - add async streaming  2023-12-25 23:45:01 +05:30
ishaan-jaff  0f4b5a1446  (feat) add ollama_chat exception mapping  2023-12-25 23:43:14 +05:30
ishaan-jaff  35a68665d1  (feat) ollama_chat - streaming  2023-12-25 23:38:47 +05:30
ishaan-jaff  763ba913ec  utils - convert ollama_chat params  2023-12-25 23:04:17 +05:30
Krrish Dholakia  79978c44ba  refactor: add black formatting  2023-12-25 14:11:20 +05:30
Krrish Dholakia  6d73a77b01  fix(proxy_server.py): raise streaming exceptions  2023-12-25 07:18:09 +05:30
Krrish Dholakia  d1dea7c87d  fix(utils.py): log user_id to langfuse  2023-12-23 12:14:09 +05:30
Krrish Dholakia  1878392f64  bump: version 1.15.6 → 1.15.7  2023-12-23 10:03:49 +05:30
Krrish Dholakia  c8d3a609e1  fix(langsmith.py): fix langsmith streaming logging  2023-12-23 10:02:35 +05:30
Krrish Dholakia  f5ffea471d  fix(utils.py): handle ollama yielding a dict  2023-12-22 12:23:42 +05:30
Krrish Dholakia  e3d486efe2  fix(utils.py): handle 'os.environ/' being passed in as kwargs  2023-12-22 11:08:44 +05:30
Krrish Dholakia  c084f04a35  fix(router.py): add support for async image generation endpoints  2023-12-21 14:38:44 +05:30
Krrish Dholakia  8c7d62e62d  fix(utils.py): fix non_default_param pop error for ollama  2023-12-21 06:59:13 +05:30
Krrish Dholakia  77b11daf28  fix(utils.py): add support for anyscale function calling  2023-12-20 17:48:33 +05:30
Krrish Dholakia  23d0278739  feat(azure.py): add support for azure image generations endpoint  2023-12-20 16:37:21 +05:30
Krrish Dholakia  636ac9b605  feat(ollama.py): add support for ollama function calling  2023-12-20 14:59:55 +05:30