Krrish Dholakia | 1c6438c267 | fix(anthropic.py): support streaming with function calling | 2024-03-12 09:52:11 -07:00
ishaan-jaff | fb52a98e81 | (fix) support streaming for azure/instruct models | 2024-03-12 09:50:43 -07:00
Krrish Dholakia | 0806a45bd7 | fix(utils.py): support response_format for mistral ai api | 2024-03-11 10:23:41 -07:00
Krish Dholakia | 774ceb741c | Merge pull request #2426 from BerriAI/litellm_whisper_cost_tracking (feat: add cost tracking + caching for `/audio/transcription` calls) | 2024-03-09 19:12:06 -08:00
Krrish Dholakia | 78e178cec1 | fix(utils.py): fix model setting in completion cost | 2024-03-09 19:11:37 -08:00
Krrish Dholakia | 548e9a3590 | fix(utils.py): fix model name checking | 2024-03-09 18:22:26 -08:00
Krrish Dholakia | b2ce963498 | feat: add cost tracking + caching for transcription calls | 2024-03-09 15:43:38 -08:00
Krrish Dholakia | d8cf889597 | fix(bedrock.py): enable claude-3 streaming | 2024-03-09 14:02:27 -08:00
Krish Dholakia | f461352908 | Merge branch 'main' into litellm_load_balancing_transcription_endpoints | 2024-03-08 23:08:47 -08:00
Krish Dholakia | 75bc854294 | Merge pull request #2401 from BerriAI/litellm_transcription_endpoints (feat(main.py): support openai transcription endpoints) | 2024-03-08 23:07:48 -08:00
Krrish Dholakia | 0e93ad9c33 | fix(utils.py): *new* get_supported_openai_params() function; returns the supported openai params for a given model + provider | 2024-03-08 23:06:40 -08:00
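The get_supported_openai_params() helper named in the commit above maps a model + provider pair to the OpenAI-style request parameters that provider accepts. A minimal sketch of the idea follows; the provider table and function body here are illustrative assumptions, not litellm's actual implementation:

```python
# Illustrative sketch of a get_supported_openai_params-style lookup.
# The mapping below is a hypothetical subset, not litellm's real table.
SUPPORTED_PARAMS = {
    "anthropic": ["stream", "stop", "temperature", "top_p", "max_tokens", "tools"],
    "mistral": ["stream", "temperature", "top_p", "max_tokens", "response_format"],
    "openai": ["stream", "stop", "temperature", "top_p", "max_tokens", "tools",
               "response_format", "n", "presence_penalty", "frequency_penalty"],
}

def get_supported_openai_params(model: str, custom_llm_provider: str) -> list:
    """Return the OpenAI-compatible params the given provider accepts."""
    try:
        return SUPPORTED_PARAMS[custom_llm_provider]
    except KeyError:
        raise ValueError(
            f"Provider {custom_llm_provider!r} not mapped for model {model!r}"
        )
```

A caller can then drop unsupported kwargs before dispatching a request, instead of letting the provider reject them.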
Krrish Dholakia | 43a9f634c0 | fix(utils.py): add additional providers to get_supported_openai_params | 2024-03-08 23:06:40 -08:00
Krrish Dholakia | 744fe7232d | fix(utils.py): add support for anthropic params in get_supported_openai_params | 2024-03-08 23:06:40 -08:00
Krrish Dholakia | 7ab930f6ff | fix(azure.py): add pre call logging for transcription calls | 2024-03-08 22:23:21 -08:00
Krrish Dholakia | 93615682fe | feat(proxy_server.py): working /audio/transcription endpoint | 2024-03-08 18:20:27 -08:00
ishaan-jaff | 6780b4c3bf | (feat) use no-log to disable per request logging | 2024-03-08 16:56:20 -08:00
ishaan-jaff | e749521c0b | (feat) use no-log as a litellm param | 2024-03-08 16:46:38 -08:00
ishaan-jaff | feefdd631c | (feat) disable logging per request | 2024-03-08 16:25:54 -08:00
Krrish Dholakia | bdf8e2d3c7 | feat(main.py): support openai transcription endpoints; enable user to load balance between openai + azure transcription endpoints | 2024-03-08 10:25:19 -08:00
Krrish Dholakia | 69ca9cf0fa | fix(utils.py): return function name for ollama_chat function calls | 2024-03-08 08:01:10 -08:00
Krrish Dholakia | badd8cf7ef | fix(utils.py): fix google ai studio timeout error raising | 2024-03-06 21:12:04 -08:00
Krish Dholakia | ede9647e49 | Merge pull request #2377 from BerriAI/litellm_team_level_model_groups (feat(proxy_server.py): team based model aliases) | 2024-03-06 21:03:53 -08:00
Krrish Dholakia | 995c31db84 | fix(utils.py): fix get optional param embeddings | 2024-03-06 20:47:05 -08:00
ishaan-jaff | 60b2e3c7e6 | (fix) vertex_ai test_vertex_projects optional params embedding | 2024-03-06 20:33:25 -08:00
Krish Dholakia | 050a056e09 | Merge pull request #2347 from BerriAI/litellm_retry_rate_limited_requests (feat(proxy_server.py): retry if virtual key is rate limited) | 2024-03-06 19:23:11 -08:00
Krrish Dholakia | ff279ec77b | test(test_completion.py): handle gemini timeout error | 2024-03-06 19:05:39 -08:00
ishaan-jaff | 47174c106c | (fix) dict changed size during iteration | 2024-03-06 17:53:01 -08:00
Krrish Dholakia | 43c0d31ea6 | fix(utils.py): set status code for api error | 2024-03-05 21:37:59 -08:00
Krrish Dholakia | affcfdf561 | fix(utils.py): fix mistral api exception mapping | 2024-03-05 20:45:16 -08:00
Krrish Dholakia | 500b58de91 | fix(utils.py): handle dict object for chatcompletionmessagetoolcall | 2024-03-05 18:10:58 -08:00
Krrish Dholakia | a152658e49 | fix(utils.py): handle none in tool call for mistral tool calling | 2024-03-05 16:48:37 -08:00
Krrish Dholakia | 38bcc910b7 | fix: clean up print verbose statements | 2024-03-05 15:01:03 -08:00
Krrish Dholakia | 5e0cfe82a7 | fix(utils.py): fix logging | 2024-03-05 13:37:38 -08:00
Krish Dholakia | d53257b7f7 | Merge branch 'main' into litellm_claude_3_bedrock_access | 2024-03-05 07:10:45 -08:00
Krrish Dholakia | 87831588c8 | fix(utils.py): fix default message object values | 2024-03-04 21:19:03 -08:00
Ishaan Jaff | c6ea671548 | Merge branch 'main' into litellm_maintain_Claude2_support | 2024-03-04 21:14:28 -08:00
Krrish Dholakia | f0a5e0ffe9 | fix(bedrock.py): working image calls to claude 3 | 2024-03-04 18:12:47 -08:00
Krrish Dholakia | dad65ca602 | fix(bedrock.py): support anthropic messages api on bedrock (claude-3) | 2024-03-04 17:15:47 -08:00
Krrish Dholakia | 4f5f6ec812 | test(test_completion.py): add testing for anthropic vision calling | 2024-03-04 13:34:49 -08:00
ishaan-jaff | 963313412d | (feat) maintain anthropic text completion | 2024-03-04 11:16:34 -08:00
Krrish Dholakia | 1e2154317c | feat(anthropic.py): adds tool calling support | 2024-03-04 10:42:28 -08:00
Krrish Dholakia | a1ce24c5f0 | fix(huggingface_restapi.py): fix huggingface streaming error raising | 2024-03-04 09:32:41 -08:00
Ishaan Jaff | 561e7ff453 | Merge pull request #2315 from BerriAI/litellm_add_claude_3 ([FEAT]- add claude 3) | 2024-03-04 09:23:13 -08:00
Ishaan Jaff | fc34999168 | Merge pull request #2290 from ti3x/bedrock_mistral (Add support for Bedrock Mistral models) | 2024-03-04 08:42:47 -08:00
Krrish Dholakia | 8c2ac9101e | fix(utils.py): fix num retries logic | 2024-03-04 08:01:02 -08:00
ishaan-jaff | b3f738832c | (feat) streaming claude-3 | 2024-03-04 07:29:23 -08:00
ishaan-jaff | 26eea94404 | (feat) - add claude 3 | 2024-03-04 07:13:08 -08:00
Tim Xia | 7a5602e8f4 | update comments | 2024-03-02 13:34:39 -05:00
Tim Xia | 79a62564f3 | map optional params | 2024-03-02 13:25:04 -05:00
Krish Dholakia | da0ab536d3 | Merge pull request #2292 from BerriAI/litellm_mistral_streaming_error (fix(utils.py): handle mistral streaming error) | 2024-03-02 07:48:14 -08:00