Ishaan Jaff | ad47fee181 | feat add text completion config for mistral text | 2024-06-17 12:48:46 -07:00
Ishaan Jaff | 3ae05c0404 | vo - init commit adding codestral API | 2024-06-17 11:05:24 -07:00
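The two commits above add a text-completion path for Mistral's Codestral. A minimal sketch of how such a route is typically called through litellm's `text_completion` API; the provider prefix and model name below are assumptions for illustration, not confirmed by this log:

```python
import os
import litellm

os.environ["CODESTRAL_API_KEY"] = "sk-..."  # placeholder credential

# Hypothetical call against the codestral text-completion route added above.
response = litellm.text_completion(
    model="text-completion-codestral/codestral-latest",  # assumed prefix + model name
    prompt="def fibonacci(n):",
    max_tokens=64,
    temperature=0.0,
)
print(response.choices[0].text)
```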
Krrish Dholakia | b886812787 | fix(__init__.py): add gemini models to all model list | 2024-06-17 10:54:28 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/4240
Krrish Dholakia | 115adc7c30 | fix(init.py): fix imports | 2024-06-15 11:31:09 -07:00
Krrish Dholakia | 4f91205530 | refactor(utils.py): refactor Logging to it's own class. Cut down utils.py to <10k lines. | 2024-06-15 10:57:20 -07:00
    Easier debugging
    Reference: https://github.com/BerriAI/litellm/issues/4206
Krrish Dholakia | 6f715b4782 | feat(router.py): support content policy fallbacks | 2024-06-14 17:15:44 -07:00
    Closes https://github.com/BerriAI/litellm/issues/2632
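Commit 6f715b4782 adds content-policy fallbacks to the Router. A minimal sketch of wiring a fallback deployment that takes over when the primary raises a content-policy violation; deployment names and keys are placeholders, and the parameter name is inferred from the commit subject:

```python
from litellm import Router

# Two deployment groups: if "primary-model" fails with a content-policy
# violation, the router retries the request on "fallback-model".
router = Router(
    model_list=[
        {
            "model_name": "primary-model",
            "litellm_params": {"model": "azure/my-gpt-4o", "api_key": "...", "api_base": "https://..."},
        },
        {
            "model_name": "fallback-model",
            "litellm_params": {"model": "claude-3-haiku-20240307", "api_key": "..."},
        },
    ],
    # parameter name taken from the commit subject; treat as an assumption
    content_policy_fallbacks=[{"primary-model": ["fallback-model"]}],
)

resp = router.completion(
    model="primary-model",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
```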
Krrish Dholakia | 83d8711f16 | feat(__init__.py): allow setting drop_params as an env | 2024-06-13 16:00:14 -07:00
    Closes https://github.com/BerriAI/litellm/issues/4175
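Commit 83d8711f16 lets `drop_params` be controlled from the environment as well as in code. A short sketch of both; the env var name `LITELLM_DROP_PARAMS` is an assumption based on litellm's usual naming:

```python
import os
import litellm

# Option 1: module-level flag (existing behaviour).
litellm.drop_params = True

# Option 2: environment variable, per the commit above.
# The exact variable name is an assumption for illustration.
os.environ["LITELLM_DROP_PARAMS"] = "True"

# With drop_params enabled, params a provider doesn't support are dropped
# instead of raising an error.
response = litellm.completion(
    model="claude-3-haiku-20240307",
    messages=[{"role": "user", "content": "hi"}],
    frequency_penalty=0.2,  # dropped if the provider doesn't accept it
)
```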
Krish Dholakia | 05e21441a6 | Merge branch 'main' into litellm_vertex_completion_httpx | 2024-06-12 21:19:22 -07:00
Krrish Dholakia | c426d75e91 | fix(vertex_httpx.py): add function calling support to httpx route | 2024-06-12 21:11:00 -07:00
Ishaan Jaff | e128dc4e1f | feat - add azure ai studio models on litellm ui | 2024-06-12 20:28:16 -07:00
Krrish Dholakia | 3955b058ed | fix(vertex_httpx.py): support streaming via httpx client | 2024-06-12 19:55:14 -07:00
Wonseok Lee (Jack) | 776c75c1e5 | Merge branch 'main' into feat/friendliai | 2024-06-13 09:59:56 +09:00
Ishaan Jaff | 7eeef7ec1f | feat - add mistral embedding config | 2024-06-12 15:00:00 -07:00
Ishaan Jaff | f09158504b | feat - support vertex ai dimensions | 2024-06-12 09:29:51 -07:00
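Two of the commits above touch embeddings: 7eeef7ec1f adds a Mistral embedding config and f09158504b adds `dimensions` support for Vertex AI. A minimal sketch of both calls; the model names and dimension value are illustrative assumptions:

```python
import litellm

# Mistral embeddings (config added in 7eeef7ec1f); model name is illustrative.
mistral_resp = litellm.embedding(
    model="mistral/mistral-embed",
    input=["hello world"],
    api_key="sk-...",  # placeholder
)

# Vertex AI embeddings with the `dimensions` param (added in f09158504b).
vertex_resp = litellm.embedding(
    model="vertex_ai/text-embedding-004",  # assumed model name
    input=["hello world"],
    dimensions=256,
)

print(len(mistral_resp.data[0]["embedding"]), len(vertex_resp.data[0]["embedding"]))
```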
Krrish Dholakia | 54f9faac79 | fix(__init__.py): fix linting error | 2024-06-11 18:42:01 -07:00
Krish Dholakia | a53ba9b2fb | Merge pull request #4134 from BerriAI/litellm_azure_ai_route | 2024-06-11 18:24:05 -07:00
    Azure AI support all models
Krrish Dholakia | 6305d2dbcf | fix(__init__.py): add 'log_raw_request_response' flag to init | 2024-06-11 17:26:03 -07:00
Krrish Dholakia | 88e567af2c | fix(utils.py): add new 'azure_ai/' route | 2024-06-11 14:06:56 -07:00
    supports azure's openai compatible api endpoint
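Commit 88e567af2c introduces the `azure_ai/` route for Azure AI Studio's OpenAI-compatible endpoints. A minimal sketch; the deployment name, endpoint URL, and key are placeholders:

```python
import litellm

# Route a request through an Azure AI Studio OpenAI-compatible endpoint
# using the `azure_ai/` prefix added in 88e567af2c.
response = litellm.completion(
    model="azure_ai/my-mistral-large-deployment",            # placeholder deployment
    api_base="https://my-endpoint.inference.ai.azure.com",   # placeholder endpoint
    api_key="...",                                            # placeholder key
    messages=[{"role": "user", "content": "What is LiteLLM?"}],
)
print(response.choices[0].message.content)
```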
Krrish Dholakia | 7eae0ff7e3 | fix(utils.py): allow user to opt in to raw request logging to langfuse | 2024-06-11 13:35:22 -07:00
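Commit 7eae0ff7e3, together with the `log_raw_request_response` flag added in 6305d2dbcf, lets callers opt in to logging the raw request to Langfuse. A sketch of toggling the flag; the Langfuse credentials are the standard env vars and the model is a placeholder:

```python
import os
import litellm

# Standard Langfuse credentials (placeholders).
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-..."

# Opt in to raw request/response logging (flag name from commit 6305d2dbcf).
litellm.log_raw_request_response = True
litellm.success_callback = ["langfuse"]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
```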
wslee | 18cc703aa2 | change friendli_ai -> friendliai | 2024-06-11 16:17:30 +09:00
wslee | fe8d59f5eb | add friendli_ai provider | 2024-06-10 17:27:15 +09:00
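fe8d59f5eb adds a FriendliAI provider, and 18cc703aa2 renames the prefix from `friendli_ai` to `friendliai`. A minimal sketch using the renamed prefix; the model name and env var are assumptions:

```python
import os
import litellm

# FriendliAI provider (added in fe8d59f5eb, prefix renamed in 18cc703aa2).
# The credential env var and model name are assumptions for illustration.
os.environ["FRIENDLI_TOKEN"] = "flp_..."  # placeholder

response = litellm.completion(
    model="friendliai/meta-llama-3-8b-instruct",
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)
```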
Krrish Dholakia | b26c3c7d22 | fix(cost_calculator.py): fixes tgai unmapped model pricing | 2024-06-08 19:43:57 -07:00
    Fixes error where tgai helper function returned None. Enforces stronger type hints, refactors code, adds more unit testing.
Ishaan Jaff | d2857fc24c | Merge branch 'main' into litellm_redact_messages_slack_alerting | 2024-06-07 12:43:53 -07:00
Krish Dholakia | 26993c067e | Merge branch 'main' into litellm_bedrock_converse_api | 2024-06-07 08:49:52 -07:00
Krrish Dholakia | 51ba5652a0 | feat(bedrock_httpx.py): working bedrock converse api streaming | 2024-06-06 22:13:21 -07:00
Krrish Dholakia | 6e9bca59b0 | fix(utils.py): fix exception mapping for azure internal server error | 2024-06-06 17:12:30 -07:00
Ishaan Jaff | 1e8429bb20 | feat - redact messages from slack alerting | 2024-06-06 10:38:15 -07:00
Krrish Dholakia | a76a9b7d11 | feat(bedrock_httpx.py): add support for bedrock converse api | 2024-06-05 21:20:36 -07:00
    closes https://github.com/BerriAI/litellm/issues/4000
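a76a9b7d11 and 51ba5652a0 move Bedrock chat traffic onto the Converse API, including streaming. A minimal sketch of a streamed Bedrock call that would exercise this path; the model ID and AWS credentials are placeholders:

```python
import os
import litellm

# Placeholder AWS credentials / region.
os.environ["AWS_ACCESS_KEY_ID"] = "..."
os.environ["AWS_SECRET_ACCESS_KEY"] = "..."
os.environ["AWS_REGION_NAME"] = "us-east-1"

# Streaming call against a Bedrock model; with the commits above, requests
# like this are served over Bedrock's Converse API.
stream = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
    messages=[{"role": "user", "content": "Write a haiku about rivers."}],
    stream=True,
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```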
Ishaan Jaff | 5bd658493f | feat - working audit logs for create, update delete team | 2024-06-05 17:50:27 -07:00
Krrish Dholakia | 52a2f5150c | fix(utils.py): fix cost calculation for openai-compatible streaming object | 2024-06-04 10:36:25 -07:00
Krish Dholakia | 5ee3b0f30f | Merge pull request #3996 from BerriAI/litellm_azure_assistants_api_support | 2024-06-03 21:05:03 -07:00
    feat(assistants/main.py): Azure Assistants API support
Krrish Dholakia | 7163bce37b | feat(assistants/main.py): Closes https://github.com/BerriAI/litellm/issues/3993 | 2024-06-03 18:47:05 -07:00
Krrish Dholakia | 872cd2d8a0 | fix(langfuse.py): log litellm response cost as part of langfuse metadata | 2024-06-03 12:58:30 -07:00
Ishaan Jaff | fb49d036fb | Merge pull request #3962 from BerriAI/litellm_return_num_rets_max_exceptions | 2024-06-01 17:48:38 -07:00
    [Feat] return `num_retries` and `max_retries` in exceptions
Ishaan Jaff | 286d42a881 | feat - add num retries and max retries in exception | 2024-06-01 16:53:00 -07:00
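286d42a881 (merged via PR #3962) attaches `num_retries` and `max_retries` to the exceptions litellm raises. A sketch of reading them defensively from a raised exception; attribute access is guarded because the log doesn't show which exception classes carry them:

```python
import litellm

try:
    litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "hello"}],
        num_retries=2,          # ask litellm to retry transient failures
        api_key="invalid-key",  # force an error for illustration
    )
except Exception as e:
    # Per the commit above, exceptions should now carry retry metadata.
    # getattr() is used since not every exception type may include it.
    print("num_retries:", getattr(e, "num_retries", None))
    print("max_retries:", getattr(e, "max_retries", None))
```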
Krrish Dholakia | a16a1c407a | fix(http_handler.py): allow setting ca bundle path | 2024-06-01 14:48:53 -07:00
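a16a1c407a allows a custom CA bundle for litellm's httpx-based handlers. A sketch assuming the module-level `ssl_verify` setting accepts a bundle path; treat the exact knob as an assumption:

```python
import litellm

# Point litellm's HTTP client at a corporate CA bundle instead of the
# system default. The setting name is an assumption for illustration.
litellm.ssl_verify = "/etc/ssl/certs/corp-ca-bundle.pem"

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
```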
Krish Dholakia | 8375e9621c | Merge pull request #3954 from BerriAI/litellm_simple_request_prioritization | 2024-05-31 23:29:09 -07:00
    feat(scheduler.py): add request prioritization scheduler
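PR #3954 adds a request-prioritization scheduler to the Router. A minimal sketch of queueing two async requests with different priorities; the `priority` kwarg (lower value = served sooner) and the deployment entry are assumptions for illustration:

```python
import asyncio
from litellm import Router

router = Router(
    model_list=[
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {"model": "gpt-3.5-turbo", "api_key": "sk-..."},  # placeholder
        }
    ]
)

async def main() -> None:
    # Lower priority value = more urgent when requests queue up (assumed semantics).
    urgent = router.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "urgent request"}],
        priority=0,
    )
    background = router.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "background request"}],
        priority=10,
    )
    results = await asyncio.gather(urgent, background)
    print([r.choices[0].message.content for r in results])

asyncio.run(main())
```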
Krish Dholakia | e7ff3adc26 | Merge pull request #3944 from BerriAI/litellm_fix_parallel_streaming | 2024-05-31 21:42:37 -07:00
    fix: fix streaming with httpx client
Krrish Dholakia | f8d4be710e | docs(scheduler.md): add request prioritization to docs | 2024-05-31 19:35:47 -07:00
Yulong Liu | 6a004b9211 | add document | 2024-05-31 18:55:22 -07:00
Krrish Dholakia | 93c3635b64 | fix: fix streaming with httpx client | 2024-05-31 10:55:18 -07:00
    prevent overwriting streams in parallel streaming calls
Ishaan Jaff | f52bf5976b | fix - vertex ai cache clients | 2024-05-30 21:22:32 -07:00
Krrish Dholakia | a67cbf47f6 | feat(main.py): support openai tts endpoint | 2024-05-30 14:28:28 -07:00
    Closes https://github.com/BerriAI/litellm/issues/3094
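a67cbf47f6 adds the OpenAI text-to-speech endpoint. A minimal sketch using `litellm.speech`; the voice, model, and output path are placeholders:

```python
import os
import litellm

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder

# OpenAI TTS via litellm (endpoint support added in a67cbf47f6).
response = litellm.speech(
    model="openai/tts-1",
    voice="alloy",
    input="LiteLLM now speaks.",
)
# The response wraps the raw audio; writing it to disk is one option.
response.stream_to_file("speech.mp3")
```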
Krrish Dholakia | 3167bee25a | fix(proxy_cli.py): enable json logging via litellm_settings param on config | 2024-05-29 21:41:20 -07:00
    allows user to enable json logs without needing to figure out env variables
Ishaan Jaff | d5dbf084ed | feat - import batches in __init__ | 2024-05-28 15:35:11 -07:00
Krrish Dholakia | f0f853b941 | fix(utils.py): support deepinfra optional params | 2024-05-27 09:16:56 -07:00
    Fixes https://github.com/BerriAI/litellm/issues/3855
Krrish Dholakia | f04e4b921b | feat(ui/model_dashboard.tsx): add databricks models via admin ui | 2024-05-23 20:28:54 -07:00
Krish Dholakia | c14584722e | Merge pull request #3808 from BerriAI/litellm_databricks_api | 2024-05-23 19:23:19 -07:00
    feat(databricks.py): adds databricks support - completion, async, streaming
Krrish Dholakia | 43353c28b3 | feat(databricks.py): add embedding model support | 2024-05-23 18:22:03 -07:00
Krrish Dholakia | d2229dcd21 | feat(databricks.py): adds databricks support - completion, async, streaming | 2024-05-23 16:29:46 -07:00
    Closes https://github.com/BerriAI/litellm/issues/2160
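The last few commits add Databricks support for completion, streaming, and embeddings (d2229dcd21, 43353c28b3). A minimal sketch of both call types; the workspace credentials and serving-endpoint names are placeholders:

```python
import os
import litellm

# Placeholder Databricks workspace credentials.
os.environ["DATABRICKS_API_KEY"] = "dapi-..."
os.environ["DATABRICKS_API_BASE"] = "https://my-workspace.cloud.databricks.com/serving-endpoints"

# Chat completion against a Databricks serving endpoint (d2229dcd21).
chat = litellm.completion(
    model="databricks/databricks-dbrx-instruct",  # placeholder endpoint name
    messages=[{"role": "user", "content": "hello"}],
)

# Embeddings against a Databricks endpoint (43353c28b3).
emb = litellm.embedding(
    model="databricks/databricks-bge-large-en",   # placeholder endpoint name
    input=["hello world"],
)
print(chat.choices[0].message.content, len(emb.data[0]["embedding"]))
```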