Author | Commit | Message | Date
Krrish Dholakia | 7de77ab677 | fix(init.py): fix imports | 2024-06-15 11:31:09 -07:00
Krrish Dholakia | 8f07399c57 | fix(types/utils.py): fix import | 2024-06-15 11:04:15 -07:00
Krrish Dholakia | 9d7f5d503c | refactor(utils.py): refactor Logging into its own class; cut utils.py down to <10k lines for easier debugging. Reference: https://github.com/BerriAI/litellm/issues/4206 | 2024-06-15 10:57:20 -07:00
Krrish Dholakia | ab4b1d931b | fix(vertex_httpx.py): support json schema | 2024-06-12 21:46:43 -07:00
Krrish Dholakia | e60b0e96e4 | fix(vertex_httpx.py): add function calling support to httpx route | 2024-06-12 21:11:00 -07:00
Krrish Dholakia | 1dac2aa59f | fix(vertex_httpx.py): support streaming via httpx client | 2024-06-12 19:55:14 -07:00
Krrish Dholakia | 29169b3039 | feat(vertex_httpx.py): Moving to call vertex ai via httpx (instead of their sdk). Allows us to support all their api updates. | 2024-06-12 16:47:00 -07:00
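The four vertex_httpx.py entries above (ab4b1d931b, e60b0e96e4, 1dac2aa59f, 29169b3039) move Vertex AI traffic onto litellm's own httpx client and add JSON-schema, function-calling, and streaming support along the way. A minimal sketch of exercising that path through litellm.completion, assuming Vertex credentials are already configured; the model name and the response_format shape are illustrative, not taken from these commits:

```python
import litellm

# JSON-mode request against Gemini on Vertex AI (json schema support, ab4b1d931b).
# "vertex_ai/gemini-1.5-pro" is an assumed example model id.
response = litellm.completion(
    model="vertex_ai/gemini-1.5-pro",
    messages=[{"role": "user", "content": "Return a JSON object describing a user profile."}],
    response_format={"type": "json_object"},
)
print(response.choices[0].message.content)

# Streaming now goes through the httpx client (1dac2aa59f): pass stream=True
# and iterate over the OpenAI-style chunks.
for chunk in litellm.completion(
    model="vertex_ai/gemini-1.5-pro",
    messages=[{"role": "user", "content": "Say hello."}],
    stream=True,
):
    print(chunk.choices[0].delta.content or "", end="")
```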
Krrish Dholakia | 2d95eaa5bc | fix(bedrock_httpx.py): fix tool calling for anthropic bedrock calls w/ streaming. Fixes https://github.com/BerriAI/litellm/issues/4091 | 2024-06-10 14:20:25 -07:00
Krrish Dholakia | 58cce8a922 | fix(types/router.py): ModelGroupInfo - handle mode being None and supported_openai_params not being a list | 2024-06-08 20:13:45 -07:00
Krrish Dholakia | 22b51c5af4 | fix(litellm_pre_call_utils.py): add support for key-level caching params | 2024-06-07 22:09:14 -07:00
Ishaan Jaff | 37cc6150e3 | linting fix | 2024-06-07 17:23:05 -07:00
Ishaan Jaff | 92841dfe1b | Merge branch 'main' into litellm_security_fix | 2024-06-07 16:52:25 -07:00
Krrish Dholakia | de98bd939c | fix(test_custom_callbacks_input.py): unit tests for 'turn_off_message_logging'; ensure no raw request is logged either | 2024-06-07 15:39:15 -07:00
Ishaan Jaff | 6ffe5e75ba | use MappingProxyType for now; will fix python 3.8 install later | 2024-06-07 14:31:31 -07:00
Ishaan Jaff | 0cd836cf46 | fix type error on python 3.8 | 2024-06-07 14:23:06 -07:00
Ishaan Jaff | 860c9b52b6 | Merge branch 'main' into litellm_svc_logger | 2024-06-07 14:01:54 -07:00
Krish Dholakia | 8f1e3aab7b | Merge pull request #4055 from UsableMachines/additional-gemini-types: fix to support all file types supported by Gemini | 2024-06-07 13:24:06 -07:00
Krrish Dholakia | 672dcf0c6f | fix(factory.py): handle bedrock claude image URLs | 2024-06-07 10:04:03 -07:00
Krrish Dholakia | c41b60f6bf | feat(bedrock_httpx.py): working bedrock converse api streaming | 2024-06-06 22:13:21 -07:00
Ishaan Jaff | c867f88c57 | fix - add new types for ServiceLoggerPayload | 2024-06-06 22:06:28 -07:00
Krrish Dholakia | f8b5aa3df6 | fix(bedrock_httpx.py): working claude 3 function calling | 2024-06-06 20:12:41 -07:00
nick-rackauckas | 93bf678026 | Comment | 2024-06-06 16:35:39 -07:00
nick-rackauckas | b4eba4bddd | Fix to work with all supported Gemini file types | 2024-06-06 16:15:01 -07:00
Krrish Dholakia | 96b556f385 | feat(bedrock_httpx.py): add support for bedrock converse api. Closes https://github.com/BerriAI/litellm/issues/4000 | 2024-06-05 21:20:36 -07:00
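The bedrock_httpx.py entries (96b556f385, f8b5aa3df6, c41b60f6bf, 2d95eaa5bc) bring the Bedrock Converse API into litellm with Claude 3 function calling and streaming. A hedged sketch of what that looks like from the caller's side; the model id, tool definition, and AWS credential setup are assumptions for illustration:

```python
import litellm

# Example tool definition in the OpenAI tools format, which litellm translates
# for the Bedrock Converse API.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

# Streaming tool-call request against Claude 3 on Bedrock (assumed model id);
# AWS credentials are expected to be configured in the environment.
response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "What's the weather in Boston?"}],
    tools=tools,
    stream=True,
)
for chunk in response:
    delta = chunk.choices[0].delta
    if delta.content:
        print(delta.content, end="")
```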
Krrish Dholakia | c1f987e5d3 | fix(types/router.py): add 'drop_params' as a litellm param to router types | 2024-06-05 09:12:45 -07:00
Krrish Dholakia | 20cb525a5c | feat(assistants/main.py): add assistants api streaming support | 2024-06-04 16:30:35 -07:00
Krish Dholakia | 127d1457de | Merge pull request #3996 from BerriAI/litellm_azure_assistants_api_support: feat(assistants/main.py): Azure Assistants API support | 2024-06-03 21:05:03 -07:00
Krrish Dholakia | a2ba63955a | feat(assistants/main.py): Closes https://github.com/BerriAI/litellm/issues/3993 | 2024-06-03 18:47:05 -07:00
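The assistants entries above (a2ba63955a, 127d1457de, 20cb525a5c) add an OpenAI/Azure Assistants API surface to the SDK. A rough sketch of how the helpers might be driven; the helper names and arguments follow the OpenAI assistants shape and are assumptions, not verified against these commits:

```python
from litellm import get_assistants, create_thread, run_thread

provider = "openai"  # assumed; "azure" should work per the Azure Assistants PR

# List existing assistants, start a thread, then run it against the first assistant.
assistants = get_assistants(custom_llm_provider=provider)
thread = create_thread(
    custom_llm_provider=provider,
    messages=[{"role": "user", "content": "Summarize yesterday's standup notes."}],
)
run = run_thread(
    custom_llm_provider=provider,
    thread_id=thread.id,
    assistant_id=assistants.data[0].id,
)
print(run.status)
```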
Ishaan Jaff | fb9a174462 | feat - set allowed fails policy | 2024-06-01 17:39:44 -07:00
Ishaan Jaff | a11175c05b | feat - set custom AllowedFailsPolicy | 2024-06-01 17:26:21 -07:00
Krish Dholakia | 1529f665cc | Merge pull request #3954 from BerriAI/litellm_simple_request_prioritization: feat(scheduler.py): add request prioritization scheduler | 2024-05-31 23:29:09 -07:00
Krrish Dholakia | 8ff137bce3 | feat(scheduler.py): add request prioritization scheduler; allow user to set priority for a request | 2024-05-31 18:51:13 -07:00
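8ff137bce3 introduces a scheduler so individual requests can carry a priority. A hypothetical sketch of using it through the Router; the priority kwarg and its "lower value = more urgent" convention are assumptions about how the scheduler is exposed, and the model_list is a placeholder:

```python
import asyncio
from litellm import Router

# Placeholder single-deployment router config.
router = Router(
    model_list=[
        {"model_name": "gpt-3.5-turbo", "litellm_params": {"model": "gpt-3.5-turbo"}}
    ]
)

async def main():
    # Assumed interface: priority=0 is served ahead of requests with higher numbers.
    response = await router.acompletion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "ping"}],
        priority=0,
    )
    print(response.choices[0].message.content)

asyncio.run(main())
```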
Krish Dholakia | c049b6b4af | Merge pull request #3936 from BerriAI/litellm_assistants_api_proxy: feat(proxy_server.py): add assistants api endpoints to proxy server | 2024-05-31 18:43:22 -07:00
Krrish Dholakia | 2fdf4a7bb4 | feat(proxy_server.py): add assistants api endpoints to proxy server | 2024-05-30 22:44:43 -07:00
lj | f1fe41db74 | Merge branch 'main' into fix-pydantic-warnings-again | 2024-05-31 11:35:42 +08:00
Krrish Dholakia | eb159b64e1 | fix(openai.py): fix openai response for /audio/speech endpoint | 2024-05-30 16:41:06 -07:00
Krrish Dholakia | 1e89a1f56e | feat(main.py): support openai tts endpoint. Closes https://github.com/BerriAI/litellm/issues/3094 | 2024-05-30 14:28:28 -07:00
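1e89a1f56e and eb159b64e1 wire OpenAI's /audio/speech endpoint into litellm. A minimal sketch, assuming OPENAI_API_KEY is set and that the speech() helper mirrors the OpenAI signature; the voice and output path are illustrative:

```python
from pathlib import Path
import litellm

# Generate speech with OpenAI's TTS model and stream the audio to a local file.
response = litellm.speech(
    model="openai/tts-1",
    voice="alloy",
    input="Hello from litellm",
)
response.stream_to_file(Path("speech.mp3"))
```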
Krrish Dholakia | a4dae8e9f1 | docs(customers.md): add customer cost tracking to docs | 2024-05-29 14:55:33 -07:00
Ishaan Jaff | cd4a3627e8 | feat - add afile_content, file_content | 2024-05-28 20:58:22 -07:00
Ishaan Jaff | 9daf02e977 | fix python 3.8 error | 2024-05-28 17:21:59 -07:00
Ishaan Jaff | fc4ca265b8 | working create_batch | 2024-05-28 15:45:23 -07:00
Ishaan Jaff | 4dc7bfebd4 | feat - import batches in __init__ | 2024-05-28 15:35:11 -07:00
Ishaan Jaff | f83c81a2d8 | feat - add batches types | 2024-05-28 12:53:46 -07:00
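The batches entries above (f83c81a2d8 through cd4a3627e8) add create_batch, file_content, and afile_content to the SDK. A hedged sketch of the flow; the argument names, and the create_file helper used to upload the input file, follow the OpenAI batches API and are assumptions, not verified against these commits:

```python
import litellm

# Upload a .jsonl file of chat-completion requests for batch processing.
file_obj = litellm.create_file(
    file=open("batch_requests.jsonl", "rb"),
    purpose="batch",
    custom_llm_provider="openai",
)

# Start the batch against the chat completions endpoint.
batch = litellm.create_batch(
    completion_window="24h",
    endpoint="/v1/chat/completions",
    input_file_id=file_obj.id,
    custom_llm_provider="openai",
)
print(batch.id)

# Read back the uploaded file's contents (file_content / afile_content).
content = litellm.file_content(file_id=file_obj.id, custom_llm_provider="openai")
```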
Ishaan Jaff | 5fed67dcc3 | Merge pull request #3868 from BerriAI/litellm_show_updated_created_models: [Feat] Show Created at, Created by on `Models` Page | 2024-05-27 16:32:29 -07:00
Ishaan Jaff | d7043baf6d | router - include updated at and created at in model info | 2024-05-27 15:53:16 -07:00
Krrish Dholakia | 23b28601b7 | fix(get_model_group_info): return a default value for unmapped model groups, so the model hub can return all model groups | 2024-05-27 13:53:01 -07:00
Ishaan Jaff | 69ea7d57fb | feat - show openai params on model hub ui | 2024-05-27 08:49:51 -07:00
Krrish Dholakia | 8e9a3fef81 | feat(proxy_server.py): expose new /model_group/info endpoint; returns model-group level info on supported params, max tokens, pricing, etc. | 2024-05-26 14:07:35 -07:00
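8e9a3fef81 exposes a /model_group/info endpoint on the proxy that reports per-model-group supported params, max tokens, and pricing. An illustrative client call against a locally running proxy; the URL and bearer key are placeholders:

```python
import httpx

# Query the proxy's model-group metadata endpoint (placeholder URL and key).
resp = httpx.get(
    "http://0.0.0.0:4000/model_group/info",
    headers={"Authorization": "Bearer sk-1234"},
)
print(resp.json())
```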
Krrish Dholakia | c50074a0b7 | feat(ui/model_dashboard.tsx): add databricks models via admin ui | 2024-05-23 20:28:54 -07:00
Krrish Dholakia | 143a44823a | feat(databricks.py): adds databricks support - completion, async, streaming. Closes https://github.com/BerriAI/litellm/issues/2160 | 2024-05-23 16:29:46 -07:00
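143a44823a adds a Databricks provider with sync, async, and streaming completion. A minimal sketch, assuming the provider prefix is "databricks/" and that DATABRICKS_API_KEY / DATABRICKS_API_BASE are set; the model name is illustrative:

```python
import litellm

# Streaming completion against a Databricks-hosted model (assumed model id).
response = litellm.completion(
    model="databricks/databricks-dbrx-instruct",
    messages=[{"role": "user", "content": "Hello from litellm"}],
    stream=True,
)
for chunk in response:
    print(chunk.choices[0].delta.content or "", end="")
```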