Ishaan Jaff
2cb65b450d
bump: version 1.49.1 → 1.49.2
2024-10-12 16:02:17 +05:30
Ishaan Jaff
80ecf0829c
(fix) provider wildcard routing - when models specified without provider prefix ( #6173 )
...
* fix wildcard routing scenario
* fix pattern matching hits
2024-10-12 16:01:21 +05:30
Ishaan Jaff
b032e898c2
(fix) batch_completion fails with bedrock due to extraneous [max_workers] key ( #6176 )
...
* fix batch_completion
* fix import batch completion
* fix batch completion usage
2024-10-12 14:10:24 +05:30
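A minimal sketch of the batch_completion call the fix above targets, assuming litellm's batch_completion helper; the bedrock model id and prompts are placeholders, and the max_workers value is illustrative.

```python
import litellm

# Sketch only: batch_completion fans a list of message lists out to the same model.
# The bedrock model id is a placeholder; swap in a deployment you have access to.
responses = litellm.batch_completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[
        [{"role": "user", "content": "Summarize batching in one sentence."}],
        [{"role": "user", "content": "Name one benefit of async logging."}],
    ],
    max_workers=4,  # the fix above keeps this key from leaking into the provider call
)
for r in responses:
    print(r.choices[0].message.content)
```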
Krish Dholakia
11f9df923a
LiteLLM Minor Fixes & Improvements (10/10/2024) ( #6158 )
...
* refactor(vertex_ai_partner_models/anthropic): refactor anthropic to use partner model logic
* fix(vertex_ai/): support passing custom api base to partner models
Fixes https://github.com/BerriAI/litellm/issues/4317
* fix(proxy_server.py): Fix prometheus premium user check logic
* docs(prometheus.md): update quick start docs
* fix(custom_llm.py): support passing dynamic api key + api base
* fix(realtime_api/main.py): Add request/response logging for realtime api endpoints
Closes https://github.com/BerriAI/litellm/issues/6081
* feat(openai/realtime): add openai realtime api logging
Closes https://github.com/BerriAI/litellm/issues/6081
* fix(realtime_streaming.py): fix linting errors
* fix(realtime_streaming.py): fix linting errors
* fix: fix linting errors
* fix pattern match router
* Add literalai in the sidebar observability category (#6163 )
* fix: add literalai in the sidebar
* fix: typo
* update (#6160 )
* Feat: Add Langtrace integration (#5341 )
* Feat: Add Langtrace integration
* add langtrace service name
* fix timestamps for traces
* add tests
* Discard Callback + use existing otel logger
* cleanup
* remove print statements
* remove callback
* add docs
* docs
* add logging docs
* format logging
* remove emoji and add litellm proxy example
* format logging
* format `logging.md`
* add langtrace docs to logging.md
* sync conflict
* docs fix
* (perf) move s3 logging to Batch logging + async [94% faster perf under 100 RPS on 1 litellm instance] (#6165 )
* fix move s3 to use customLogger
* add basic s3 logging test
* add s3 to custom logger compatible
* use batch logger for s3
* s3 set flush interval and batch size
* fix s3 logging
* add notes on s3 logging
* fix s3 logging
* add basic s3 logging test
* fix s3 type errors
* add test for sync logging on s3
* fix: fix to debug log
---------
Co-authored-by: Ishaan Jaff <ishaanjaffer0324@gmail.com>
Co-authored-by: Willy Douhard <willy.douhard@gmail.com>
Co-authored-by: yujonglee <yujonglee.dev@gmail.com>
Co-authored-by: Ali Waleed <ali@scale3labs.com>
2024-10-11 23:04:36 -07:00
Ishaan Jaff
9db4ccca9f
add azure/gpt-4o-2024-05-13 ( #6174 )
2024-10-12 10:47:45 +05:30
Ishaan Jaff
91ecb36277
Revert "(perf) move s3 logging to Batch logging + async [94% faster perf under 100 RPS on 1 litellm instance] ( #6165 )"
...
This reverts commit 2a5624af47.
2024-10-12 07:08:30 +05:30
Ishaan Jaff
2a5624af47
(perf) move s3 logging to Batch logging + async [94% faster perf under 100 RPS on 1 litellm instance] ( #6165 )
...
* fix move s3 to use customLogger
* add basic s3 logging test
* add s3 to custom logger compatible
* use batch logger for s3
* s3 set flush interval and batch size
* fix s3 logging
* add notes on s3 logging
* fix s3 logging
* add basic s3 logging test
* fix s3 type errors
* add test for sync logging on s3
2024-10-11 19:49:03 +05:30
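A sketch of enabling the batched S3 logging described above, assuming the "s3" success callback and the s3_callback_params keys from litellm's S3 logging docs; the bucket and region values are placeholders.

```python
import litellm

# Sketch: route success logs to S3 via the batch logger added in #6165.
# The "s3" callback name and s3_callback_params keys are assumptions drawn from
# litellm's S3 logging docs; bucket/region are placeholders.
litellm.success_callback = ["s3"]
litellm.s3_callback_params = {
    "s3_bucket_name": "my-litellm-logs",  # hypothetical bucket
    "s3_region_name": "us-west-2",
}

resp = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hi"}],
)
```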
Ishaan Jaff
4e1c892dfc
docs fix
2024-10-11 19:32:59 +05:30
Ali Waleed
7ec414a3cf
Feat: Add Langtrace integration ( #5341 )
...
* Feat: Add Langtrace integration
* add langtrace service name
* fix timestamps for traces
* add tests
* Discard Callback + use existing otel logger
* cleanup
* remove print statements
* remove callback
* add docs
* docs
* add logging docs
* format logging
* remove emoji and add litellm proxy example
* format logging
* format `logging.md`
* add langtrace docs to logging.md
* sync conflict
2024-10-11 19:19:53 +05:30
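Per the commit notes, the Langtrace integration reuses litellm's existing OTel logger rather than a dedicated callback. A rough sketch under that assumption; the env var names, "otel" callback, and Langtrace endpoint are all assumptions/placeholders, not confirmed values.

```python
import os
import litellm

# Sketch: point litellm's OTel exporter at a Langtrace collector.
# Env var names and the "otel" callback are assumptions based on the commit notes;
# the endpoint and API key are placeholders.
os.environ["OTEL_EXPORTER"] = "otlp_http"
os.environ["OTEL_ENDPOINT"] = "https://langtrace.example.com/api/trace"  # hypothetical endpoint
os.environ["OTEL_HEADERS"] = "api_key=<your-langtrace-api-key>"

litellm.callbacks = ["otel"]

litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hello"}],
)
```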
yujonglee
42174fde4e
update ( #6160 )
2024-10-11 19:18:56 +05:30
Willy Douhard
8b00d2a25f
Add literalai in the sidebar observability category ( #6163 )
...
* fix: add literalai in the sidebar
* fix: typo
2024-10-11 19:18:47 +05:30
Ishaan Jaff
d28c6b390c
fix pattern match router
2024-10-11 12:12:57 +05:30
Ishaan Jaff
63c63612c2
bump: version 1.49.0 → 1.49.1
2024-10-11 00:14:03 +05:30
Ishaan Jaff
98b1abbff8
drop imghdr ( #5736 ) ( #6153 )
...
Co-authored-by: Leon Derczynski <leonderczynski@gmail.com>
2024-10-10 19:35:48 +05:30
Ishaan Jaff
1a9d9e1cad
fix typing on opik.py
2024-10-10 18:46:07 +05:30
Ishaan Jaff
aadbbe9841
fix _opik logger
2024-10-10 18:43:39 +05:30
Ishaan Jaff
fbf756806e
fix opik types
2024-10-10 18:37:53 +05:30
Jacques Verré
4064bfc6dd
[Feat] Observability integration - Opik by Comet ( #6062 )
...
* Added Opik logging and evaluation
* Updated doc examples
* Default tags should be [] in case of appending
* WIP
* Work in progress
* Opik integration
* Opik integration
* Revert changes on litellm_logging.py
* Updated Opik integration for synchronous API calls
* Updated Opik documentation
---------
Co-authored-by: Douglas Blank <doug@comet.com>
Co-authored-by: Doug Blank <doug.blank@gmail.com>
2024-10-10 18:27:50 +05:30
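A sketch of turning on the Opik (Comet) observability integration added above; the "opik" callback name and the OPIK_* env var names are assumptions, and the key/workspace values are placeholders.

```python
import os
import litellm

# Sketch: enable Opik logging as a success callback.
# Callback name and env vars are assumptions; values are placeholders.
os.environ["OPIK_API_KEY"] = "<your-opik-api-key>"
os.environ["OPIK_WORKSPACE"] = "<your-workspace>"

litellm.success_callback = ["opik"]

litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is Opik?"}],
)
```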
Ishaan Jaff
89506053a4
(feat) use regex pattern matching for wildcard routing ( #6150 )
...
* use pattern matching for llm deployments
* code quality fix
* fix linting
* add types to PatternMatchRouter
* docs add example config for regex patterns
2024-10-10 18:24:16 +05:30
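A minimal sketch of the wildcard (pattern-matched) routing this commit adds: one wildcard deployment entry lets any model requested under that prefix resolve through the Router. Model names below are placeholders.

```python
from litellm import Router

# Sketch: a single "openai/*" wildcard entry matches any model requested with
# that prefix and forwards the requested model name as-is.
router = Router(
    model_list=[
        {
            "model_name": "openai/*",                 # wildcard pattern
            "litellm_params": {"model": "openai/*"},  # pass the requested model through
        }
    ]
)

resp = router.completion(
    model="openai/gpt-4o-mini",  # matched by the pattern above
    messages=[{"role": "user", "content": "hi"}],
)
```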
Krish Dholakia
6005450c8f
LiteLLM Minor Fixes & Improvements (10/09/2024) ( #6139 )
...
* fix(utils.py): don't return 'none' response headers
Fixes https://github.com/BerriAI/litellm/issues/6123
* fix(vertex_and_google_ai_studio_gemini.py): support parsing out additional properties and strict value for tool calls
Fixes https://github.com/BerriAI/litellm/issues/6136
* fix(cost_calculator.py): set default character value to none
Fixes https://github.com/BerriAI/litellm/issues/6133#issuecomment-2403290196
* fix(google.py): fix cost per token / cost per char conversion
Fixes https://github.com/BerriAI/litellm/issues/6133#issuecomment-2403370287
* build(model_prices_and_context_window.json): update gemini pricing
Fixes https://github.com/BerriAI/litellm/issues/6133
* build(model_prices_and_context_window.json): update gemini pricing
* fix(litellm_logging.py): fix streaming caching logging when 'turn_off_message_logging' enabled
Stores unredacted response in cache
* build(model_prices_and_context_window.json): update gemini-1.5-flash pricing
* fix(cost_calculator.py): fix default prompt_character count logic
Fixes error in gemini cost calculation
* fix(cost_calculator.py): fix cost calc for tts models
2024-10-10 00:42:11 -07:00
Krrish Dholakia
60baa65e0e
docs(configs.md): add litellm config / s3 bucket object info in configs.md
2024-10-09 09:07:43 -07:00
Ishaan Jaff
b35da5014b
doc onboarding orgs
2024-10-09 19:11:36 +05:30
Ishaan Jaff
5da6863804
docs rbac
2024-10-09 16:46:26 +05:30
Ishaan Jaff
399f50d558
fix rbac doc
2024-10-09 16:44:46 +05:30
Ishaan Jaff
fa1451af90
ui new build
2024-10-09 16:04:49 +05:30
Ishaan Jaff
74ae7deee3
ui fixes for default team ( #6134 )
2024-10-09 16:02:08 +05:30
Ishaan Jaff
4b4bb9296f
bump: version 1.48.20 → 1.49.0
2024-10-09 15:45:39 +05:30
Ishaan Jaff
005846316d
fix get_all_team_memberships
2024-10-09 15:43:32 +05:30
Ishaan Jaff
54d8d46a3b
remove unused file from root
2024-10-09 15:28:36 +05:30
Ishaan Jaff
0e83a68a69
doc - move rbac under auth
2024-10-09 15:27:32 +05:30
Ishaan Jaff
8a9bb51f4e
fix schema.prisma change
2024-10-09 15:25:27 +05:30
Ishaan Jaff
a0bebc3413
fix literal ai typing errors
2024-10-09 15:23:39 +05:30
Ishaan Jaff
1fd437e263
(feat proxy) [beta] add support for organization role based access controls ( #6112 )
...
* track LiteLLM_OrganizationMembership
* add add_internal_user_to_organization
* add org membership to schema
* read organization membership when reading user info in auth checks
* add check for valid organization_id
* add test for test_create_new_user_in_organization
* test test_create_new_user_in_organization
* add new ADMIN role
* add test for org admins creating teams
* add test for test_org_admin_create_user_permissions
* test_org_admin_create_user_team_wrong_org_permissions
* test_org_admin_create_user_team_wrong_org_permissions
* fix organization_role_based_access_check
* fix getting user members
* fix TeamBase
* fix types used for user role
* fix type checks
* sync prisma schema
* docs - organization admins
* fix use organization_endpoints for /organization management
* add types for org member endpoints
* fix role name for org admin
* add type for member add response
* add organization/member_add
* add error handling for adding members to an org
* add nice doc string for organization/member_add
* fix test_create_new_user_in_organization
* linting fix
* use simple route changes
* fix types
* add organization member roles
* add org admin auth checks
* add auth checks for orgs
* test for creating teams as org admin
* simplify org id usage
* fix typo
* test test_org_admin_create_user_team_wrong_org_permissions
* fix type check issue
* code quality fix
* fix schema.prisma
2024-10-09 15:18:18 +05:30
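A rough sketch of calling the /organization/member_add proxy route named in the commit above. Only the route itself comes from the commit notes; the request body fields, role name, proxy URL, and key are hypothetical placeholders.

```python
import requests

# Sketch: add a member to an organization via the litellm proxy.
# Body fields and the role name are hypothetical; URL and key are placeholders.
resp = requests.post(
    "http://localhost:4000/organization/member_add",
    headers={"Authorization": "Bearer sk-1234"},
    json={
        "organization_id": "org-123",                             # hypothetical org id
        "member": {"user_id": "user-456", "role": "org_admin"},   # hypothetical fields
    },
)
print(resp.status_code, resp.json())
```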
Krrish Dholakia
945267a511
build: bump version
2024-10-08 22:10:14 -07:00
Krish Dholakia
9695c1af10
LiteLLM Minor Fixes & Improvements (10/08/2024) ( #6119 )
...
* refactor(cost_calculator.py): move error line to debug - https://github.com/BerriAI/litellm/issues/5683#issuecomment-2398599498
* fix(migrate-hidden-params-to-read-from-standard-logging-payload): Fixes https://github.com/BerriAI/litellm/issues/5546#issuecomment-2399994026
* fix(types/utils.py): mark weight as a litellm param
Fixes https://github.com/BerriAI/litellm/issues/5781
* feat(internal_user_endpoints.py): fix /user/info + show user max budget as default max budget
Fixes https://github.com/BerriAI/litellm/issues/6117
* feat: support returning team member budget in `/user/info`
Sets user max budget in team as max budget on ui
Closes https://github.com/BerriAI/litellm/issues/6117
* bug fix for optional parameter passing to replicate (#6067 )
Signed-off-by: Mandana Vaziri <mvaziri@us.ibm.com>
* fix(o1_transformation.py): handle o1 temperature=0
o1 doesn't support temp=0, allow admin to drop this param
* test: fix test
---------
Signed-off-by: Mandana Vaziri <mvaziri@us.ibm.com>
Co-authored-by: Mandana Vaziri <mvaziri@us.ibm.com>
2024-10-08 21:57:03 -07:00
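A sketch of the o1 temperature handling noted above: o1 models reject temperature=0, so allowing unsupported params to be dropped lets the request go through. drop_params is an existing litellm setting; its interaction with o1 here is an assumption based on the commit notes, and the deployment name is a placeholder.

```python
import litellm

# Sketch: drop params the target model doesn't support instead of erroring.
litellm.drop_params = True

resp = litellm.completion(
    model="azure/o1-preview",  # placeholder deployment name
    messages=[{"role": "user", "content": "hi"}],
    temperature=0,             # dropped rather than sent to the o1 endpoint
)
```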
Willy Douhard
ac6fb0cbef
Fix: Literal AI llm completion logging ( #6096 )
...
* fix: log llm output
* chore: rename var
2024-10-08 08:33:32 -07:00
Kyrylo Yefimenko
b68fee48a6
(fix) Fix Groq pricing for llama3.1 ( #6114 )
...
* Adjust ollama models to chat instead of completions
* Fix Groq prices for llama3.1
2024-10-08 20:20:58 +05:30
Ishaan Jaff
92a1924112
trigger ci/cd run
2024-10-08 20:16:37 +05:30
Ishaan Jaff
d1760b1b04
(fix) clean up root repo - move entrypoint.sh and build_admin_ui to /docker ( #6110 )
...
* fix move docker files to docker folders
* move check file length
* fix docker hub deploy
* fix clean up root
* fix circle ci config
2024-10-08 11:34:43 +05:30
Krrish Dholakia
cc960da4b6
docs(azure.md): add o1 model support to config
2024-10-07 22:37:49 -07:00
Krrish Dholakia
9ee1a3ff8c
bump: version 1.48.18 → 1.48.19
2024-10-07 22:22:02 -07:00
Krish Dholakia
6729c9ca7f
LiteLLM Minor Fixes & Improvements (10/07/2024) ( #6101 )
...
* fix(utils.py): support dropping temperature param for azure o1 models
* fix(main.py): handle azure o1 streaming requests
o1 doesn't support streaming, fake it to ensure code works as expected
* feat(utils.py): expose `hosted_vllm/` endpoint, with tool handling for vllm
Fixes https://github.com/BerriAI/litellm/issues/6088
* refactor(internal_user_endpoints.py): cleanup unused params + update docstring
Closes https://github.com/BerriAI/litellm/issues/6100
* fix(main.py): expose custom image generation api support
Fixes https://github.com/BerriAI/litellm/issues/6097
* fix: fix linting errors
* docs(custom_llm_server.md): add docs on custom api for image gen calls
* fix(types/utils.py): handle dict type
* fix(types/utils.py): fix linting errors
2024-10-07 22:17:22 -07:00
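A sketch of the `hosted_vllm/` provider prefix exposed above, routing an OpenAI-compatible vLLM server through litellm; the model name and api_base are placeholders.

```python
import litellm

# Sketch: call a self-hosted vLLM server through the hosted_vllm/ prefix.
resp = litellm.completion(
    model="hosted_vllm/meta-llama/Llama-3.1-8B-Instruct",  # placeholder model
    api_base="http://localhost:8000/v1",                   # your vLLM server
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
```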
Ishaan Jaff
5de69cb1b2
fix using Dockerfile
2024-10-08 08:45:40 +05:30
Ishaan Jaff
59b247ab23
fix config.yml
2024-10-08 08:36:03 +05:30
Ishaan Jaff
d742e8cb43
(clean up) move docker files from root to docker folder ( #6109 )
...
* fix move docker files to docker folders
* move check file length
* fix docker hub deploy
2024-10-08 08:23:52 +05:30
Ishaan Jaff
ef815f3a84
(docs) add remaining litellm settings on configs.md doc ( #6108 )
...
* docs add litellm settings configs
* docs langfuse tags on config
2024-10-08 07:57:04 +05:30
Ishaan Jaff
2b370f8e9e
(docs) key based callbacks ( #6107 )
2024-10-08 07:12:01 +05:30
Pradyumna Singh Rathore
b7ba558b74
fix links due to broken list ( #6103 )
2024-10-07 15:47:29 -04:00
Ishaan Jaff
5afc45d411
bump: version 1.48.17 → 1.48.18
2024-10-07 18:22:21 +05:30
Ishaan Jaff
b1e9d344b2
Update readme.md
2024-10-07 18:15:15 +05:30