Commit graph

4682 commits

Author SHA1 Message Date
Ishaan Jaff
f9d9b70538 Merge branch 'main' into litellm_anthropic_responses_api_support 2025-04-18 18:46:03 -07:00
Ishaan Jaff
e29219e55c fix handler 2025-04-18 18:08:05 -07:00
Ishaan Jaff
d32b3fd91a fixes for anthropic codex support 2025-04-17 23:20:13 -07:00
Krrish Dholakia
614d80cb1b build(model_prices_and_context_window.json): add azure gpt-4.1 pricing
ensures cost tracking for gpt-4.1 works
2025-04-17 20:09:17 -07:00
Krrish Dholakia
ff81f48af3 bump: version 1.66.2 → 1.66.3 2025-04-16 22:20:10 -07:00
Krrish Dholakia
78c6d73dea build: new ui build 2025-04-16 22:11:53 -07:00
Krish Dholakia
c73a6a8d1e
Add new /vertex_ai/discovery route - enables calling AgentBuilder API routes (#10084)
* feat(llm_passthrough_endpoints.py): expose new `/vertex_ai/discovery/` endpoint

Allows calling vertex ai discovery endpoints via passthrough

For AgentBuilder API calls

* refactor(llm_passthrough_endpoints.py): use common _base_vertex_proxy_route

Prevents duplicate code

* feat(llm_passthrough_endpoints.py): add vertex endpoint specific passthrough handlers
2025-04-16 21:45:51 -07:00
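
As a rough sketch of how the new passthrough could be exercised from a client, the Python snippet below POSTs a Discovery Engine search request through a locally running proxy. The proxy address, virtual key, project/engine path, and request body are illustrative assumptions, not values from the PR.

```python
import requests

PROXY_BASE = "http://localhost:4000"   # assumed local proxy address
LITELLM_KEY = "sk-1234"                # assumed LiteLLM virtual key

# Hypothetical Discovery Engine search path, forwarded by the passthrough route.
url = (
    f"{PROXY_BASE}/vertex_ai/discovery/v1/projects/my-project/locations/global/"
    "collections/default_collection/engines/my-engine/servingConfigs/default_search:search"
)

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {LITELLM_KEY}"},
    json={"query": "quarterly revenue", "pageSize": 5},  # example search body
    timeout=30,
)
print(resp.status_code, resp.json())
```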
Ishaan Jaff
12ccb954a6 ui new build 2025-04-16 19:23:04 -07:00
Ishaan Jaff
6220f3e7b8
[Feat SSO] Add LiteLLM SCIM Integration for Team and User management (#10072)
* fix NewUser response type

* add scim router

* add v0 scim v2 endpoints

* working scim transformation

* use 1 file for types

* fix scim firstname and givenName storage

* working SCIMErrorResponse

* working team / group provisioning on SCIM

* add SCIMPatchOp

* move scim folder

* fix import scim_router

* fix dont auto create scim keys

* add auth on all scim endpoints

* add is_virtual_key_allowed_to_call_route

* fix allowed routes

* fix for key management

* fix allowed routes check

* clean up error message

* fix code check

* fix for route checks

* ui SCIM support

* add UI tab for SCIM

* fixes SCIM

* fixes for SCIM settings on ui

* scim settings

* clean up scim view

* add migration for allowed_routes in keys table

* refactor scim transform

* fix SCIM linting error

* fix code quality check

* fix ui linting

* test_scim_transformations.py
2025-04-16 19:21:47 -07:00
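
SCIM v2 is a standard provisioning protocol, so an identity provider (or a quick manual test) would hit the proxy's /Users and /Groups resources roughly as below. The `/scim/v2` mount path and the bearer token being a LiteLLM virtual key are assumptions inferred from the commit messages.

```python
import requests

PROXY_BASE = "http://localhost:4000"   # assumed proxy address
SCIM_TOKEN = "sk-scim-key"             # assumed: a key permitted to call the SCIM routes

headers = {"Authorization": f"Bearer {SCIM_TOKEN}"}

# List provisioned users (standard SCIM v2 resource).
users = requests.get(f"{PROXY_BASE}/scim/v2/Users", headers=headers, timeout=30)
print(users.json())

# Create a user using the SCIM core user schema.
new_user = {
    "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
    "userName": "jane@example.com",
    "name": {"givenName": "Jane", "familyName": "Doe"},
    "active": True,
}
created = requests.post(
    f"{PROXY_BASE}/scim/v2/Users", headers=headers, json=new_user, timeout=30
)
print(created.status_code, created.json())
```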
Krish Dholakia
7ca553b235
Add team based usage dashboard at 1m+ spend logs (+ new /team/daily/activity API) (#10081)
* feat(ui/): add team based usage to dashboard

allows admin to see spend across teams + within teams at 1m+ spend logs

* fix(entity_usage.tsx): add activity page to entity usage

* style(entity_usage.tsx): place filter above tab switcher
2025-04-16 18:10:14 -07:00
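
A minimal sketch of calling the new endpoint from an admin script is below; the query parameter names and response shape are assumptions, since the PR only names the route.

```python
import requests

PROXY_BASE = "http://localhost:4000"   # assumed proxy address
ADMIN_KEY = "sk-admin"                 # assumed admin virtual key

resp = requests.get(
    f"{PROXY_BASE}/team/daily/activity",
    headers={"Authorization": f"Bearer {ADMIN_KEY}"},
    params={"start_date": "2025-04-01", "end_date": "2025-04-16"},  # assumed filter names
    timeout=30,
)
print(resp.json())  # per-team daily spend/usage rows (shape assumed)
```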
Krish Dholakia
c0d7e9f16d
Add new /tag/daily/activity endpoint + Add tag dashboard to UI (#10073)
* feat: initial commit adding daily tag spend table to db

* feat(db_spend_update_writer.py): correctly log tag spend transactions

* build(schema.prisma): add new tag table to root

* build: add new migration file

* feat(common_daily_activity.py): add `/tag/daily/activity` API endpoint

allows viewing daily spend by tag

* feat(tag_management_endpoints.py): support comma separated list of tags + tag breakdown metric

allows querying multiple tags + knowing what tags are driving spend

* feat(entity_usage.tsx): initial commit adding tag based usage to litellm dashboard

brings back tag based usage tracking to UI at 1m+ spend logs

* feat(entity_usage.tsx): add top api key view to ui

* feat(entity_usage.tsx): add tag table to ui

* feat(entity_usage.tsx): allow filtering by tag

* refactor(entity_usage.tsx): reorder components

* build(ui/): fix linting error

* fix: fix ruff checks

* fix(schema.prisma): drop uniqueness requirement on tag

allows dailytagspend to have multiple rows with the same tag

* build(schema.prisma): drop uniqueness requirement on tag in dailytagspend

allows tag agg. view to work on multiple rows with same tag

* build(schema.prisma): drop tag uniqueness requirement
2025-04-16 15:24:44 -07:00
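
Since the PR mentions filtering by a comma-separated list of tags, a client call might look like the sketch below; the parameter name and response shape are assumptions.

```python
import requests

PROXY_BASE = "http://localhost:4000"   # assumed proxy address
ADMIN_KEY = "sk-admin"                 # assumed admin virtual key

resp = requests.get(
    f"{PROXY_BASE}/tag/daily/activity",
    headers={"Authorization": f"Bearer {ADMIN_KEY}"},
    # Comma-separated tag filter per the commit message; parameter name assumed.
    params={"tags": "prod,batch-jobs", "start_date": "2025-04-01", "end_date": "2025-04-16"},
    timeout=30,
)
print(resp.json())  # daily spend broken down per tag (shape assumed)
```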
Krish Dholakia
d8a1071bc4
Add aggregate spend by tag (#10071)
* feat: initial commit adding daily tag spend table to db

* feat(db_spend_update_writer.py): correctly log tag spend transactions

* build(schema.prisma): add new tag table to root

* build: add new migration file
2025-04-16 12:26:21 -07:00
ChaoFu Yang
c07eea864e
/utils/token_counter: get model_info from deployment directly (#10047) 2025-04-16 07:53:18 -07:00
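
For reference, the proxy's token-counting utility takes an OpenAI-style model-plus-messages payload; a minimal call is sketched below (proxy address and key are placeholders, and the response shape is assumed).

```python
import requests

PROXY_BASE = "http://localhost:4000"   # assumed proxy address
LITELLM_KEY = "sk-1234"                # assumed virtual key

resp = requests.post(
    f"{PROXY_BASE}/utils/token_counter",
    headers={"Authorization": f"Bearer {LITELLM_KEY}"},
    # Count tokens for a deployment defined in the proxy's model_list.
    json={"model": "gpt-4o", "messages": [{"role": "user", "content": "hello"}]},
    timeout=30,
)
print(resp.json())  # e.g. total token count + tokenizer used (shape assumed)
```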
Michael Leshchinsky
e19d05980c
Add litellm call id passing to Aim guardrails on pre and post-hooks calls (#10021)
* Add litellm_call_id passing to aim guardrails on pre and post-hooks

* Add test that ensures that pre_call_hook receives litellm call id when common_request_processing called
2025-04-16 07:41:28 -07:00
Ishaan Jaff
1d4fea509d ui new build 2025-04-15 22:36:44 -07:00
Ishaan Jaff
dcc43e797a
[Docs] Auto prompt caching (#10044)
* docs prompt cache controls

* doc fix auto prompt caching
2025-04-15 22:29:47 -07:00
Ishaan Jaff
bd88263b29
[Feat - Cost Tracking improvement] Track prompt caching metrics in DailyUserSpendTransactions (#10029)
* stash changes

* emit cache read/write tokens to daily spend update

* emit cache read/write tokens on daily activity

* update types.ts

* docs prompt caching

* undo ui change

* fix activity metrics

* fix prompt caching metrics

* fix typed dict fields

* fix get_aggregated_daily_spend_update_transactions

* fix aggregating cache tokens

* test_cache_token_fields_aggregation

* daily_transaction

* add cache_creation_input_tokens and cache_read_input_tokens to LiteLLM_DailyUserSpend

* test_daily_spend_update_queue.py
2025-04-15 21:40:57 -07:00
Ishaan Jaff
d32d6fe03e
[UI] Bug Fix - Show created_at and updated_at for Users Page (#10033)
* add created_at and updated_at as fields for internal user table

* test_get_users_includes_timestamps
2025-04-15 21:15:44 -07:00
Krish Dholakia
9b77559ccf
Add aggregate team based usage logging (#10039)
* feat(schema.prisma): initial commit adding aggregate table for team spend

allows team spend to be visible at 1m+ logs

* feat(db_spend_update_writer.py): support logging aggregate team spend

allows usage dashboard to work at 1m+ logs

* feat(litellm-proxy-extras/): add new migration file

* fix(db_spend_update_writer.py): fix return type

* build: bump requirements

* fix: fix ruff error
2025-04-15 20:58:48 -07:00
Krish Dholakia
33ead69c0a
Support checking provider /models endpoints on proxy /v1/models endpoint (#9958)
* feat(utils.py): support global flag for 'check_provider_endpoints'

enables setting this for `/models` on proxy

* feat(utils.py): add caching to 'get_valid_models'

Prevents checking endpoint repeatedly

* fix(utils.py): ensure mutations don't impact cached results

* test(test_utils.py): add unit test to confirm cache invalidation logic

* feat(utils.py): get_valid_models - support passing litellm params dynamically

Allows for checking endpoints based on received credentials

* test: update test

* feat(model_checks.py): pass router credentials to get_valid_models - ensures it checks correct credentials

* refactor(utils.py): refactor for simpler functions

* fix: fix linting errors

* fix(utils.py): fix test

* fix(utils.py): set valid providers to custom_llm_provider, if given

* test: update test

* fix: fix ruff check error
2025-04-14 23:23:20 -07:00
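
A minimal sketch of the SDK-side behavior this PR describes: `get_valid_models` normally infers models from configured credentials, and the new flag asks each provider's `/models` endpoint instead. The keyword name below is an assumption based on the commit's 'check_provider_endpoints' flag.

```python
import os
import litellm

os.environ["OPENAI_API_KEY"] = "sk-..."  # provider credentials to validate against

# Default: derive valid models from which provider keys are set.
print(litellm.get_valid_models())

# Opt-in: query each provider's /models endpoint (keyword name assumed).
print(litellm.get_valid_models(check_provider_endpoint=True))
```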
Krish Dholakia
9b0f871129
Add /vllm/* and /mistral/* passthrough endpoints (adds support for Mistral OCR via passthrough)
* feat(llm_passthrough_endpoints.py): support mistral passthrough

Closes https://github.com/BerriAI/litellm/issues/9051

* feat(llm_passthrough_endpoints.py): initial commit for adding vllm passthrough route

* feat(vllm/common_utils.py): add new vllm model info route

make it possible to use vllm passthrough route via factory function

* fix(llm_passthrough_endpoints.py): add all methods to vllm passthrough route

* fix: fix linting error

* fix: fix linting error

* fix: fix ruff check

* fix(proxy/_types.py): add new passthrough routes

* docs(config_settings.md): add mistral env vars to docs
2025-04-14 22:06:33 -07:00
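
Assuming the passthrough strips the `/mistral` prefix and forwards the remainder to Mistral's API, the Mistral OCR endpoint would be reachable through the proxy roughly as below. The path, request body, and model name mirror Mistral's public OCR API and are assumptions rather than values from the PR.

```python
import requests

PROXY_BASE = "http://localhost:4000"   # assumed proxy address
LITELLM_KEY = "sk-1234"                # assumed virtual key

resp = requests.post(
    f"{PROXY_BASE}/mistral/v1/ocr",    # assumed: /mistral prefix + Mistral's /v1/ocr path
    headers={"Authorization": f"Bearer {LITELLM_KEY}"},
    json={
        "model": "mistral-ocr-latest",
        "document": {"type": "document_url", "document_url": "https://example.com/sample.pdf"},
    },
    timeout=60,
)
print(resp.status_code)
```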
Ishaan Jaff
ce2595f56a bump: version 1.66.0 → 1.66.1 2025-04-14 21:30:07 -07:00
Ishaan Jaff
b210639dce ui new build 2025-04-14 21:19:21 -07:00
Ishaan Jaff
c1a642ce20
[UI] Allow setting prompt cache_control_injection_points (#10000)
* test_anthropic_cache_control_hook_system_message

* test_anthropic_cache_control_hook.py

* should_run_prompt_management_hooks

* fix should_run_prompt_management_hooks

* test_anthropic_cache_control_hook_specific_index

* fix test

* fix linting errors

* ChatCompletionCachedContent

* initial commit for cache control

* fixes ui design

* fix inserting cache_control_injection_points

* fix entering cache control points

* fixes for using cache control on ui + backend

* update cache control settings on edit model page

* fix init custom logger compatible class

* fix linting errors

* fix linting errors

* fix get_chat_completion_prompt
2025-04-14 21:17:42 -07:00
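
Structurally, the setting the UI now edits corresponds to a per-deployment litellm_params field; the sketch below renders an assumed model_list entry as a Python dict (field values are illustrative, based only on the commit messages about injecting cache_control on the system message).

```python
# Assumed shape of a deployment entry with cache_control_injection_points
# (shown as a Python dict rather than config.yaml).
deployment = {
    "model_name": "claude-3-7-sonnet",
    "litellm_params": {
        "model": "anthropic/claude-3-7-sonnet-20250219",
        "api_key": "os.environ/ANTHROPIC_API_KEY",
        # Inject an Anthropic cache_control block on the system message (structure assumed).
        "cache_control_injection_points": [
            {"location": "message", "role": "system"},
        ],
    },
}
```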
Krish Dholakia
2ed593e052
Updated cohere v2 passthrough (#9997)
* Add cohere `/v2/chat` pass-through cost tracking support (#8235)

* feat(cohere_passthrough_handler.py): initial working commit with cohere passthrough cost tracking

* fix(v2_transformation.py): support cohere /v2/chat endpoint

* fix: fix linting errors

* fix: fix import

* fix(v2_transformation.py): fix linting error

* test: handle openai exception change
2025-04-14 19:51:01 -07:00
Ishaan Jaff
72c1f7e09a ui new build 2025-04-12 20:42:43 -07:00
Ishaan Jaff
89dfb42697
[UI QA checklist] (#9957)
* fix typo on UI

* fix for edit user tab

* fix for user spend

* add /team/permissions_list to management routes

* fix auth check for team member permissions

* fix team endpoints test
2025-04-12 20:41:50 -07:00
Krish Dholakia
00e49380df
Litellm UI qa 04 12 2025 p1 (#9955)
* fix(model_info_view.tsx): cleanup text

* fix(key_management_endpoints.py): fix filtering litellm-dashboard keys for internal users

* fix(proxy_track_cost_callback.py): prevent flooding spend logs with admin endpoint errors

* test: add unit testing for logic

* test(test_auth_exception_handler.py): add more unit testing

* fix(router.py): correctly handle retrieving model info on get_model_group_info

fixes issue where model hub was showing None prices

* fix: fix linting errors
2025-04-12 19:30:48 -07:00
Ishaan Jaff
c86e678809
[Docs] v1.66.0-stable fixes (#9953)
* add categories for spend tracking improvements

* xai reasoning usage

* docs tag management

* docs tag based routing

* [Beta] Routing based

* docs tag based routing

* docs tag routing

* docs enterprise web search
2025-04-12 16:57:25 -07:00
Ishaan Jaff
4e81b2cab4
[Team Member permissions] - Fixes (#9945)
* only load member permissions for non-admins

* run member permission checks on update + regenerate endpoints

* run check for /key/generate

* working test_default_member_permissions

* passing test with permissions on update delete endpoints

* test_create_permissions

* _team_key_generation_check

* fix TeamBase

* fix team endpoints

* fix api docs check
2025-04-12 11:17:51 -07:00
Krish Dholakia
d004fb542f
fix(litellm_proxy_extras): add baselining db script (#9942)
* fix(litellm_proxy_extras): add baselining db script

Fixes https://github.com/BerriAI/litellm/issues/9885

* fix(prisma_client.py): fix ruff errors

* ci(config.yml): add publish_proxy_extras step

* fix(config.yml): compare contents between versions to check for changes

* fix(config.yml): fix check

* fix: install toml

* fix: update check

* fix: ensure versions in sync

* fix: fix version compare

* fix: correct the cost for 'gemini/gemini-2.5-pro-preview-03-25' (#9896)

* fix: Typo in the cost 'gemini/gemini-2.5-pro-preview-03-25', closes #9854

* chore: update in backup file as well

* Litellm add managed files db (#9930)

* fix(openai.py): ensure openai file object shows up on logs

* fix(managed_files.py): return unified file id as b64 str

allows retrieve file id to work as expected

* fix(managed_files.py): apply decoded file id transformation

* fix: add unit test for file id + decode logic

* fix: initial commit for litellm_proxy support with CRUD Endpoints

* fix(managed_files.py): support retrieve file operation

* fix(managed_files.py): support for DELETE endpoint for files

* fix(managed_files.py): retrieve file content support

supports retrieve file content api from openai

* fix: fix linting error

* test: update tests

* fix: fix linting error

* feat(managed_files.py): support reading / writing files in DB

* feat(managed_files.py): support deleting file from DB on delete

* test: update testing

* fix(spend_tracking_utils.py): ensure each file create request is logged correctly

* fix(managed_files.py): fix storing / returning managed file object from cache

* fix(files/main.py): pass litellm params to azure route

* test: fix test

* build: add new prisma migration

* build: bump requirements

* test: add more testing

* refactor: cleanup post merge w/ main

* fix: fix code qa errors

* [DB / Infra] Add new column team_member_permissions  (#9941)

* add team_member_permissions to team table

* add migration.sql file

* fix poetry lock

* fix prisma migrations

* fix poetry lock

* fix migration

* ui new build

* fix(factory.py): correct indentation for message index increment in ollama. This fixes bug #9822 (#9943)

* fix(factory.py): correct indentation for message index increment in ollama_pt function

* test: add unit tests for ollama_pt function handling various message types

* ci: update test

* fix: fix check

* ci: see what dir looks like

* ci: more checks

* ci: fix filepath

* ci: cleanup

* ci: fix ci

---------

Co-authored-by: Nilanjan De <nilanjan.de@gmail.com>
Co-authored-by: Ishaan Jaff <ishaanjaffer0324@gmail.com>
Co-authored-by: Dan Shaw <dan@danieljshaw.com>
2025-04-12 10:29:34 -07:00
Ishaan Jaff
69a3aab4c8 ui new build 2025-04-12 09:13:00 -07:00
Ishaan Jaff
fb0c3d9e18
[DB / Infra] Add new column team_member_permissions (#9941)
* add team_member_permissions to team table

* add migration.sql file

* fix poetry lock

* fix prisma migrations

* fix poetry lock

* fix migration
2025-04-12 09:06:04 -07:00
Krish Dholakia
421e0a3004
Litellm add managed files db (#9930)
* fix(openai.py): ensure openai file object shows up on logs

* fix(managed_files.py): return unified file id as b64 str

allows retrieve file id to work as expected

* fix(managed_files.py): apply decoded file id transformation

* fix: add unit test for file id + decode logic

* fix: initial commit for litellm_proxy support with CRUD Endpoints

* fix(managed_files.py): support retrieve file operation

* fix(managed_files.py): support for DELETE endpoint for files

* fix(managed_files.py): retrieve file content support

supports retrieve file content api from openai

* fix: fix linting error

* test: update tests

* fix: fix linting error

* feat(managed_files.py): support reading / writing files in DB

* feat(managed_files.py): support deleting file from DB on delete

* test: update testing

* fix(spend_tracking_utils.py): ensure each file create request is logged correctly

* fix(managed_files.py): fix storing / returning managed file object from cache

* fix(files/main.py): pass litellm params to azure route

* test: fix test

* build: add new prisma migration

* build: bump requirements

* test: add more testing

* refactor: cleanup post merge w/ main

* fix: fix code qa errors
2025-04-12 08:24:46 -07:00
Krish Dholakia
3ca82c22b6
Support CRUD endpoints for Managed Files (#9924)
* fix(openai.py): ensure openai file object shows up on logs

* fix(managed_files.py): return unified file id as b64 str

allows retrieve file id to work as expected

* fix(managed_files.py): apply decoded file id transformation

* fix: add unit test for file id + decode logic

* fix: initial commit for litellm_proxy support with CRUD Endpoints

* fix(managed_files.py): support retrieve file operation

* fix(managed_files.py): support for DELETE endpoint for files

* fix(managed_files.py): retrieve file content support

supports retrieve file content api from openai

* fix: fix linting error

* test: update tests

* fix: fix linting error

* fix(files/main.py): pass litellm params to azure route

* test: fix test
2025-04-11 21:48:27 -07:00
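
Because the proxy exposes an OpenAI-compatible files API, the CRUD surface added here can be exercised with the stock OpenAI SDK pointed at the proxy; the base URL and key below are placeholders.

```python
from openai import OpenAI

# Point the standard OpenAI client at the LiteLLM proxy (placeholder URL and key).
client = OpenAI(api_key="sk-1234", base_url="http://localhost:4000")

# Create: upload a file; per the PR, the proxy returns a unified (b64) file id.
created = client.files.create(file=open("train.jsonl", "rb"), purpose="fine-tune")
print(created.id)

# Read: fetch metadata and raw content back through the proxy.
print(client.files.retrieve(created.id))
print(client.files.content(created.id).read()[:100])

# Delete: remove the managed file.
print(client.files.delete(created.id))
```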
Ishaan Jaff
91c0a794b9
[Feat - Team Member Permissions] - CRUD Endpoints for managing team member permissions (#9919)
* add team_member_permissions

* add GetTeamMemberPermissionsRequest types

* crud endpoint for team member permissions

* test team member permissions CRUD

* fix GetTeamMemberPermissionsRequest
2025-04-11 17:15:16 -07:00
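
A quick way to inspect the new permissions from a script: the `/team/permissions_list` route (also referenced in the UI QA PR further up) can be queried roughly as below; the query parameter name and response shape are assumptions.

```python
import requests

PROXY_BASE = "http://localhost:4000"   # assumed proxy address
ADMIN_KEY = "sk-admin"                 # assumed admin virtual key

resp = requests.get(
    f"{PROXY_BASE}/team/permissions_list",
    headers={"Authorization": f"Bearer {ADMIN_KEY}"},
    params={"team_id": "my-team-id"},  # parameter name assumed
    timeout=30,
)
print(resp.json())
```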
Ishaan Jaff
8b1d2d6956
[Feat - UI] - Allow setting Default Team setting when LiteLLM SSO auto creates teams (#9918)
* endpoint for updating default team settings on ui

* add GET default team settings endpoint

* ui expose default team settings on UI

* update to use DefaultTeamSSOParams

* DefaultTeamSSOParams

* fix DefaultTeamSSOParams

* docs team management

* test_update_default_team_settings
2025-04-11 14:07:10 -07:00
Krish Dholakia
0415f1205e
Litellm dev 04 10 2025 p3 (#9903)
* feat(managed_files.py): encode file type in unified file id

simplify calling gemini models

* fix(common_utils.py): fix extracting file type from unified file id

* fix(litellm_logging.py): create standard logging payload for create file call

* fix: fix linting error
2025-04-11 09:29:42 -07:00
Krish Dholakia
9f27e8363f
Realtime API: Support 'base_model' cost tracking + show response in spend logs (if enabled) (#9897)
* refactor(litellm_logging.py): refactor realtime cost tracking to use common code as rest

Ensures basic features like base model just work

* feat(realtime/): support 'base_model' cost tracking on realtime api

Fixes issue where base model was not working on realtime

* fix: fix ruff linting error

* test: fix test
2025-04-10 21:24:45 -07:00
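
For context, `base_model` lives under a deployment's `model_info` and tells cost tracking which canonical model's prices to use; the sketch below (a Python-dict rendering of a model_list entry, with placeholder names) shows where it sits for a realtime deployment.

```python
# Assumed deployment entry: cost tracking falls back to `base_model` prices
# when the deployment name alone isn't in the price registry.
deployment = {
    "model_name": "my-realtime-deployment",
    "litellm_params": {
        "model": "azure/my-realtime-deployment",
        "api_key": "os.environ/AZURE_API_KEY",
        "api_base": "os.environ/AZURE_API_BASE",
    },
    "model_info": {
        "base_model": "azure/gpt-4o-realtime-preview",  # placeholder canonical model name
    },
}
```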
Ishaan Jaff
f5c5c79ea4 update docs 2025-04-10 20:18:54 -07:00
Ishaan Jaff
98e34cbf5d
[Docs] Tutorial using MSFT auto team assignment with LiteLLM (#9898)
* add default_team_params as a config.yaml setting

* create_litellm_team_from_sso_group

* test_default_team_params

* test_create_team_without_default_params

* docs default team settings

* docs msft entra id tutorial

* commit litellm docs msft group assignment

* litellm MSFT sso

* member, team assignment on litellm

* docs msft auto assignment

* bug fix default team setting

* docs litellm default team settings

* test_default_team_params
2025-04-10 20:07:55 -07:00
Ishaan Jaff
72a12e91c4
[Bug Fix MSFT SSO] Use correct field for user email when using MSFT SSO (#9886)
* fix openid_from_response

* test_microsoft_sso_handler_openid_from_response_user_principal_name

* test upsert_sso_user
2025-04-10 17:40:58 -07:00
Ishaan Jaff
94a553dbb2
[Feat] Emit Key, Team Budget metrics on a cron job schedule (#9528)
* _initialize_remaining_budget_metrics

* initialize_budget_metrics_cron_job

* initialize_budget_metrics_cron_job

* initialize_budget_metrics_cron_job

* test_initialize_budget_metrics_cron_job

* LITELLM_PROXY_ADMIN_NAME

* fix code qa checks

* test_initialize_budget_metrics_cron_job

* test_initialize_budget_metrics_cron_job

* pod lock manager allow dynamic cron job ID

* fix pod lock manager

* require cronjobid for PodLockManager

* fix DB_SPEND_UPDATE_JOB_NAME acquire / release lock

* add comment on prometheus logger

* add debug statements for emitting key, team budget metrics

* test_pod_lock_manager.py

* test_initialize_budget_metrics_cron_job

* initialize_budget_metrics_cron_job

* initialize_remaining_budget_metrics

* remove outdated test
2025-04-10 16:59:14 -07:00
Ishaan Jaff
90d862b041
[Feat SSO] - Allow admins to set default_team_params to have default params for when litellm SSO creates default teams (#9895)
* add default_team_params as a config.yaml setting

* create_litellm_team_from_sso_group

* test_default_team_params

* test_create_team_without_default_params

* docs default team settings
2025-04-10 16:58:28 -07:00
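
The assumed shape of the new setting is sketched below as a Python dict mirroring a config.yaml block; the nested keys are illustrative guesses at typical team defaults, not confirmed field names.

```python
# Hypothetical `default_team_params` block: defaults applied to teams that
# LiteLLM SSO auto-creates (keys are assumptions).
litellm_settings = {
    "default_team_params": {
        "max_budget": 100,             # budget for each auto-created team
        "budget_duration": "30d",
        "models": ["gpt-4o-mini"],     # default allowed models
    },
}
```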
Krish Dholakia
0dbd663877
fix(cost_calculator.py): handle custom pricing at deployment level fo… (#9855)
* fix(cost_calculator.py): handle custom pricing at deployment level for router

* test: add unit tests

* fix(router.py): show custom pricing on UI

check correct model str

* fix: fix linting error

* docs(custom_pricing.md): clarify custom pricing for proxy

Fixes https://github.com/BerriAI/litellm/issues/8573#issuecomment-2790420740

* test: update code qa test

* fix: cleanup traceback

* fix: handle litellm param custom pricing

* test: update test

* fix(cost_calculator.py): add router model id to list of potential model names

* fix(cost_calculator.py): fix router model id check

* fix: router.py - maintain older model registry approach

* fix: fix ruff check

* fix(router.py): router get deployment info

add custom values to mapped dict

* test: update test

* fix(utils.py): update only if value is non-null

* test: add unit test
2025-04-09 22:13:10 -07:00
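
Deployment-level custom pricing is set alongside the deployment's other litellm_params; the sketch below (a Python-dict rendering of a model_list entry, with placeholder values) shows the fields the cost calculator now honors for the specific router deployment.

```python
deployment = {
    "model_name": "my-custom-model",
    "litellm_params": {
        "model": "openai/my-finetuned-model",
        "api_base": "https://my-endpoint.example.com/v1",
        "api_key": "os.environ/CUSTOM_API_KEY",
        # Per-token custom pricing picked up at the deployment level (placeholder values).
        "input_cost_per_token": 8e-07,
        "output_cost_per_token": 2.4e-06,
    },
}
```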
Krish Dholakia
0c5b4aa96d
feat(realtime/): add token tracking + log usage object in spend logs … (#9843)
* feat(realtime/): add token tracking + log usage object in spend logs metadata

* test: fix test

* test: update tests

* test: update testing

* test: update test

* test: update test

* test: update test

* test: update test

* test: update test

* test: update test
2025-04-09 22:11:00 -07:00
Krish Dholakia
87733c8193
Fix anthropic prompt caching cost calc + trim logged message in db (#9838)
* fix(spend_tracking_utils.py): prevent logging entire mp4 files to db

Fixes https://github.com/BerriAI/litellm/issues/9732

* fix(anthropic/chat/transformation.py): Fix double counting cache creation input tokens

Fixes https://github.com/BerriAI/litellm/issues/9812

* refactor(anthropic/chat/transformation.py): refactor streaming to use same usage calculation block as non-streaming

reduce errors

* fix(bedrock/chat/converse_transformation.py): don't increment prompt tokens with cache_creation_input_tokens

* build: remove redisvl from requirements.txt (temporary)

* fix(spend_tracking_utils.py): handle circular references

* test: update code cov test

* test: update test
2025-04-09 21:26:43 -07:00
Ishaan Jaff
1359e6d7a6
[SSO] Connect LiteLLM to Azure Entra ID Enterprise Application (#9872)
* refactor SSO handler

* render sso JWT on ui

* docs debug sso

* fix sso login flow use await

* fix ui sso debug JWT

* test ui sso

* remove redis vl

* fix redisvl==0.5.1

* fix ml dtypes

* fix redisvl

* fix redis vl

* fix debug_sso_callback

* fix linting error

* fix redis semantic caching dep

* working graph api assignment

* test msft sso handler openid

* testing for msft group assignment

* fix debug graph api sso flow

* fix linting errors

* add_user_to_teams_from_sso_response

* ui sso fix team assignments

* linting fix _get_group_ids_from_graph_api_response

* add MicrosoftServicePrincipalTeam

* create_litellm_teams_from_service_principal_team_ids

* create_litellm_teams_from_service_principal_team_ids

* docs MICROSOFT_SERVICE_PRINCIPAL_ID

* fix linting errors
2025-04-09 20:26:59 -07:00
Krish Dholakia
ac4f32fb1e
Cost tracking for gemini-2.5-pro (#9837)
* build(model_prices_and_context_window.json): add google/gemini-2.0-flash-lite-001 versioned pricing

Closes https://github.com/BerriAI/litellm/issues/9829

* build(model_prices_and_context_window.json): add initial support for 'supported_output_modalities' param

* build(model_prices_and_context_window.json): add initial support for 'supported_output_modalities' param

* build(model_prices_and_context_window.json): add supported endpoints to gemini-2.5-pro

* build(model_prices_and_context_window.json): add gemini 200k+ pricing

* feat(utils.py): support cost calculation for gemini-2.5-pro above 200k tokens

Fixes https://github.com/BerriAI/litellm/issues/9807

* build: test dockerfile change

* build: revert apk change

* ci(config.yml): pip install wheel

* ci: test problematic package first

* ci(config.yml): pip install only binary

* ci: try more things

* ci: test different ml_dtypes version

* ci(config.yml): check ml_dtypes==0.4.0

* ci: test

* ci: cleanup config.yml

* ci: specify ml dtypes in requirements.txt

* ci: remove redisvl dependency (temporary)

* fix: fix linting errors

* test: update test

* test: fix test
2025-04-09 18:48:43 -07:00
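
To illustrate the 200k+ tier logic this PR adds, the sketch below applies a higher per-token rate once the prompt crosses 200k tokens. The rates are placeholders, and the assumption that the higher rate applies to the entire prompt (rather than only the excess) mirrors how Gemini's tiered pricing is commonly described; check the actual model_prices_and_context_window.json entries for real values.

```python
# Placeholder rates - not Gemini's actual prices.
BASE_INPUT_RATE = 1.25 / 1_000_000        # $/prompt token up to the 200k threshold
ABOVE_200K_INPUT_RATE = 2.50 / 1_000_000  # $/prompt token once the prompt exceeds 200k

def prompt_cost(prompt_tokens: int) -> float:
    """Assumed tier rule: the whole prompt is billed at the higher rate past 200k tokens."""
    rate = ABOVE_200K_INPUT_RATE if prompt_tokens > 200_000 else BASE_INPUT_RATE
    return prompt_tokens * rate

print(f"{prompt_cost(150_000):.4f}")  # billed at the base rate
print(f"{prompt_cost(250_000):.4f}")  # billed at the above-200k rate
```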
Ishaan Jaff
4c1bb74c3d
[Feat] - SSO - Use MSFT Graph API to assign users to teams (#9865)
* refactor SSO handler

* render sso JWT on ui

* docs debug sso

* fix sso login flow use await

* fix ui sso debug JWT

* test ui sso

* remove redis vl

* fix redisvl==0.5.1

* fix ml dtypes

* fix redisvl

* fix redis vl

* fix debug_sso_callback

* fix linting error

* fix redis semantic caching dep

* working graph api assignment

* test msft sso handler openid

* testing for msft group assignment

* fix debug graph api sso flow

* fix linting errors

* add_user_to_teams_from_sso_response

* fix linting error
2025-04-09 18:26:43 -07:00